[00:00:05] Speaker 1: Welcome back to It Could Happen Here, a podcast about it, "it" being bad things happening here, "here" being, you know, wherever you are. We're talking specifically about wherever you are. I'm Robert Evans, one of the hosts of this podcast, and with me today is a guy I have a lot of admiration for, probably my favorite YouTube documentarian, which I guess would be the fastest way to sum up who you are and what you do: Dan Olson from the channel Folding Ideas.

[00:00:40] Speaker 2: Dan. Hi, hi, how you doing?

[00:00:46] Speaker 3: I'm doing well. Thanks for, thanks for inviting me.

[00:00:48] Speaker 2: Yeah.

[00:00:49] Speaker 1: No, Dan, you and I have, like, a topic of shared interest to discuss. But the first thing I wanted to talk about is: your name on the internet is Foldable Human.

[00:01:00] Speaker 2: Yeah.

[00:01:00] Speaker 1: You? I don't feel like I could fold you very well.

[00:01:04] Speaker 3: No. Okay, so back in high school, I used to be, like... I was a really small guy. Like, I was a really skinny guy. And you remember the, you remember all the ads from the nineties for exercise equipment?

[00:01:20] Speaker 2: I do remember some of that, yeah.

[00:01:22] Speaker 3: So the tagline that they always used for, like, the as-seen-on-TV exercise equipment was that it folds for easy storage. Ah, and being, being dumbass kids, you know, it's like, one person in our friend group has a car, but there's, like, seven of us, and so someone's got to ride in the trunk. And it's like, well, Dan gets to ride in the trunk. Like, we're going to stick Dan in the trunk, because he folds for easy storage. Because I was a small guy. And so, I don't know why, when I was, like, busy trying to brand the channel, like, you know, a decade ago, I was like... I had this phrase that I was using with students that I was interacting with, which was, like, well, let's unfold that idea. You know? Yeah, yeah, yeah. But, like, that was kind of on my mind.
So I was like, ah, well, we'll, like, call the channel, like, Unfolding Ideas. "Unfolding" didn't really just sound good, so it's like, oh, Folding Ideas. And then, well, aesthetic parallel to that, you know: Foldable Human. I don't know, it just, it came to me, and it sounded good, and it was nowhere on the internet. There were, like, no overlaps. So I'm like, all right, interesting, we're good to go. SEO locked in.

[00:02:42] Speaker 1: That's an example of a thing that, you know... we're gonna be talking a lot about stuff that's unsettling about our modern era and how the internet has sort of altered human dynamics. One of the things that I think is kind of neat about it is its ability to kind of preserve in amber aspects of you from the deep past. Like, one of my emails, like, my personal email, is a Gmail that I got back when you had to get an invite to get a Gmail, right, like, right when Gmail first became a thing. And it's like, I'm not going to say it on here, because then my email will get bombarded with shit. But it's, like, a stupid joke that doesn't really make any sense. And every time I give it to someone, they're like, why is that your email? Because I was, like, twelve. Like, I don't even remember why I set this thing up. It's just this, like, moment of something I thought was funny when I was prepubescent, frozen in amber forever. Because the internet does that in little ways for each of us.

[00:03:39] Speaker 3: I definitely abandoned my original Something Awful account, because at some point it's like, you know what, maybe not that username anymore.

[00:03:48] Speaker 1: Yeah, yeah. So, Dan, if people are not familiar with you, and I'm going to guess a significant chunk of our listenership is: kind of one of the biggest touchstones for you recently was you put out a video about the NFT craze that a lot of people have credited with helping to kill it, me among them.

[00:04:11] Speaker 2: It's a wonderful, wonderful video.

[00:04:12] Speaker 1: Line Go Up was the actual title... Line Goes, yeah, Line Goes Up. Very good breakdown of how they work, why it was a con. And you've been doing, you know... I think kind of the first, really the first time I became aware of you was you did a flat Earth documentary, which is, is very good. You know, I did an article recently on, like, AI kids' books that was partly inspired by an investigation you did into these kind of Audible-slash-Kindle grifters, the Mikkelsen Twins. So you do that kind of thing, right? Like, you basically, you kind of run across things that are troubling or confusing to you, and then you investigate them to a pretty impressive extent and put together very clear, uh, video investigations. You know, that's, that's, uh, I think, in a nutshell, probably pretty accurate.

[00:05:01] Speaker 3: Yeah, that's kind of where the, where the channel's ended up. Yeah, it's been a few different things over, over the years, but that's kind of the phase that it's in right now, is this kind of, like, I don't know, like, yeah, documentarian place.

[00:05:16] Speaker 1: Yeah. And, you know, a lot of your stuff seems to focus on basically, like, the topic of kind of online grifter culture, and, and sort of its intersection with, like, different weird cultic milieus. Like, there's kind of a cross, especially, like, with NFTs, a real big crossover betwixt the two, right? Like, it's kind of... I think a lot of crypto culture was kind of this intersection between old-school cons and kind of internet cult dynamics. So I wanted to talk today about the problem of scam culture in the United States. Because, and I've been looking into this, by any sort of, like, objective reckoning, there are more scams and more fraud right now in the United States than at any point previously, and basically from all sides. Like, phone scams are at the highest rate they've ever been.

[00:06:15] Speaker 2: People are getting...

[00:06:16] Speaker 1: Like, like, I think the statistic I've got is that March of twenty twenty-one, I think, is kind of when that peaked, at, like, four point nine billion robocall scams, which is just kind of an outrageous increase over where it was a few years ago. The rate of, like, fraud against elderly people seems to be at an all-time high, at least in terms of dollar amount. One of kind of the unsettling quotes that I came across when I was looking into the degree to which old people are being scammed... and it's often through various email scams that are kind of based on gaining trust, or frightening them that, like, someone else is trying to scam them and so they need to act on it. Anyway, the quote I came across was a regulator talking about this and being like, yeah, it's no longer, like, small or even medium-dollar cons. People are stealing generational wealth. Which was really interesting to me. And then there's kind of, like... phishing attacks are at pretty close to an all-time high. I mean, there's a, I'll send you, there's a graph that I came across in a, what was the source on this, in a Comparitech article that, I mean, it's just a straight line up from January twenty nineteen to, like, the end of twenty twenty-two. And so I'm, I'm kind of looking at all this, and there's a couple of different causes, right? Like, some of the stuff the FCC did under Ajit Pai gets blamed for why it's gotten even worse with, with phone scams, although that's not the whole story. AI-powered tools have been a big part of, like, why phishing attacks have increased so much. But then, like, you've got, like, the degree to which the elderly are being conned, which is, like, at this intersection of a few different things: how much more online old people are today than they were, you know, ten to fifteen years ago.

[00:08:08] Speaker 3: The population bubble, yes, you know, multivalent dynamics.

[00:08:12] Speaker 1: Yeah. But kind of the commonality is that, like, scams are all around. Like, people... we're all kind of being assaulted by scams.

[00:08:21] Speaker 3: I mean... I just, I just grabbed my phone, and, like, of my last, like, fifteen text messages, maybe it's like: "Your pickup is available," a couple, like, three or, you know, four with actual friends, and then: "Interac: you just received an Interac e-Transfer." "Hi, long time no talk." "Just got your money. Okay, we'll send soon." "Three hundred and eighty-nine can now be routed to your institution. Submit." "Why the Canada Revenue Agency has sent you money." "Your verification code for Visa transfer is..." And it's like, it's like, it's just constant, constant. I hadn't even... because, like, I'm not going to be shocked at all if I get a phishing text message during this conversation.

[00:09:11] Speaker 1: Yeah. Like, between my email and, and, like, my text messages, or just phone calls, right? Every day I get two or three calls from Scam Likely, you know. Yeah, my good friend Scam Likely. Yeah, my old buddy, he's always got, always got something cooking. But yeah. And I started focusing on this more a couple of months ago, because, you know, I had vaguely noticed, boy, it's just, like, nothing but fucking scams coming into me through my phone these days. And then a couple of months ago, I got a phone call from my bank, and it was one of those things... like everyone else, my cell phone lets me know when, like, a call is from Scam Likely, or when it's from, like, you know... and it had the name of my bank on there. It was the right number, you know. So I pick it up, and a human being is on the line, and they're like, is this, you know, Robert Evans? And I'm like, yeah. And then they're like, we've seen some, like, fraudulent activity on your account.
Can we ask you a couple questions? And that is... I've gotten that call before, legitimately, you know. It's not a weird thing for your bank to be like, hey, let's talk about these...

[00:10:17] Speaker 3: Are you in the country right now? It's like, no, I'm, I'm in New York. It's like, okay, so we're seeing activity out in New York. That's you?

[00:10:23] Speaker 1: Yeah. Did you just buy something in Florida? No, I never go to Florida. But, uh, so yeah. So... and I didn't actually get to see where they were going with this. Nothing suspicious had happened. But, like, after they say that, I'm like, okay, yeah, like, what's, what's the charge? And then my phone disconnects, right? Like, you know... again, where I live, you know, Oregon, is the middle of nowhere, so, like, sometimes connectivity is not great. So I call them back, you know, and I get on the phone with a person, and they're like, yeah, what seems to be up? And I'm like, well, you guys called me saying that, like, there was some possibly fraudulent activity that we needed to talk about. The lady on the other end is like, no one here called you. Like, I'm looking at your record. I can tell when someone's getting a call. We don't have any record of that. And I explained what happened, and she, like, goes back, talks to a supervisor, and is like: so, that was a scam. This is something we've seen more and more lately. They're able now to actually just spoof our banks. And this is... my bank is a significant-sized institution. They're able to spoof our phone number now, and so you can't tell through the caller ID. And, like, it was this whole thing where, like, obviously I know, don't give certain things over the phone, even if they're pretending to be your bank. We never got to that point where anything was actually compromised, but it was just like, well, shit. Like, what are you...?
This is now well beyond a thing where, like, you're getting called and someone's offering you, you know, to make a bunch of money, you know, holding a Nigerian prince's wealth or something, if you let it sit in your bank account. This is: your bank calls you, and your phone tells you that it's your bank, and a human being who sounds just like the bank teller... like, it's, it's gotten so... And I think, kind of the broad... obviously, each of these individual vectors by which scamming has increased is a worthy story, and a separate story in a lot of ways, but they also come together in this, like... well, you know, it's not, it's not, like, weird at this point to note that everybody seems angrier, and everybody seems paranoid, and you hear more stories about, like, people opening fire on folks pulling into their driveway to turn around. And there's, you know, that story... obviously there's guns and stuff that's also connected to that. But I wonder how much of the paranoia and anger is at least exacerbated by the fact that everyone is fighting off a million scammers at all times.

[00:12:48] Speaker 3: Yeah, I think that's a good observation. Like, we're just seeing this erosion of public trust in reality. Yeah. And, and some of that is, like, deliberate and political. Uh, and a lot of it is just coming from, like, the fact that technology has enabled spam in, in unprecedented new vectors, and the fact that you can, like... that you can automate bombarding people with noise. Uh, it's just kind of... it's eaten away at all of us, because it's like, how do I, how do I trust anything? I mean, so, like, and this is the thing, it's like: okay, so I've been, I've been keyed into this, and thus paranoid, for, like, a decade now. So if I get a message that's like, you know, like that, from my bank, if it comes in text, then it's like, I don't interact with the original thing that it came from.
I then go, like, on the website... it's like, all right, I, like, call my bank to, uh, to, to inquire about it. Like, never, you know, it's like: never communicate through the channel that you were first contacted in.

[00:14:03] Speaker 1: Yeah, if you're dealing with your bank. And it's like, and it's like, but is that level of paranoia healthy? And it's like, that takes... that also takes effort.

[00:14:11] Speaker 3: That means you have to have the foresight to be like: do not panic, see the thing, process it consciously, go somewhere else, and, like, you know, activate a different channel. You know, if they contact you through text, you know, go through, like, email, or, or, like, live chat. If they contact you through email, call them on the phone... not with the phone number that was at the bottom of the email, go to the website. And it's like, that's effort, that's effort. I don't even have that much energy in me sometimes, and a lot of other people just, like, absolutely don't. And, and that leads to, like, just exhaustion, vulnerability, you know, all of the things that feed into, like, paranoia, distrust, et cetera, et cetera, et cetera, et cetera. And it's, it's relentless. Like, online advertising is basically useless at this point. Oh yeah. Because, like, if you ran, if you ran a legitimate ad, you know... unless you have the money to run, like, a real, proper, you know, basically TV commercial... yeah, like, banner ads: I haven't... I've seen, like, I don't know, one legitimate banner ad, for, like, a car company, in the last year. Everything else is, like, a hearing aid scam, or, you know, liquefy your belly fat using the metaverse.

[00:15:38] Speaker 1: Yeah. And it's, it's this constant... like, number one, it's led me to the situation where, when I see an ad on social media in particular, but with any sort of, like, print and online ad, my assumption is it's probably a con, right?
Yeah, even if it's like, oh wow, that shirt looks nice... well, that company is probably not going to ship me that shirt, right? Like, or it won't be right. Like, yeah, that photo is absolutely not from the company that is running this ad.

[00:16:03] Speaker 3: Yeah. You just, you assume... you distrust as a, as a first measure. You see, you see a banner ad, you see the aesthetics of advertising, and the assumption is that it's like, ah, that's gonna, you know, that's gonna get me to sign up for some subscription that's going to be buried in, like, recurring payments that, yeah, I will never be able to cancel.

[00:16:25] Speaker 1: And it's, it's interesting, because, I mean, I'm not sure if this has been your experience, but, like, I can acknowledge, I think I morally have to acknowledge, like, part of my success financially as a creator has been as a result of that. Because one of the things that we've seen in the ad market is that text ads, ads for print and shit, do not work, do not function in any way, shape, or form, and a lot of, like, random internet ads don't work well. But creator ads work well, and so there's money in it, right? Because people listen to, like... people have a degree of, like, okay, well, this is, like, number one... it's just, like, the process of consuming a YouTube video or a podcast is different from an article. But, like, the ads work better, because it doesn't feel the same as, like, the scrum of shit that, like, is getting pushed into every conversation you have on Twitter.

[00:17:17] Speaker 3: Yeah, a human that I can confirm exists has at least taken...

[00:17:21] Speaker 1: ...a look at this. Like, this is at least, like, not a complete con or whatever, right? Yeah. Or...

[00:17:29] Speaker 3: If it is, then, like, then the host has also, has also been kind of, like, conned by that, and whatever. Like, it ends up... yeah, we're in this together.
Like, it ends up being at least, like, a little bit sort of, sort of distanced from that. You know, like the freakin', you know, buy a square foot of land in Scotland and become a lord, and that whole thing is, like, a scam being run out of China. Yeah.

[00:17:57] Speaker 2: Yeah.

[00:17:58] Speaker 3: I mean, one of the, one of the kind of, like, weird ironies for me is, it's like: okay, so Line Goes Up came out while the crypto ecosystem was in its, like, biggest ad blitz ever. You know, they had the Super Bowl ads coming up just, like, a month later... actually weeks, like, weeks afterwards. And so, you know, like, the vast majority of the, like, mid-roll ads that ran on that video were Crypto.com, were FTX, were Binance. And, and the ad rates that they were paying, like, the CPM that they were paying to run on crypto-relevant videos, was insane. It was, like, it was like twenty eleven all over again.

[00:18:46] Speaker 1: Twenty eleven was the only good time to be in digital content creation. There's, like, a lot that's unsettling about that. I think one of the things that is, like, most frustrating to me is the degree to which it's meant that we've, we've gone backwards. Like, there was this... people who, like, study tech, and kind of the way socialization around big tech works, talk about this thing called the trough of disappointment, right? Which is, when you get a new technology, everybody... we're in, like, the hype phase for, like, AI right now, right? Yeah. And then, at a certain point, it becomes clear which aspects of the hype were right, you know, the degree to which the technology is capable of doing things that kind of the evangelists were claiming, and to what extent the hype was wrong, right? In what areas is the tech always going to fall short? And that's called, like, the trough of disappointment, when people start to reel, and then, you know, things kind of are supposed to level after that. The dot-com is not, in fact, magic. Yeah, exactly, exactly.
You can't just keep shoveling money into this shit forever in the hopes of exponential returns. Or, as a consumer, like, at a certain point... I can remember the time when phones were exciting, and I was, especially as a journalist, like, really interested every new year in, like, what new things they're capable of. And then, after a couple of years, it was like, well, every phone is, like... there's no difference now. There's no excitement in getting a new phone. It's just like, well, my old phone is broken, so I need a new phone. But, like, I'm not like, wow, the new capabilities of this device. But I feel like there's another... I don't even, I don't really know what to call it. But there's also this kind of thing where we... the internet helps to create, or is the method through which is disseminated, a new labor-saving device, and then the scams reach such a density that the amount of labor you're able to save is minimal, right? Like, that's, that's... I feel like that's at least one of the things that I've noticed, especially with, like, digital communication, with just communication in general, right? My smartphone made it easier to stay in touch all the time, and now my smartphone, like... obviously, I still carry the damn thing everywhere, but, like, my text messages are mostly scams, and my emails are mostly scams, and most of the calls that I get are scams.

[00:21:13] Speaker 3: Yeah, yeah. I've actually been finding myself drifting back towards email as a communication medium, just because the spam filters are better, you know, mature and sophisticated, and for the most part they work. Yeah, like, they're... yeah, it's like, people can actually reach me by email.

[00:21:32] Speaker 2: Yeah, that's pretty cool.

[00:21:36] Speaker 3: And, you know, like, there's, there's a whole tech... really, kind of the big thing is, there's this whole technological element to it.
And, you know, when you sort of pitched the idea of this conversation, the first two places my brain went to were John Romulus Brinkley, the goat testicle doctor... yes, yes, yes... and pioneer of new media, radio, and, of course, uh, Marshall McLuhan. Yeah, you know, like, those, those were the two things that my brain immediately was like, this is, this is sort of, like, relevant to it. Because, like, Brinkley was... he was a pioneer of radio. He, he absolutely advanced sort of the format of, like, what radio could be, and how you could use radio to not just extract money from people, but get them onto your side, such that after they have given you their money, like, they're not just your, your victims. You're, you're not just rolling into town and selling them some, some snake oil and then, like, skedaddling as fast as possible. You have made them into your fans, into your followers. And, you know, the way that he did that was by connecting his scam to, like, a sense of identity. You know, he wasn't just this fake doctor. He was also, effectively, a pastor.

[00:23:02] Speaker 1: Yeah, yeah. People who would defend it after there was no longer any chance of them, like... after it had been sort of proven that the thing that he was promising was not real, right? Yeah, like, once there was no more... It's almost like, you know that play, The Music Man? Yeah, yeah. If you at home know it or not... like, I'm not a huge musical theater guy. This is a pretty famous play. But, like, the basic idea is... this con man comes to town, tells everyone he's gonna make, like, a big band, and raises money for it, and his plan is to, like, take the money and run. It's kind of what the monorail sketch in The Simpsons is based on, to a significant extent. And if I'm remembering correctly... I shouldn't have brought this up, maybe, because I'm actually not that knowledgeable about musical theater.
But my recollection of the way it goes is that, like, he falls in love or some shit and feels bad, and, you know, they wind up... he winds up becoming not a con man. But, like, I think the modern version of that is: he just, he gets people to, like, adopt as a religion the idea that these fucking trombones and uniforms and tubas and shit are on the way, and, like, you know, then they attack the local newspaper and string a journalist up in the center of town for telling them that they're ten years into this and the city hasn't started a band... anyway, whatever. Yeah, yeah.

[00:24:17] Speaker 3: And so, so, the other... McLuhan, you know, his, his famous postulate, the medium is the message, which remains a radical observation to this day, is just... it's this assertion that the medium itself is more important than any given message on it, or even the, like, the, the combined weight of the individual messages. Now, I think in some regards McLuhan kind of went, like, overboard with that, because he said that, it's like, content doesn't matter at all, and it's like, ah, I think content matters. But the point still stands that, like, the medium itself, the, like... the invention of radio, the invention of television, the invention of the internet, the invention of social media, had a bigger, like, has had a bigger impact than any given thing on it. Because that's the thing that ultimately we warp our lives around, that we restructure our homes around, we restructure our physical environment around, we restructure how we spend our days... like, our time usage gets warped around the medium itself, and thus the medium becomes the portal for information to travel through.

[00:25:31] Speaker 1: Absolutely. And it also... I mean, I think there's an extent to which that is true of kind of the way parasocial dynamics impact things like political belief.
I think there are a lot of people, and I think there are a lot of things that people... a lot of, especially when it comes to, like, radical politics, that people adopt because somebody who they had come to already like expresses those politics, right? And so something that maybe never would have gotten any purchase with them suddenly is able to get purchase with them, because, like, a dude, or a lady, that they had a parasocial relationship with expressed this kind of stuff. And it just... it's not that it, like, hacks their brain. It's not that, like, people are, you know, little robots. It's that, uh, this is kind of the way influence works. It's the same reason why, like... people often wind up believing similar things to their parents, or similar things to their friend group. You know, if your friends are all saying... like, you know, on the positive end of things: if you grow up like, like, I did... I don't know about your high school, but...

[00:26:41] Speaker 3: I was just thinking the same thing. Like, my vocabulary in high school...

[00:26:43] Speaker 1: Was... yeah, yeah. There is, there's a slur that starts with F that was, like, every third word out of not just my mouth, but everyone I knew. No, the movie Superbad captures this to a significant degree of fidelity, to be honest. That's just the way shit was in, like, the early aughts. And then, you know, the people I hung around with... suddenly there were more people who were openly queer, and suddenly people weren't talking that way, and I stopped talking that way. It's just how people are.

[00:27:12] Speaker 3: Yeah. Just from, like, somebody that I admired being like, hey, I don't say that. I don't think that's cool. Oh... that is kind of fucked up.

[00:27:20] Speaker 1: Yeah. And then, you know, and it's, it's not an... like, yeah, I can, I can go to the subreddit for my show and see people being like, yeah, I started getting interested in, like, anarchist politics and history and stuff because of something Robert said. And I don't think that's bad, because I think anarchist history and politics are useful even if you're not an anarchist, right? It's valuable to understand that history. It's often undertold. But this is the same dynamic. This thing which has benefited me, and to some extent benefited some of the ideas that I think should be more widely known, this is also why there's more Nazis, right? Like, it, it cuts every which way.

[00:27:57] Speaker 3: And so, so... McLuhan. You get these new... you get the internet, you get the subdivisions of the internet, like, you get social media, you get email, you get, you know, instant messaging and whatnot. And because those technologies have this gravitational effect around them that alters the trajectory of how we structure our lives, they become... because they are potent, because they are valuable communication vectors, they become prime targets for grift. And the thing is, is that all of these technologies that have accelerated communication... you know, people have long been pointing out, like, the negative impacts of social media, and just, like, the, the effect on, like, self-esteem, self-perception, of just being exposed to other people's curated, idealized versions of themselves so constantly. You know, it's like... that's already, you know, uh, impactful in potentially negative ways. Uh, and that's when you're dealing with, like, real people. Uh, but then you add on to that, it's like, oh, you go on Instagram, and, like, you can be following a bot and not even know it. You know, you're, you're getting, you know... and the algorithm is going to float this stuff.
And so, particularly if you're looking down these, like, these addictive infinite-scroll feeds, uh, you know, you don't have the filter of pre-interaction to, to gauge those things. So, like... so, like, I follow you on Twitter, and I know that if I see, like, oh, Robert Evans has retweeted this thing, that it's like, okay, so, like, he's taken a look at it, uh, and, and it's been through, like, the filter, the filter of his brain, and so I can probably just, like, you know, take my trust in that thing up, like, one notch, right? But if I'm just, like, scrolling down the, like, the algorithmically curated "this is what our computer has determined is similar to things that you have already looked at"... it's, it's just, it's so much more fraught. But it's really easy to be complacent and just be like, oh, I trust this thing, I trust this platform. And that's where we get into the trough of disappointment. It's, yeah, this, like: I trust these algorithms. These algorithms do a really good job of, like, oh, I watched Dan, and so the YouTube algorithm introduced me to, like, a bunch of other really good creators. Cool. Uh... oops, I watched one video on flat Earth, and now my, my recommends are full of, like, COVID denialism and anti-maskers and, you know, the trucker movement, and, and all of these other, like, wedges to just sort of slowly rot in my brain.

[00:31:03] Speaker 1: Yeah. It's, like... it's, it's like kind of the way our parents, or DARE or whatever, told us drugs worked, you know, when we were little kids. Where someone's like, oh, you want some pot? Here's some straight-up heroin, right? Like, you want some of this too? Like, you want some crack cocaine? No. And I... it is, you know... you were talking about, like, yeah, you see, you follow someone and you see them share something, and if they're a trusted source for you, you know, it bumps it up a notch.
And even that, you know, that's the way it, like, it works for me as well. But there's a degree to which I find it, like, problematic, especially because, like, we all fuck around on the internet too. I had a thing go crazy viral recently where, like, someone, someone posted an obviously photoshopped image of, like, a control... like a Logitech controller... at the bottom of the, of the sea, and was like, look, the controller survived. And I, like, I shared it to make a joke, right? And the joke was that, like, well... the joke was that, like, well, the controller, we're gonna find out, was one of the more functional things about that terrible sub. And that was... and I even posted underneath it, obviously this is not a real image, guys. But, like, then I saw, like... I wound up finding... it went... the post that I did of it went so viral that, like, it wound up, like, screen-capped in some different Reddit communities for people to talk about, and it was only the first post, not the one where I was like, obviously this is fake, and, like, it was a joke. You know, it was, it was a, it was a shit post. We were bantering online. Yeah, yeah, yeah. But also I'm like, I wonder how many people now think that literally there's a Logitech controller that they found at the bottom of the sea, because of that? What is... what are the ethics now of, like, making a... making a, a jape, as, like, somebody who's got, like, a following? Like, where does that come into it? And, like, I don't know, and I don't... I'm certainly not, like, clear on it, because I seem to be incapable of not shit posting. I spent too much time on Something Awful as well.

[00:32:57] Speaker 3: But it's so... it's so hard to give up. I miss it. I miss the days when I could just make, like, tasteless jokes on, on, on Twitter, and, you know, a couple hundred people would see them and go, like, that's funny. Uh...
615 00:33:12,160 --> 00:33:14,880 Speaker 3: And now it's like, ah, if I'm a little 616 00:33:14,880 --> 00:33:17,600 Speaker 3: too ironic, someone's going to be like, oh crap, 617 00:33:17,760 --> 00:33:20,320 Speaker 3: are you... like, did that happen? It's like, no, no, that 618 00:33:20,360 --> 00:33:22,200 Speaker 3: did not happen. That did not happen. This is, 619 00:33:22,280 --> 00:33:22,880 Speaker 3: this is fake. 620 00:33:22,960 --> 00:33:25,160 Speaker 2: I am, I am telling you lies. It was 621 00:33:25,200 --> 00:33:26,400 Speaker 2: a bit, it was a bit. 622 00:33:26,880 --> 00:33:30,120 Speaker 1: I was doing a bit. But no, and that's, like, 623 00:33:30,520 --> 00:33:33,960 Speaker 1: you know, Something Awful, which is kind of the, 624 00:33:33,960 --> 00:33:37,240 Speaker 1: the digital... it's like the... oh crap, now I've forgotten 625 00:33:37,240 --> 00:33:38,600 Speaker 1: a very basic science term, you know. 626 00:33:38,560 --> 00:33:41,040 Speaker 2: The big, the big puddle of boiling goop 627 00:33:41,120 --> 00:33:45,480 Speaker 1: that life came out of. The primordial... it's the primordial 628 00:33:45,520 --> 00:33:49,880 Speaker 1: soup of digital culture. That's what Something Awful was. It 629 00:33:49,960 --> 00:33:53,200 Speaker 1: was a forum website that gave birth, in various ways, 630 00:33:53,240 --> 00:33:56,640 Speaker 1: some direct and some indirect, to 4chan, to Reddit, 631 00:33:57,560 --> 00:34:00,000 Speaker 1: to Twitter culture, you know, to all of these different... 632 00:34:00,080 --> 00:34:03,320 Speaker 1: to Anonymous... to all of these different things 633 00:34:03,440 --> 00:34:07,840 Speaker 1: you can trace a lineage back to Something Awful. And 634 00:34:08,440 --> 00:34:11,280 Speaker 1: the motto of that website, as written by the terrible 635 00:34:11,320 --> 00:34:13,920 Speaker 1: person who founded it, was "the Internet makes you stupid." 636 00:34:15,080 --> 00:34:20,640 Speaker 1: And at the time, what that kind of meant was... 637 00:34:20,960 --> 00:34:23,880 Speaker 1: and if you're younger, or if you just weren't very 638 00:34:23,960 --> 00:34:26,040 Speaker 1: online in the late nineties, early two thousands, you may 639 00:34:26,080 --> 00:34:28,440 Speaker 1: not remember this, but there was a fairly 640 00:34:28,480 --> 00:34:32,120 Speaker 1: long period where the default assumption in regular society was, 641 00:34:32,840 --> 00:34:37,880 Speaker 1: whatever happens online doesn't matter, right? Like, it can't matter. 642 00:34:38,000 --> 00:34:42,760 Speaker 3: Probably fraudulent, it's almost certainly, like, made up. 643 00:34:42,960 --> 00:34:45,400 Speaker 3: You can't trust anything online. 644 00:34:45,160 --> 00:34:47,960 Speaker 1: And real people are not on the internet, right? Like, 645 00:34:48,480 --> 00:34:51,359 Speaker 1: it's kids, it's nerds, but, like, you know, guys who 646 00:34:51,440 --> 00:34:54,160 Speaker 1: run banks aren't online. You know? Like, the idea that 647 00:34:54,200 --> 00:34:56,359 Speaker 1: the richest man in the world would spend all of 648 00:34:56,400 --> 00:35:01,879 Speaker 1: his time shit posting was absurd. Like... so. 649 00:35:02,840 --> 00:35:05,560 Speaker 3: He really should be busier than he observably is, eh? 650 00:35:05,680 --> 00:35:07,799 Speaker 1: It certainly seems like it, although I guess so should 651 00:35:07,840 --> 00:35:10,000 Speaker 1: I, if
652 00:35:09,920 --> 00:35:14,240 Speaker 2: I'm being fair. But yeah. 653 00:35:15,600 --> 00:35:21,319 Speaker 1: There's this, uh, this degree to which digital culture is 654 00:35:21,360 --> 00:35:24,719 Speaker 1: still very much... a huge chunk of it, like, we 655 00:35:24,760 --> 00:35:27,200 Speaker 1: all want it to not matter. We all want a 656 00:35:27,200 --> 00:35:30,240 Speaker 1: place where we can just shit post and bullshit, because 657 00:35:30,239 --> 00:35:33,279 Speaker 1: shit posting and bullshitting come out of, like, the very 658 00:35:33,320 --> 00:35:36,120 Speaker 1: same impulses that, like, determine a lot of how we 659 00:35:36,160 --> 00:35:38,640 Speaker 1: interact with, like, our friends, right? You know, we all 660 00:35:38,640 --> 00:35:40,600 Speaker 1: need some times where you can just sit down, have 661 00:35:40,640 --> 00:35:44,000 Speaker 1: a couple of beers or whatever and, like, say shit 662 00:35:44,120 --> 00:35:46,440 Speaker 1: with your, with your buds, you know, and it's not, 663 00:35:46,960 --> 00:35:50,480 Speaker 1: it's not being recorded, it's not going up everywhere forever. 664 00:35:50,600 --> 00:35:53,160 Speaker 1: You can just kind of, like, talk. It's a, 665 00:35:53,480 --> 00:35:56,080 Speaker 1: it's a field, almost... social experimentation is a 666 00:35:56,160 --> 00:36:00,359 Speaker 1: huge part of maturity, of growing up, becoming a person. Uh, 667 00:36:00,400 --> 00:36:04,840 Speaker 1: and I think we all get... there's a 668 00:36:04,880 --> 00:36:08,080 Speaker 1: degree of, like, the accessibility of the Internet that makes 669 00:36:08,160 --> 00:36:11,160 Speaker 1: that impossible to entirely get over, even though it is 670 00:36:11,200 --> 00:36:15,200 Speaker 1: demonstrably untrue. What happens on the Internet matters quite a lot, 671 00:36:15,560 --> 00:36:19,080 Speaker 1: and you can have a real, significant... you can influence 672 00:36:19,120 --> 00:36:22,279 Speaker 1: your own life in very negative ways by saying 673 00:36:21,960 --> 00:36:24,000 Speaker 2: the wrong thing on the Internet at the wrong time. 674 00:36:24,280 --> 00:36:38,000 Speaker 3: Yeah, I mean, lots of people have observed just this 675 00:36:38,080 --> 00:36:40,200 Speaker 3: fact that it's like... not 676 00:36:40,239 --> 00:36:42,799 Speaker 3: on Reddit, I mean, on Reddit too, but, you know, 677 00:36:42,880 --> 00:36:47,560 Speaker 3: on Twitter, like, once, you know, once a month, Twitter 678 00:36:47,640 --> 00:36:55,560 Speaker 3: elects some ten-follower anime profile pic with, 679 00:36:55,560 --> 00:36:58,759 Speaker 3: with a single tasteless joke, and makes it the 680 00:36:58,800 --> 00:36:59,719 Speaker 3: fulcrum of reality. 681 00:37:01,200 --> 00:37:04,879 Speaker 2: Yeah, and it's like... that's... and 682 00:37:04,880 --> 00:37:07,120 Speaker 3: the thing is, I don't think this is actually that far 683 00:37:07,160 --> 00:37:11,759 Speaker 3: off topic, just because, like, it's this warping of reality, 684 00:37:11,880 --> 00:37:15,520 Speaker 3: this warping of, like, what is real, what is trustworthy, 685 00:37:15,920 --> 00:37:19,160 Speaker 3: what are the, like, impacts of things. And the fact 686 00:37:19,200 --> 00:37:23,719 Speaker 3: that, like, you know, a ten-follower account can 687 00:37:23,760 --> 00:37:29,680 Speaker 3: become international news.
Yeah, it has to sit alongside the endless 688 00:37:29,680 --> 00:37:35,799 Speaker 3: bombardment of dick pills and global leaders. Like, I had 689 00:37:35,800 --> 00:37:37,759 Speaker 3: this joke that I was trying to formulate over the 690 00:37:37,800 --> 00:37:42,279 Speaker 3: weekend, of, like, World War Two with Twitter, where it's like, 691 00:37:42,440 --> 00:37:45,279 Speaker 3: you know, just a joke hinging on the idea that 692 00:37:45,360 --> 00:37:49,000 Speaker 3: some, like, follower bot would observe this, like... ah, it's like, 693 00:37:49,320 --> 00:37:51,279 Speaker 3: you know, two posts in a row, like, it's like, 694 00:37:51,320 --> 00:37:55,320 Speaker 3: the USSR has rolled into Berlin. Stalin has unfriended the President. 695 00:37:55,560 --> 00:37:59,719 Speaker 3: I hope this doesn't mean anything. You know, it's 696 00:37:59,719 --> 00:38:04,200 Speaker 3: like... you have, like, you have international politics happening 697 00:38:04,760 --> 00:38:09,000 Speaker 3: in the same space as fake international politics, in the 698 00:38:09,000 --> 00:38:13,000 Speaker 3: same space as just, like, this endless bombardment of, you know, 699 00:38:14,200 --> 00:38:20,239 Speaker 3: of curated reality, fictionalized reality, unreality, and spam, and no 700 00:38:20,239 --> 00:38:23,680 Speaker 3: one knows what's real anymore, no one knows what to trust, 701 00:38:23,760 --> 00:38:27,160 Speaker 3: and the instinct in a lot of people is to 702 00:38:27,280 --> 00:38:29,799 Speaker 3: just give up trying to parse the difference. And that 703 00:38:29,960 --> 00:38:32,480 Speaker 3: makes us, like, increasingly vulnerable. 704 00:38:33,520 --> 00:38:36,320 Speaker 2: Yeah. And I think 705 00:38:37,960 --> 00:38:42,360 Speaker 1: a big part of what's, what's kind of at the 706 00:38:42,360 --> 00:38:45,000 Speaker 1: core of the problem here is, what you've said here 707 00:38:45,080 --> 00:38:46,840 Speaker 1: makes us vulnerable. The degree to which this can be 708 00:38:46,880 --> 00:38:51,279 Speaker 1: weaponized is really significant. Like, the, you know, one of 709 00:38:51,360 --> 00:38:52,920 Speaker 1: the things that we saw that I think is kind 710 00:38:52,920 --> 00:38:57,080 Speaker 1: of low-key a significant moment in sort of info-conflict 711 00:38:57,120 --> 00:39:01,239 Speaker 1: shit is this last weekend, last weekend from, you 712 00:39:01,280 --> 00:39:04,879 Speaker 1: know, where we're talking now: there was a mutiny by 713 00:39:05,000 --> 00:39:09,840 Speaker 1: the Wagner mercenary forces in Ukraine and southern Russia against 714 00:39:09,840 --> 00:39:12,840 Speaker 1: the Russian government, or at least that's what it appears 715 00:39:12,840 --> 00:39:15,120 Speaker 1: to have been. Now, right, this is Russia, a lot 716 00:39:15,160 --> 00:39:16,719 Speaker 1: of this is really weird, so I'm not going to 717 00:39:16,800 --> 00:39:20,120 Speaker 1: say we know... we don't. We certainly don't know entirely, 718 00:39:20,200 --> 00:39:23,200 Speaker 1: like, what happened there, like, what's going on there. But 719 00:39:24,239 --> 00:39:27,760 Speaker 1: a couple of things happened very quickly.
For one, folks 720 00:39:27,800 --> 00:39:29,880 Speaker 1: on the right, and there were also a lot of, 721 00:39:29,960 --> 00:39:33,480 Speaker 1: kind of, like, shithead left people who adopted this 722 00:39:33,520 --> 00:39:37,160 Speaker 1: too, decided that liberals were cheering on the head of 723 00:39:37,200 --> 00:39:42,000 Speaker 1: Wagner, Yevgeny Prigozhin, because, like, they believed he was a 724 00:39:42,040 --> 00:39:44,040 Speaker 1: reformer, and that, like, they'd all thought this guy, who 725 00:39:44,080 --> 00:39:46,920 Speaker 1: was, like, objectively a piece of shit and a fascist... 726 00:39:47,000 --> 00:39:49,080 Speaker 1: it's like, they're cheering him on because they hate Putin 727 00:39:49,160 --> 00:39:51,640 Speaker 1: so much and they've convinced themselves that he's, you know, 728 00:39:51,680 --> 00:39:53,560 Speaker 1: going to fix Russia. And it's like, no, no, I 729 00:39:53,760 --> 00:39:57,080 Speaker 1: didn't see that. Like, look, I love calling people out 730 00:39:57,120 --> 00:39:59,920 Speaker 1: when they have shitty takes, specifically on this specific war, 731 00:40:00,040 --> 00:40:02,440 Speaker 1: because I've been covering it since twenty fourteen. But, like, 732 00:40:02,680 --> 00:40:04,960 Speaker 1: I didn't see that, and none of the people talking 733 00:40:04,960 --> 00:40:08,560 Speaker 1: about how liberals were doing this provided any evidence of it. 734 00:40:08,600 --> 00:40:11,000 Speaker 1: And it happens all the time, right? Sometimes people will, 735 00:40:11,000 --> 00:40:13,640 Speaker 1: like, take a post that has, like, thirty likes and 736 00:40:13,680 --> 00:40:16,120 Speaker 1: be like, this is what the left is saying. But, 737 00:40:16,239 --> 00:40:18,399 Speaker 1: like, with this, there was even less. Like, I didn't 738 00:40:18,440 --> 00:40:21,560 Speaker 1: see a single post where someone was like, Prigozhin's gonna, 739 00:40:21,600 --> 00:40:24,120 Speaker 1: like, fix, you know, corruption in Russia or whatever. No 740 00:40:24,160 --> 00:40:27,520 Speaker 1: one was saying that. They just invented that this was 741 00:40:27,560 --> 00:40:29,840 Speaker 1: going on. And part of it is that, like, 742 00:40:30,719 --> 00:40:32,520 Speaker 1: you know, the way Twitter works now made it a 743 00:40:32,560 --> 00:40:35,439 Speaker 1: lot easier for disinfo to spread from this thing. 744 00:40:35,520 --> 00:40:38,120 Speaker 1: Like, there was very famously a guy, who is 745 00:40:38,239 --> 00:40:41,400 Speaker 1: absolutely a con artist, who just started sharing a bunch of 746 00:40:41,520 --> 00:40:44,759 Speaker 1: videos from there with, like, bad commentary that was inaccurate, 747 00:40:44,880 --> 00:40:45,520 Speaker 1: and Elon 748 00:40:45,400 --> 00:40:47,080 Speaker 2: Musk was like, this is the guy I've come to 749 00:40:47,120 --> 00:40:48,920 Speaker 2: trust about this. Can you say Elon Musk? 750 00:40:49,120 --> 00:40:52,680 Speaker 1: Yeah, we can say Elon Musk. I don't know, it's 751 00:40:52,719 --> 00:40:54,080 Speaker 1: a problem, Dan. 752 00:40:54,200 --> 00:40:56,319 Speaker 3: So, you beat me to Elon Musk, because I was going 753 00:40:56,400 --> 00:40:58,279 Speaker 3: to say "this, like, con artist was..." But then 754 00:40:58,320 --> 00:41:00,480 Speaker 3: it turns out that he was just retweeting, that is, 755 00:41:00,520 --> 00:41:02,680 Speaker 3: like... of course he got involved anyway.
756 00:41:02,840 --> 00:41:07,759 Speaker 1: Yeah, yeah. And, it's... I don't think that's the 757 00:41:07,800 --> 00:41:10,960 Speaker 1: solution, because we lived... you know, our parents 758 00:41:10,960 --> 00:41:13,239 Speaker 1: and grandparents lived in the day where most people would 759 00:41:13,280 --> 00:41:16,080 Speaker 1: be like, well, you know, folks who are in politics 760 00:41:16,160 --> 00:41:18,040 Speaker 1: maybe need to care about this, I might want to 761 00:41:18,040 --> 00:41:21,280 Speaker 1: get the broad strokes of it, but, like, random people, 762 00:41:21,600 --> 00:41:25,480 Speaker 1: you know, shouldn't be influencing what's going on with these 763 00:41:25,520 --> 00:41:27,880 Speaker 1: international relations. And that's how you get shit like the 764 00:41:27,960 --> 00:41:30,759 Speaker 1: Dulles brothers carrying out coups all over the world on 765 00:41:30,840 --> 00:41:33,120 Speaker 1: behalf of the US government, where most Americans are like, 766 00:41:33,040 --> 00:41:34,160 Speaker 2: what did we do in Guatemala? 767 00:41:34,200 --> 00:41:36,680 Speaker 1: I didn't know we had guys in Guatemala. And that 768 00:41:36,760 --> 00:41:39,960 Speaker 1: wasn't great. But also, this new thing where if you 769 00:41:40,040 --> 00:41:44,840 Speaker 1: are a personality, if you are in media, then you 770 00:41:44,920 --> 00:41:48,200 Speaker 1: are obliged to be a part of every big thing 771 00:41:48,239 --> 00:41:52,319 Speaker 1: that happens everywhere, even if you are demonstrably incompetent at 772 00:41:52,320 --> 00:41:55,360 Speaker 1: that, and everyone is demonstrably incompetent at that past a 773 00:41:55,400 --> 00:41:55,960 Speaker 1: certain point. 774 00:41:56,080 --> 00:41:56,239 Speaker 2: You know. 775 00:41:57,040 --> 00:41:59,759 Speaker 3: Yeah, and that's been... oh boy, has that... that's been 776 00:41:59,800 --> 00:42:00,759 Speaker 3: a lot to deal with. 777 00:42:01,120 --> 00:42:04,160 Speaker 1: And, going back to the original thing that started 778 00:42:04,160 --> 00:42:06,879 Speaker 1: this conversation, that's part of how so many of these 779 00:42:06,920 --> 00:42:12,320 Speaker 1: cons perpetuate, is that, like, people are only competent, including 780 00:42:12,360 --> 00:42:16,960 Speaker 1: famous people, including people with followings, in limited areas, and 781 00:42:17,000 --> 00:42:19,480 Speaker 1: once you get out of your area of competence, it's 782 00:42:19,520 --> 00:42:21,759 Speaker 1: easy to get fooled. And if there's a bunch of 783 00:42:21,760 --> 00:42:23,640 Speaker 1: people who trust you because of the things you were 784 00:42:23,719 --> 00:42:26,600 Speaker 1: right about, then they can very easily get fooled when 785 00:42:26,600 --> 00:42:27,360 Speaker 1: you get fooled. 786 00:42:27,880 --> 00:42:31,879 Speaker 3: One of the big hazards there is that, and this 787 00:42:31,920 --> 00:42:36,760 Speaker 3: is a long-standing observation, hucksters, con artists, 788 00:42:37,360 --> 00:42:40,719 Speaker 3: are going to be more willing than anyone else to 789 00:42:40,880 --> 00:42:45,200 Speaker 3: pretend to be up to date on it. They have 790 00:42:45,280 --> 00:42:47,359 Speaker 3: no compunction about it. It's like, oh, yeah, I know, 791 00:42:47,440 --> 00:42:55,319 Speaker 3: I'm an expert on submarines and Ukraine and Russia and Belarus, yeah. 792 00:42:55,239 --> 00:42:56,440 Speaker 2: You know, so there...
793 00:42:58,680 --> 00:43:00,800 Speaker 3: The reason it's a con man is it's a confidence 794 00:43:00,800 --> 00:43:03,759 Speaker 3: man: they get your confidence because they act confidently 795 00:43:04,239 --> 00:43:07,160 Speaker 3: and give you reason to trust them, 796 00:43:07,320 --> 00:43:09,719 Speaker 3: and they have no moral compunction about lying to you. 797 00:43:10,360 --> 00:43:10,480 Speaker 1: Uh. 798 00:43:10,960 --> 00:43:15,920 Speaker 3: And they are always going to be faster with 799 00:43:16,160 --> 00:43:19,880 Speaker 3: the take, faster with the confident statement, faster with the solution, 800 00:43:20,160 --> 00:43:23,719 Speaker 3: faster with a call to action to 801 00:43:24,120 --> 00:43:25,400 Speaker 3: buy their book or dick pills. 802 00:43:26,440 --> 00:43:29,759 Speaker 1: Yeah. And often, I think, 803 00:43:29,760 --> 00:43:31,439 Speaker 1: one of the things that's made this all so much 804 00:43:31,480 --> 00:43:33,520 Speaker 1: harder to catch and so much more durable is that 805 00:43:33,560 --> 00:43:36,160 Speaker 1: it used to be obvious. You used to 806 00:43:36,160 --> 00:43:37,799 Speaker 1: be able to see, like, okay, well, this guy's a 807 00:43:37,840 --> 00:43:40,799 Speaker 1: con man, but, like, I'm not a person who can 808 00:43:40,840 --> 00:43:44,319 Speaker 1: be conned by someone selling diet pills, that's not my vulnerability, 809 00:43:44,360 --> 00:43:46,719 Speaker 1: so I immediately recognize this guy as a con man. Or, 810 00:43:47,000 --> 00:43:49,000 Speaker 1: I am not a person who can be conned by 811 00:43:49,560 --> 00:43:51,880 Speaker 1: Christianity stuff, because I'm not a Christian, so I'm not 812 00:43:52,640 --> 00:43:55,840 Speaker 1: vulnerable to this con man. And now, so much, the 813 00:43:55,960 --> 00:44:01,280 Speaker 1: cons are downstream of the following and of the fame, 814 00:44:01,520 --> 00:44:04,319 Speaker 1: and so a lot of people are getting taken in 815 00:44:04,360 --> 00:44:06,920 Speaker 1: by con men. And maybe, you know, the fact that person's 816 00:44:06,920 --> 00:44:09,080 Speaker 1: putting in a link for their supplements, you know, 817 00:44:09,200 --> 00:44:12,640 Speaker 1: on every viral post... you don't buy their supplements, but 818 00:44:12,680 --> 00:44:14,920 Speaker 1: they'll come up with something else for you once they 819 00:44:14,960 --> 00:44:18,560 Speaker 1: get you in, once you're in the funnel. Or even 820 00:44:18,560 --> 00:44:21,279 Speaker 1: if they never convince you to buy anything, if you're 821 00:44:21,320 --> 00:44:24,200 Speaker 1: sharing their content, that's bringing more people into the funnel, 822 00:44:24,480 --> 00:44:26,399 Speaker 1: you know. And that really wasn't the case. That wasn't 823 00:44:26,440 --> 00:44:28,759 Speaker 1: the case with, you know, you go back ten years, 824 00:44:28,760 --> 00:44:30,799 Speaker 1: talking about, like, Young Living, right, or some other, like, 825 00:44:30,880 --> 00:44:34,160 Speaker 1: multi-level marketing company where they're selling, you know, essential 826 00:44:34,160 --> 00:44:38,360 Speaker 1: oils with fraudulent health claims. They weren't getting random people 827 00:44:38,480 --> 00:44:41,799 Speaker 1: to spread their business without paying for shit. And now 828 00:44:41,840 --> 00:44:43,840 Speaker 1: you can do that.
If you're a con man and 829 00:44:43,920 --> 00:44:46,640 Speaker 1: you've already got followers because you bought a bunch, and 830 00:44:46,719 --> 00:44:49,320 Speaker 1: you're, you're on the Ukraine shit, you just grab whatever 831 00:44:49,400 --> 00:44:52,759 Speaker 1: videos and say whatever about them, you know, frame them 832 00:44:52,760 --> 00:44:55,000 Speaker 1: in whatever way is likely to get people to share 833 00:44:55,000 --> 00:44:58,120 Speaker 1: them the most. Then suddenly you gain two, three 834 00:44:58,160 --> 00:44:59,920 Speaker 1: hundred thousand followers in the space of a night or 835 00:45:00,120 --> 00:45:02,920 Speaker 1: two, and your ability to scam people and get money 836 00:45:02,920 --> 00:45:05,759 Speaker 1: out of them has increased several times. You know, the 837 00:45:06,120 --> 00:45:10,560 Speaker 1: con is downstream of the, of the platform, right? So, 838 00:45:11,680 --> 00:45:14,560 Speaker 1: you know, you get this guy, and maybe he's 839 00:45:14,560 --> 00:45:17,080 Speaker 1: shilling thing X or thing Y, he's got a couple, 840 00:45:17,200 --> 00:45:20,960 Speaker 1: you know, whatever different cons he has. But regular people 841 00:45:21,200 --> 00:45:23,960 Speaker 1: can be in the business of spreading his platform, 842 00:45:24,040 --> 00:45:27,400 Speaker 1: of increasing his profitability, even if they're not vulnerable to 843 00:45:27,440 --> 00:45:29,000 Speaker 1: the con. Maybe they're not the kind of person who's 844 00:45:29,000 --> 00:45:32,640 Speaker 1: ever going to buy weight loss pills or supplements or 845 00:45:32,680 --> 00:45:36,560 Speaker 1: whatever kind of thing. But if this guy starts, you know, 846 00:45:37,000 --> 00:45:39,760 Speaker 1: sharing all of these videos on the fighting in Ukraine, 847 00:45:39,920 --> 00:45:41,520 Speaker 1: you know, at a moment when it happens to be 848 00:45:41,600 --> 00:45:44,440 Speaker 1: the opportune moment to do that, and they go crazy viral, 849 00:45:44,840 --> 00:45:47,719 Speaker 1: well, then that guy is able to triple his following 850 00:45:47,920 --> 00:45:51,000 Speaker 1: and, you know, and have people who are not interested 851 00:45:51,000 --> 00:45:53,840 Speaker 1: in his con spread his shit, which gets him followers, 852 00:45:53,840 --> 00:45:56,480 Speaker 1: which brings more traffic to whatever the money-generating part 853 00:45:56,520 --> 00:46:02,000 Speaker 1: of the con is. Yeah, it's, it's all a sales funnel. 854 00:46:02,320 --> 00:46:06,759 Speaker 1: Our just, anxiety-ridden lives... we have 855 00:46:06,719 --> 00:46:10,560 Speaker 3: built our society into just like a giant nested series 856 00:46:10,600 --> 00:46:14,560 Speaker 3: of sales funnels. Yeah, I don't know, that's bound to 857 00:46:14,600 --> 00:46:16,799 Speaker 3: be a solid foundation. I don't see how that could go wrong. 858 00:46:16,920 --> 00:46:18,760 Speaker 3: That seems like that will go well for us. 859 00:46:18,960 --> 00:46:20,839 Speaker 1: So, do we have any ideas on how to fix 860 00:46:20,880 --> 00:46:23,839 Speaker 1: it, or should we just, should we just state a 861 00:46:23,880 --> 00:46:25,080 Speaker 1: problem and then run away? 862 00:46:25,480 --> 00:46:28,480 Speaker 3: I mean, the easy thing to do would be to 863 00:46:28,520 --> 00:46:34,200 Speaker 3: restore trust in our public institutions.
You know, if we 864 00:46:34,280 --> 00:46:39,640 Speaker 3: could, uh, have sort of, like... I don't even want 865 00:46:39,680 --> 00:46:43,120 Speaker 3: to say, like, a unifying cause, but just a sense 866 00:46:43,160 --> 00:46:47,640 Speaker 3: of, like, shared commonality, and trust in, 867 00:46:47,800 --> 00:46:52,560 Speaker 3: like, our local, our local society. You know, strong, like, 868 00:46:52,960 --> 00:46:57,440 Speaker 3: not necessarily strong families, but, like, strong family units, constructed 869 00:46:57,760 --> 00:47:01,440 Speaker 3: or natural or however you want to, like, define or construct those, 870 00:47:01,440 --> 00:47:05,160 Speaker 3: but, like, local, with, like, good infrastructure around us, so 871 00:47:05,200 --> 00:47:09,600 Speaker 3: that our physical spaces are, you know, appealing and comfortable 872 00:47:09,640 --> 00:47:12,239 Speaker 3: to live in, and provide us a sense of, 873 00:47:12,320 --> 00:47:14,440 Speaker 3: like, enrichment and fulfillment. 874 00:47:14,760 --> 00:47:15,840 Speaker 2: You know, the easy stuff. 875 00:47:16,400 --> 00:47:20,680 Speaker 3: Yeah, just, just fix infrastructure, fix society, fix media, and, 876 00:47:21,120 --> 00:47:22,839 Speaker 3: uh, and then I think we're good. 877 00:47:23,600 --> 00:47:23,960 Speaker 2: Yeah. 878 00:47:24,040 --> 00:47:26,680 Speaker 1: Yeah, so that's, that's good. So if we fix everything, 879 00:47:26,719 --> 00:47:30,280 Speaker 1: then we won't have any more problems. That's great. We're 880 00:47:30,320 --> 00:47:32,640 Speaker 1: on the same page now. I mean, it is really, 881 00:47:32,719 --> 00:47:35,600 Speaker 1: like... and there's also this 882 00:47:35,680 --> 00:47:37,879 Speaker 1: kind of, like, problematic element of, when you're like, well, 883 00:47:37,880 --> 00:47:40,759 Speaker 1: we want to, like... a problem is that there's zero 884 00:47:40,800 --> 00:47:43,840 Speaker 1: trust in institutions. That's objectively a problem, because it means that 885 00:47:43,880 --> 00:47:46,960 Speaker 1: when, say, the CDC is like, hey, guys, there's this, 886 00:47:47,120 --> 00:47:49,000 Speaker 1: there's a plague, we should probably do this and this 887 00:47:49,080 --> 00:47:51,719 Speaker 1: and this, it immediately becomes a culture war thing. And 888 00:47:51,800 --> 00:47:55,719 Speaker 1: so you can't actually, you can't actually confront serious problems 889 00:47:55,719 --> 00:47:57,600 Speaker 1: the way that you need to be able to confront them. 890 00:47:57,840 --> 00:48:03,319 Speaker 1: It's just not possible anymore.
Likewise... but the other 891 00:48:03,360 --> 00:48:05,560 Speaker 1: issue is that, like, well, for significant chunks of 892 00:48:05,560 --> 00:48:08,680 Speaker 1: the population, there's never been any good reason to trust, 893 00:48:08,880 --> 00:48:11,880 Speaker 1: you know, the institutions, because, you know, they're marginalized groups, 894 00:48:11,880 --> 00:48:15,200 Speaker 1: and whatever, you know, when the institutional trust was higher, 895 00:48:15,280 --> 00:48:17,080 Speaker 1: like, the government was fucking them in this way and 896 00:48:17,160 --> 00:48:21,120 Speaker 1: that way. And yeah, you know, that's also... so I wonder... 897 00:48:21,800 --> 00:48:24,160 Speaker 1: like, I think there's a significant extent to which we 898 00:48:24,239 --> 00:48:28,640 Speaker 1: need new concepts of, like, what an institution is and 899 00:48:29,160 --> 00:48:31,960 Speaker 1: should be. Like, we need... it's, it's such a ground 900 00:48:32,000 --> 00:48:35,200 Speaker 1: floor problem, because, like, I don't know, we're never getting 901 00:48:35,239 --> 00:48:37,920 Speaker 1: back to a point where Americans trust the CDC. Like, 902 00:48:38,000 --> 00:48:40,640 Speaker 1: that's just not going to happen, you know. Like, whatever 903 00:48:40,760 --> 00:48:45,080 Speaker 1: the way forward is on us overcoming 904 00:48:45,120 --> 00:48:48,719 Speaker 1: the anti-vax, anti-science shit around medicine, it's not 905 00:48:48,920 --> 00:48:51,480 Speaker 1: getting everyone to love the CDC. You know, that's just 906 00:48:51,560 --> 00:48:52,960 Speaker 1: not ever going to happen again. 907 00:48:53,120 --> 00:48:53,560 Speaker 2: Yeah. 908 00:48:53,719 --> 00:48:56,200 Speaker 3: And part of the complexity here is that it's, it's 909 00:48:56,239 --> 00:48:59,160 Speaker 3: really easy to sort of say that, you know, it's like, okay, 910 00:48:59,200 --> 00:49:03,960 Speaker 3: well, the solution is, like, strong central institutions, and it's 911 00:49:03,960 --> 00:49:09,360 Speaker 3: like, that's not, that's not correct at all either, because, 912 00:49:09,400 --> 00:49:12,000 Speaker 3: like, I mean, my go-to example for that 913 00:49:12,040 --> 00:49:13,600 Speaker 3: would be, it's like, look at, look at the 914 00:49:13,680 --> 00:49:15,920 Speaker 3: LDS Church, look at Mormons. They have a very, very 915 00:49:15,960 --> 00:49:21,080 Speaker 3: strong central institution that provides this, like, social anchoring point 916 00:49:21,400 --> 00:49:25,360 Speaker 3: for a lot of their lives, and yet Mormon communities 917 00:49:25,400 --> 00:49:30,400 Speaker 3: are incredibly vulnerable to affinity fraud and MLMs. You know, 918 00:49:30,520 --> 00:49:36,200 Speaker 3: like, Utah, Salt Lake, is like the locus of MLM culture, 919 00:49:36,640 --> 00:49:40,880 Speaker 3: and so, like, it's not the sort of, like, strongman, 920 00:49:40,920 --> 00:49:43,319 Speaker 3: like, ah, this is why we need strong, like, 921 00:49:43,400 --> 00:49:48,960 Speaker 3: you know, strong leaders... that isn't the answer, in 922 00:49:49,320 --> 00:49:51,319 Speaker 3: its own way, even if it's a very tempting sort 923 00:49:51,360 --> 00:49:53,400 Speaker 3: of, like, answer to gravitate towards. 924 00:49:53,440 --> 00:49:58,400 Speaker 1: Yeah. And that's, that's... I don't know. I don't actually know.
925 00:50:01,440 --> 00:50:03,839 Speaker 1: Part of the problem is that, like, there are little solutions, right? 926 00:50:03,880 --> 00:50:05,799 Speaker 1: There are little things that you can do, stuff like 927 00:50:05,920 --> 00:50:11,400 Speaker 1: advocating for, you know, a more functional idea of, like... 928 00:50:11,520 --> 00:50:13,719 Speaker 1: a more functional legal definition of, like, what 929 00:50:13,760 --> 00:50:16,839 Speaker 1: an auto dialer is and what counts as, like, illegally, 930 00:50:16,960 --> 00:50:19,640 Speaker 1: sort of, like, flooding phone lines with cons 931 00:50:19,640 --> 00:50:22,440 Speaker 1: and stuff, or restricting, you know, the ability of people 932 00:50:22,560 --> 00:50:26,440 Speaker 1: like bill collectors and stuff to utilize, you know, the 933 00:50:26,480 --> 00:50:28,359 Speaker 1: phone system in some of the ways that they do. 934 00:50:28,480 --> 00:50:31,400 Speaker 1: And that can make stuff better. Just like, 935 00:50:32,680 --> 00:50:35,920 Speaker 1: you know, at a certain point, we will develop tools 936 00:50:36,040 --> 00:50:38,920 Speaker 1: that mitigate some of the harm AI is doing in 937 00:50:39,000 --> 00:50:41,279 Speaker 1: the con space. Some of its ability to automate and 938 00:50:41,719 --> 00:50:44,840 Speaker 1: push shit to people at scale will get reduced at 939 00:50:44,880 --> 00:50:46,760 Speaker 1: a certain point. That will happen, right, because it happens 940 00:50:46,800 --> 00:50:47,280 Speaker 1: with everything. 941 00:50:47,320 --> 00:50:47,480 Speaker 3: You know. 942 00:50:47,520 --> 00:50:48,720 Speaker 2: AI is not unique. 943 00:50:49,680 --> 00:50:52,080 Speaker 1: This is the... it's, it's... you've 944 00:50:52,080 --> 00:50:55,120 Speaker 1: heard the Red Queen hypothesis, right? Yeah. It's kind of, 945 00:50:55,120 --> 00:50:56,680 Speaker 1: like, a 946 00:50:56,719 --> 00:50:59,600 Speaker 1: way of looking at evolutionary theory. There's this point 947 00:51:00,040 --> 00:51:03,120 Speaker 1: in Alice in Wonderland where, you know, the Red Queen 948 00:51:03,200 --> 00:51:06,400 Speaker 1: kind of, like, traps Alice in this situation where, like, 949 00:51:06,480 --> 00:51:08,640 Speaker 1: she's got to keep running as fast as she can, 950 00:51:09,600 --> 00:51:13,440 Speaker 1: but it's, like, a conveyor belt sort 951 00:51:13,440 --> 00:51:15,680 Speaker 1: of situation, so no matter how hard she runs, 952 00:51:15,920 --> 00:51:18,279 Speaker 1: she never gets ahead. Right? And that's kind of the 953 00:51:18,280 --> 00:51:21,759 Speaker 1: way that, like, the evolutionary arms race works, right? Like, 954 00:51:22,160 --> 00:51:24,719 Speaker 1: you know, one animal develops a defense against a 955 00:51:24,760 --> 00:51:27,160 Speaker 1: predator and the predator develops a way around it, and, 956 00:51:27,239 --> 00:51:32,399 Speaker 1: like... that's kind of the best-case 957 00:51:32,400 --> 00:51:35,160 Speaker 1: scenario for how we adapt to cons, I think, actually. 958 00:51:35,200 --> 00:51:38,919 Speaker 1: Like, technology just moves too fast now for us to 959 00:51:38,920 --> 00:51:40,880 Speaker 1: be able to keep up, right? Like, we're not, 960 00:51:41,280 --> 00:51:45,000 Speaker 1: we're not just standing in place. We're consistently falling behind. 961 00:51:45,239 --> 00:51:48,959 Speaker 1: And I don't know, I don't know what we do here, uh.
962 00:51:49,040 --> 00:51:51,400 Speaker 3: I mean, yeah, so, like, there will, there will be 963 00:51:51,480 --> 00:51:56,160 Speaker 3: technological solutions to specific manifestations. I mean, a big 964 00:51:56,200 --> 00:51:59,360 Speaker 3: one, like, in there, like, to not, to not bant, 965 00:51:59,239 --> 00:52:00,759 Speaker 2: is that, you know, the, uh, 966 00:52:02,600 --> 00:52:06,719 Speaker 3: the legal system, the governments... like, governments need 967 00:52:06,800 --> 00:52:10,400 Speaker 3: to do something about the robocalling and the text messages, 968 00:52:10,440 --> 00:52:14,839 Speaker 3: because they're rendering a vital piece of, like, civic infrastructure unusable. 969 00:52:15,239 --> 00:52:21,200 Speaker 1: Yeah, people don't trust their phones anymore, and that's, that's bad, yeah, 970 00:52:21,719 --> 00:52:24,160 Speaker 1: because it means they stop using them. You know, it's like... 971 00:52:24,280 --> 00:52:30,160 Speaker 3: There's, yeah, there's very real, like, consequences. Uh, and we 972 00:52:30,560 --> 00:52:34,279 Speaker 3: need to be able to trust that we're talking to 973 00:52:35,640 --> 00:52:38,320 Speaker 3: people who aren't just trying to get our money. 974 00:52:38,960 --> 00:52:40,440 Speaker 2: Yeah. 975 00:52:40,840 --> 00:52:45,319 Speaker 1: Yep. Well, Dan, anything you want to plug at the 976 00:52:45,360 --> 00:52:49,160 Speaker 1: end of this here? YouTube channel Folding Ideas, everyone should 977 00:52:49,239 --> 00:52:50,680 Speaker 1: check out if you have not already. 978 00:52:51,280 --> 00:52:53,799 Speaker 3: Yeah, the YouTube channel, that's going to be the big one. 979 00:52:54,400 --> 00:52:57,520 Speaker 3: I'm still on... I'm on socials at Foldable Human, though 980 00:52:57,520 --> 00:52:59,560 Speaker 3: I'm trying to wean myself off of them, because they're 981 00:52:59,560 --> 00:53:02,720 Speaker 3: broken, and being broken on purpose, and they're 982 00:53:02,560 --> 00:53:03,440 Speaker 2: bad for my soul. 983 00:53:04,000 --> 00:53:07,320 Speaker 3: So... I'm addicted, so I still keep coming back, 984 00:53:07,400 --> 00:53:09,799 Speaker 3: but I'm a lot less active than I used to be. 985 00:53:10,239 --> 00:53:12,680 Speaker 1: Oh, sorry, I didn't hear you. I was too busy 986 00:53:12,920 --> 00:53:19,640 Speaker 1: getting anxious because of a thing on Twitter. No. Yeah, Dan, 987 00:53:20,040 --> 00:53:21,839 Speaker 1: thank you so much for coming on today. I really 988 00:53:21,920 --> 00:53:24,840 Speaker 1: appreciate your thoughts on all of this. I'm looking forward 989 00:53:24,880 --> 00:53:29,480 Speaker 1: to your next video, your next investigation, whatever that happens 990 00:53:29,520 --> 00:53:33,040 Speaker 1: to be. Folks should check out, if you haven't, Line 991 00:53:33,080 --> 00:53:36,200 Speaker 1: Goes Up, your documentary on NFTs. You should check out 992 00:53:36,840 --> 00:53:42,000 Speaker 1: Contrepreneurs, is, I think, what you called your Mikkelsen Twins documentary. Yeah, 993 00:53:42,200 --> 00:53:45,200 Speaker 1: check out everything Dan has done. Thank you, Dan. And 994 00:53:45,360 --> 00:53:48,440 Speaker 1: that is the episode. You can all go home now 995 00:53:48,880 --> 00:53:52,200 Speaker 1: and deal with the fact that your bank information just 996 00:53:52,239 --> 00:53:58,360 Speaker 1: got stolen by somebody in Macedonia. 997 00:53:59,480 --> 00:54:02,000 Speaker 2: It Could Happen Here is a production of Cool Zone Media.
998 00:54:02,080 --> 00:54:04,800 Speaker 2: For more podcasts from Cool Zone Media, visit our website 999 00:54:04,800 --> 00:54:07,919 Speaker 2: coolzonemedia dot com, or check us out on the iHeartRadio app, 1000 00:54:07,960 --> 00:54:10,520 Speaker 2: Apple Podcasts, or wherever you listen to podcasts. 1001 00:54:11,040 --> 00:54:13,200 Speaker 1: You can find sources for It Could Happen Here, updated 1002 00:54:13,239 --> 00:54:16,360 Speaker 1: monthly at coolzonemedia dot com slash sources. 1003 00:54:16,480 --> 00:54:17,320 Speaker 3: Thanks for listening.