Speaker 1: So one of the first videos I saw of yours was this dolphin video.

Speaker 2: Yeah, I made a dolphin language.

Speaker 3: It's called... and it only has vowels that are made in the front of your mouth, to best imitate dolphin speech.

Speaker 2: Here are some example conjugations.

Speaker 3: All the sentences are six vowels long, but the meaning changes based on vowel placement and length. So the sentence "I eat" would be pronounced... but if you want to conjugate that to "I will eat," since it's an east-n verb, you have to lengthen the penultimate vowel, which would...

Speaker 2: Change it to "e."

Speaker 1: Adam Aleksic is a linguist, and he's known online as the Etymology Nerd.

Speaker 2: I was, like, combining something goofy with something academic, because I have both of those impulses constantly working in me. So the academic impulse to me is, like, you know, what if we try to make a minimalist language, see what we can do with the boundaries of language. Here, the goofy impulse was, let's just sound like a dolphin. Kind of the same thing happened with my book here. Like, what if we actually seriously study the origin of slang words right now, where they're coming from? But also, what if I just get to write a book about skibidi toilet and get paid for it?

Speaker 1: Like, if hearing the phrase "skibidi toilet" creates a fight-or-flight response, or even if you don't know what that is yet, it's fine, I'll explain later, and I promise you, you are absolutely the target audience for this episode. So Adam just released a book called Algospeak: How Social Media Is Transforming the Future of Language. Overall in the book, he shows that the short-form video content that we see on Instagram Reels or on TikTok, and the algorithms that they run on, are changing how we talk in real life. So I wanted to talk to him about how we got here and where we're going.
Speaker 2: We somehow think, oh, the history of right now is not important, but in fact it's extremely important, because it's an indicator of what it means to live and exist right now, in the present. And the fact that we are existing in the skibidi toilet era means that we should maybe look into that.

Speaker 1: I'm sorry. We're existing... we're existing in the skibidi toilet era.

Speaker 2: You're not wrong. I'm saying goofy things, but I'm serious about it.

Speaker 1: No, no. Like, everything you're saying isn't incorrect. Everything you've said is right. I just... when you say it, it hits a little bit.

Speaker 2: It's a lot of fun. I'm a little self-aware of my job, that it's a little goofy, I'm afraid.

Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm Dexter Thomas.

Speaker 4: I'm sorry, I'm sorry.

Speaker 1: Good God. Let's even back up a little bit, but feel free to talk to me like I'm a five-year-old. Yeah, how do you define algospeak?

Speaker 2: Traditionally, algospeak refers to speech made to circumvent algorithmic censorship. The quintessential example there was "unalive," like, instead of "kill," because you can't say "kill" on TikTok, you say "unalive." "Seggs" instead of "sex," because you can't say that on TikTok. And so you can literally see language rerouting around the algorithm with those examples. But it's also like a game of whack-a-mole, right? The algorithm censors certain words, and then a new word will spring up, because humans are always very good at finding ways to express themselves. And that, historically, has been referred to as algospeak. How I define algospeak, I think, is a little bit different, because that's just the easiest-to-point-to example of algorithms shaping our speech. We also have where words are coming from, how words get popularized, how quickly those words spread, how creators are interacting with algorithms, how users are interacting with algorithms.
Speaker 2: All of those things are also going to affect the way we speak and relate to each other.

Speaker 1: How do you explain to somebody that you're taking skibidi, you're taking rizzler, you're taking gyat, you're taking all this stuff seriously? Yo, I'm gonna write a book about it, and this is going to be something that's academically sound.

Speaker 2: So I'm going to acknowledge that we are talking about silly things, but even the silly things are points of human connection. They're important trends. The fact that "rizz" was the twenty twenty-three Oxford English Dictionary Word of the Year, the fact that Skibidi Toilet has been viewed more times than the moon landing, the fact that middle schoolers are out here... that's how they connect with each other.

Speaker 1: Well, pause. Sorry, no, no, you can't steamroll through that. Sorry, skip that, say that one more time. Than the moon landing? Are you serious?

Speaker 2: I think, yeah. The Apollo moon landing was viewed like one hundred and ten million times on TV, and then each Skibidi Toilet video by itself has over like one hundred fifty million views on YouTube Shorts, or something like that.

Speaker 1: That's just one video. All right, so this is where I do the podcaster thing and bust out the explanatory comma, but like the extendo-clip version. So let's go down the list. Rizz: you could think of it like short for charisma. So if you have rizz, that means you've got style, charm, attractiveness, people like you, that kind of thing. Unalive is what it sounds like. It's a euphemism for death, so killing someone would be unaliving someone. Suicide would be unaliving yourself. Gyat is a little complicated, but short version: it's a big butt. Big in a good way. Sigma is also good. Usually it's like the apex version of an alpha male. Okay, and you know what, now that I've done all that, I might as well explain skibidi toilet. So, okay, paraphrasing from the Wikipedia, because yes, there is a Wikipedia entry for Skibidi Toilet.
Speaker 1: Skibidi Toilet refers to an armed conflict, sprawling across dozens of episodes, between humanoids and singing human-headed toilets called skibidi toilets. Skibidi toilets are called this probably because in the first video, a human head pops out of a toilet and starts singing like this.

[clip from Skibidi Toilet plays]

Speaker 1: And then the head flies out of the toilet and into the camera. And now you know why kids are saying skibidi. And if your head hurts after hearing all that, you now understand the last phrase: brain rot, which is basically a catch-all phrase for everything I just talked about. All right, back to the interview.

Speaker 2: The word skibidi actually is functionally no different than the word scooby-doo. That's how I like to explain it. They both come from non-lexical vocables, that's scat singing. So, just a random, like, phrase that rolls off the tongue. We can't ignore cultural phenomena. Language is always a proxy for what's going on in culture. It's the little things we can see that point us to the things that we can't see. And by looking at these examples, I hope to broadly demonstrate that algorithms are having insane impacts on our society as a whole, but specifically through the lens of communication as well.

Speaker 1: You brought up the censorship thing. How have you seen algospeak develop in response to censorship?

Speaker 2: Censorship is a productive force, which is what linguists call something that generates more language. One example I really like is in Chinese, the word for censorship is censored, so users turned to the word for harmony, in allusion to the Chinese government's goal of building a harmonious society. And that's like "héxié" or something. And then that started being censored as well. The mole goes down again, a new mole pops up. Now people start saying "river crab," because river crab sounds like the word for harmony, and so that's "héxiè," with a falling tone.
Speaker 2: And then that word starts being censored on some platforms as well, and then users start saying "aquatic products," simply because it's similar to river crab. So you're not going to stop us from talking about this stuff. And then sometimes it bleeds through to the offline. My book opens with examples of kids writing essays about Hamlet contemplating unaliving himself, or the Seattle Museum of Pop Culture releasing an exhibit commemorating the thirtieth anniversary of Kurt Cobain unaliving himself.

Speaker 1: And hold on, just to clarify here: Adam is talking about an actual thing. Last year, a placard on an exhibit about Kurt Cobain said that he, quote, "unalived himself" at twenty-seven. Some people did not like this, but it's real.

Speaker 2: It's bleeding through. We like to pretend that the online is a separate world, but in fact it does affect our reality. It does affect the way we relate to each other.

Speaker 1: Censorship, often, when we think about it, it's done by a government, it's done by the church, something like that. There seems to be something a little bit different about this, and maybe it's just the rate at which it's happening. And I think that the Chinese model is a really good example, because in a lot of ways, I think what's happening in English is something that has been happening in China since, you know, the two thousands.

Speaker 2: Our algorithmic models are all based off of ByteDance's infrastructure that they built for Douyin, because China is censoring certain language and imposed regulations on internet companies there. So the reason English-language algorithms are so good at censorship and detection and that kind of stuff is based on the Chinese model. They're ahead of us.

Speaker 1: It isn't just the Chinese-based apps like TikTok. YouTube, Instagram, Facebook, all these apps have algorithm-based censorship, and in response, people are making up words to get around those censors. It's hard to keep up with.
Speaker 2: I think it's happening faster and it's more compounded. It's hard to prove that quantitatively, because it's hard to even, like, identify what's a slang word or something. How do we know what's happening faster?

Speaker 4: Right.

Speaker 2: But it seems pretty logical, right? When we talk about productive forces in linguistics, the more you censor a word, the more people try to come up with new words. That just logically follows. It makes sense that algorithms are bringing us more language than ever before. The underlying mechanisms are not new. Yeah, underlying to language is this feeling that humans just want to express themselves and communicate with one another, and we are very good at that. We always find ways around that. When people... well, I can't even say it here, probably, but there's, like, a trend recently where people are talking about somebody "doing the thing," and everybody seems to know what the thing is, because you can't talk about it. And they're asking, you know, what are we going to do when the thing happens? I can't wait for the thing to happen. But this is not something you can actually talk about, but you're still expressing yourself. This is, like, the most taboo of concepts in American society, and yet we're still finding ways to talk about it that we can't actually put into words.

Speaker 1: Yeah, and I know exactly what you're talking about. How does that work? How does it work that, when somebody says "the thing" or "doing the thing," how are people able to pick up on that? Because linguistically, I have no explanation for that.

Speaker 2: No... well, linguistically, language is cooperation. And when I'm talking about, oh, I think somebody should "do the thing," that's a quote. I am signaling to you... because I'm not saying this, the signal I'm sending out is: this is something I can't talk about. And you pick up on that signal. Now you're thinking, like, what are all the things I can't talk about right now?
Speaker 2: And then you start to think, well, maybe it's something really illegal. Maybe it's something that, like, you could go to jail for just saying. So you think of those things, and you think, oh, there's one obvious conclusion. Also, confusion generates engagement. Engagement pushes it further in the algorithm. There probably are some people confused about what the thing is, and then they might comment and ask questions, and that'll push it further.

Speaker 1: All right. Social media is changing our language, whether we like it or not. But is it actually leading to, quote unquote, brain rot? That's after the break.

Speaker 1: Let's talk about brain rot for a second. This was a while ago. I remember this actually really clearly. I was at the graduation ceremony for my niece. She was graduating from middle school.

Speaker 2: That's the ripe age for brain rot.

Speaker 1: Perfect, right? And what I started doing was, you know, there's a bunch of downtime, right? And I started texting her, because she's sitting up in the front row, she's waiting for her name to get called. I was like, yo, this ceremony is not bussin at all, no cap, on God, for real for real. And she gets mad at me and texts back, like, all these angry emoji, and: why are you sending me brain rot? And I thought it was interesting, because the kids who we associate with using brain rot, kind of as a... they also call it brain rot.

Speaker 2: I think it's important to unpack that. Obviously, words have multiple meanings. Yeah, brain rot can be used in the sense of "this is bad for your brain." Oh, that's brain rot. YouTube slop, AI slop is brain rot. Like, that's one way the word can be used. Another way the word is used, and I would argue the more common way it's used, that people ignore, is that brain rot refers to a meme aesthetic of nonsensical repetition. So things are trending online; rizz, skibidi, gyat, Ohio is trending online.
Speaker 2: Yeah. And it's funny to say that as a sentence, because you're calling attention to the algorithmic oversaturation of these words. I think in the sense that she used it, that's brain rot. It's the meme aesthetic there.

Speaker 1: It's funny, because I can't think of another time when anybody who has used the slang that they use has labeled it with something that, even on the surface, is pejorative.

Speaker 2: What's the word "slang"? "Slang," on the surface level, is pejorative. The word slang was coined in the seventeen hundreds as a way to differentiate upper-class language from lower-class language. That's all it was.

Speaker 1: The algorithm is changing how we talk, and it's not just the words. There are entirely new accents that are developing in response to the algorithm. You'd usually think of an accent as something you pick up from living around people who talk a certain way, but the effects here aren't happening because of other people. Well, not directly. The influencer accent. Can you explain that to me?

Speaker 2: Well, there's a stereotypical influencer accent, and then there's the more nuanced explanation. But I'll start with the stereotype: the "Hey guys, welcome to my podcast." I'm using rising tones as a way of retention. That kind of uptalk at the end of each sentence draws you back in, it reels you back in. There's stress on more words, and that keeps you watching videos. So these are algorithmic retention tactics that keep you watching the video, that survive as viral accents, and then they're replicated by people, consciously or unconsciously. I use a different kind of influencer accent. I use what I call the educational influencer accent. I'll talk quickly, I'll stress certain words to keep you watching my video. Then there's the MrBeast-style accent. He also very meticulously knows what he's saying, intentionally, to go viral. "I just bought a private island, and today I'm giving away a million dollars." Every word is like shouting at you. Every word is sensationalized.
Speaker 2: You look at an interview of MrBeast talking in real life, he doesn't talk like that. It's a show, it's a presentation. He intentionally is very good at manipulating the algorithm. There's people at this point who just assume that's the accepted way to speak online, and replicate it. So you have people with no followers, like one hundred followers, and the first time they decide to post a video on TikTok, they'll talk in that influencer accent simply because that's what they think is the norm. Part of the argument in this book is that algorithms compound and amplify natural human tendencies. So there's a more exaggerated, or what I call... there's the word "flanderization," which means, like, pulling out a personality trait and exaggerating it.

Speaker 1: Quick aside here, because this is an interesting phrase. So flanderization refers to the actual character Ned Flanders on The Simpsons. He started out as this well-meaning, dorky neighbor who also happened to be religious, but the fans thought that religious part was really funny, and the writers ended up dropping the complex parts of Ned Flanders and making that his whole personality. So Ned Flanders is now hyper-religious. He's become a simplified caricature of himself.

Speaker 2: Okily dokily. I think I constantly see that happening with influencers online. We have to play personas. I play a persona of myself. I do care about etymology. I am very excited about this stuff, but I will exaggerate my voice. I will speak in a hyperbolized manner, because I know that I'll get views, which will help me earn a living, right?

Speaker 1: If a lot of people are doing this, because they think that in order to post anything online, what they should be doing is basically becoming a very small niche of themselves... and then also it affects our language, which, we're not going to go full Chomsky here, but, you know, also affects your thinking in some way. Then I think that starts to get a little weird there.
Speaker 2: I think language definitely influences thinking. We categorize the world a certain way now, and these categories determine how we act, because we're conforming to these categories. And algorithms, although they purport to push you into your niche or create more specific subcultures, actually have these broad, flattening trends that push less nuanced versions of reality.

Speaker 1: What Adam's saying here is that these algorithms are flattening our culture even with the expansion of language, which in some ways means our thinking might be getting flattened also. But this isn't the first time that human language has evolved in response to technology.

Speaker 2: We have changes in mediums that affect the way we communicate. The change from oral tradition to written chapters meant that we could, like, segment our words differently. During oral traditions, we had to have rhyming, meter, that kind of stuff, which helps us remember our stories better through songs.

Speaker 1: I actually never thought about that, the function of rhyme helping you remember stuff. Yes, I'd never thought of it like that.

Speaker 2: The way we're telling our stories always reroutes around the medium. Once we have television, things could be serialized. Once we have the Internet, we have the written replication of informal speech. I think we're at another inflection point, one where the medium is now different. It's a new medium, and each new medium is going to affect how we communicate uniquely. So the fact that we have algorithms now means that our language is rerouting around algorithms, to the same degree maybe as the shift from oral tradition to writing things down. Language itself is just humans doing what we've always been doing, which is using tools to express ourselves. And when we say things like "unalive," we're acknowledging our presence in this communicative medium and the social context in which we're relating to each other.

Speaker 1: Which, changing in response to new technology, is not a new thing.
Speaker 1: But there does seem to be something unique about this latest trend of algospeak. There's a new force in play: confusion.

Speaker 2: I talk about this boundary of confusion being a productive force in language changing, where slightly confusing turns of phrase are good for going viral.

Speaker 1: Really? Like what?

Speaker 2: I talk a lot about the boundary between irony and authenticity. So I have a chapter on incels, for example. Incels being "involuntary celibates," this far-right misogynistic group. There are phrenological filters like canthal tilts and hunter eyes and interocular distance. These are all incel concepts, and they popularized these categorizations of, like, people's faces. But they were kind of funny as a joke. People came up with, like, mewing, which is a jaw-strengthening technique. It's funny because it's a joke, but there are some people who believe it's real. It spread as a joke, but then some people reinterpret it as real, which spreads it further. Same with, oh, hunter eyes, canthal tilts. It's funny, haha, that incels think these things are important. But now we have beauty influencers on TikTok showing you how to put on eyeliner using your canthal tilt. They wouldn't have been doing that five years ago, but incel concepts somehow wormed their way into the mainstream through this hopscotching between what's real and what's fake. Yeah, that confusion also generates comments from people saying, is this actually real? And then... the comment is engagement. It tells the algorithm: let's push this further. And now the word sigma is a very popular kind of middle school slang word. Right now, sigma just means, like, somebody who's a cool, dominant male. Everything's about classification, about what's your attractiveness level, and where does this put you in what they call the socio-sexual hierarchy.

Speaker 1: And I know some of y'all might be saying here: wait, hold up.
Speaker 1: If we're going to talk about slang here, especially quote unquote Gen Z slang, there's something else we got to talk about. Not too long ago, somebody told me about this new Gen Z slang they'd heard, which was "finna." And I had to tell them that my grandmother said finna. Like, "I'm finna whoop your behind if you don't do your homework." That word, and a lot of this quote unquote new slang, is just pulled from what a lot of linguists now call AAE: African American English. You might have heard it called AAVE, or Ebonics, or just Black English. And the flow of AAE being adopted into mainstream slang has been around forever: hip, cool, woke... I could keep going. And AAE does have a heavy influence on Internet slang to this day. But the Internet now also has another source of slang and ideas: incels.

Speaker 2: There's a rule of thumb for slang on the Internet: it's either from 4chan or it's from AAE. Right? Those are the two big sources. There's occasionally gonna be stuff from other things. I talk about other echo chambers that make language, like Swifties or K-pop stans, but really, it's either Black people or it's far-right misogynistic trolls on 4chan. How did that happen? Right? There are increasingly porous edges to echo chambers that allow ideas to travel through if they are compelling, and they're compelling if they're funny, or if they seem cool. Incels are very good at weaponizing their memes to be funny. Sometimes people use those memes to make fun of incels as well, so it wasn't always, like, the incels themselves pushing this ideology. But incel language spread because it was funny, right? And then people try to talk more like that, and then in doing so they perpetuate it to increasingly peripheral groups, and words transcend cultural boundaries. And it also happens faster online, where there's context collapse.
Speaker 2: Where you see this video, you're like, oh, this person's talking to me, it must be okay for me to also use this word. So words more quickly transcend filter bubbles like that, and all of our slang is either African American English or it's incel rhetoric.

Speaker 1: Unfortunately, this context collapse that happens online means that words lose their origins, and it gets easier for them to travel and be used by communities that are unfamiliar with the original context. It's not necessarily a bad thing, it's how language works. But it also means people get more comfortable with using or alluding to certain words even when they know the context. I see a lot of people throwing around the N-word more than they used to, just kind of in a new way.

Speaker 2: The ninja emoji, that's the common, like, algospeak euphemism for it, right? Instagram Reels just straight up allows it now, because they loosened their content guidelines after Trump got reelected. I think Instagram Reels is the worst social media ecosystem to be on right now, because there's some really, really racist stuff on there. Really. There's one video with thirty million views of a swarm of shirtless Black men running towards a KFC while the underlying audio says the N-word repeatedly. It's AI-generated, and that's just, like, one example. There's a bunch of videos like that on Instagram Reels right now. Because, one, Meta is allowing AI-generated content, they are actively incentivizing it. Two, they're allowing for more extreme content, because they know that gets attention better, and at the end of the day, they want your attention so they can then commodify it.

Speaker 1: Instinctively, in some way, I've had an idea of what it is that attracts, say, a white person to really want to be able to say the N-word. It seems like there's different incentives now, given algorithms.
Speaker 2: When I interviewed the influencers behind these AI slop accounts that are making really racist videos, I DMed dozens of them, and five or six of them got back to me. I asked them, why are you making these videos? What's the underlying motivation? And what I kept getting isn't, oh, I'm a racist and I want to spread racism. It's, I want views, I want followers, I want likes. The institutional structures are there for stupidity and racism and these terrible things to occur. Once Instagram starts incentivizing that, people will make content, because that's how they're going to get views and likes and followers, and thus money. And it's purely because Instagram's incentivizing it.

Speaker 1: Okay. So now I have to ask the question that probably a lot of people are wondering right now. Algospeak is the way that we communicate online, and thus in real life. Is it making us stupider?

Speaker 2: No. I can confidently say language is not making us stupider. It doesn't matter whether you're saying skibidi or scooby-doo or whatever. Language is a mechanism for humans to relate to other humans. It's a way for us to capture our worldview and then transmit that to other people. We are doing it perfectly, even if we're talking about Skibidi Toilet, even if that's quote unquote brain rot. Culture, on the other hand, is a subjective thing that is constructed internally, and there are definitely concerning cultural trends: declines in literacy rates, shortened attention spans. We know these things are happening.

Speaker 1: Throughout this whole conversation, I had this feeling that there's not quite a word yet for something I've been seeing that's happening because of algospeak. So Adam and I made one up. That's after the break.

Speaker 1: Protests are still happening in Los Angeles, and as I've been out documenting it, I've noticed something different. One example that I've seen that really kind of hit me is "IOF."
Speaker 2: That stands for Israeli Offensive Forces, which is a pejorative algospeak euphemism for the IDF, the Israel Defense Forces.

Speaker 1: Right, exactly. If you say IDF on TikTok or Instagram or whatever, you will potentially be shadow-banned, or your posts will not get shown to as many people. What I started noticing, though, is, I'm in Los Angeles and there's a bunch of protests here, the anti-ICE raids, and I started hearing people chanting about IOF, and it took me a second to realize what was going on. Yeah, so people are chanting it, but also, dig this: there's graffiti, and people had tagged IOF and crossed that out.

Speaker 2: And that just comes from algospeak. That's a fascinating example.

Speaker 1: It's nuts, because, I mean, if you think about it, graffiti by nature, you can write anything, you can write whatever you want on a wall. And so why would you censor yourself in graffiti?

Speaker 2: Well, it's not even censorship anymore. The word's taken on a new life. I mean, online, IOF is a metalinguistic wink. It's saying, yes, I'm submitting to the algorithm while I'm saying this thing that I want to say, but also, I'm reclaiming it. There's an act of reclamation in turning the D into an O, because you're signaling something else about the IDF that you want to communicate here. When we say something like "seggs" or "S.A." or "unalive" or all these things, we're always doing so with an implicit acknowledgment that there is an algorithm governing our speech, that it's always present when we say these words, and there's always some level of acknowledgment of that.

Speaker 1: Watching people say something, for example, "watermelon" at a protest. You know, oh, I want to support watermelon. I've heard people say that. You don't have to say that. You're at a protest. Everybody here is on the same side as you are. But you're saying, oh, you know, well, the watermelon cause, and things like that. It's implying somebody could be watching us. Nobody is.
Speaker 2: Nobody's listening, but the signal that they could be is also impactful in its own way.

Speaker 1: It's signaling to an in-group. But the extra thing here that I'm seeing is something that... I can't think of a time, or a part of English, that has done this in the past. Which is to say, for example, you know, in Chinese you have the completion marker, you know, the "le." In Japanese, you have certain linguistic features that don't exist in English. In Black English, you have certain versions that don't exist in, you know, standard English, like, you know, "I've been gone to the store already," or things like that, "I been hungry," things like that. Yeah, I can't think of anything in any language that I know where you use a word that is also implying: I shouldn't be saying this, and somebody is watching me say this, and you understand what I'm saying. And there's a sense of forbiddenness built into the word.

Speaker 2: It's called avoidance speech. There are languages in Polynesia, as a common example, that have taboos on certain words, and you can't talk about things like menstruation sometimes, okay, or you can't mention certain relatives that have passed away, and there are ways of circumventing that with euphemisms. Avoidance speech, that's not super new. I will say, what you said about this awareness of surveillance is really interesting. I don't think we've ever had as much of a perception of being watched and surveilled, and the fact that we're in this digital surveillance panopticon.

Speaker 1: I feel like we almost need a grammatical term for this, because it seems like this is something that we're going to see more of, and not less of.

Speaker 2: Well, we should coin one. What about "algorithmic performativity"? There we go, let's do that one. Algorithmic performativity here is speech with the knowledge that you're being watched by an algorithm.
Speaker 2: And I also want to be careful when I say "watched" here, because no one's actually sitting in a room looking at you. Like, honestly, it's eerier than that, that there is no person behind the control room. It's all automated. Our lives are being controlled by something where not even the engineers know what's happening. So I find that more terrifying than if I actually had an FBI agent behind his computer looking at everything I said.

Speaker 1: Given how language clearly is being changed by the algorithm, by which we mean, you know, the three companies that control the algorithms that we use to distribute stuff: are we cooked?

Speaker 2: I like to think optimistically. I think, tentatively, no. Especially in regards to language. I hope I've drilled home the point that with language, we're fine. With other stuff, culturally, are we cooked? I don't know. We're still humans doing human things, right? We're humans using tools to communicate with each other. That feels fine. At the end of the day, we should just do what makes us feel good. Life's too short to not just try to, like, vibe when we can, and if algorithms help us vibe, then great. But, you know, we should be aware of how they affect our lives and piece together what makes you feel good.

Speaker 1: And that is it for this one. Thank you so much to Adam Aleksic for talking with me. Adam's new book, Algospeak, is out now, and if you look in the show notes there's a link to that, and of course everywhere else you can find Adam online. And thank you to you for listening to Kill Switch, and let us know what you think. I know we've been kind of all over the place with different topics, but the world of technology obviously is pretty wide. So if there's something you want us to cover or something you're curious about, hit us up on email at kill switch at kaleidoscope dot NYC.
Speaker 1: Or you can find us on Instagram at kill switch pod, or I'm dexdigi, that's d-e-x-d-i-g-i, on Bluesky or Instagram. And wherever you're listening to this right now, whatever podcast service you use, leave us a review. It helps other people find the show, which in turn helps us keep doing our thing. Kill Switch is hosted by me, Dexter Thomas. It's produced by Shena Ozaki, Darla Potts, and Kate Osborne. Our theme song is by me and Kyle Murdoch, and Kyle also mixes the show. From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, our executive producers are Katrina Norvell and Nikki Ettore. Catch you on the next one.