Samantha: Hey, this is Annie and Samantha. Welcome to Stuff Mom Never Told You, a production of iHeartRadio. And welcome to, you know what, I'm gonna call this a new-ish kind of segment for the Monday Mini, because we've done a few things throughout, and typically these are our shorter episodes where we talk about current events, current issues, and all of that. But you also know I am addicted to TikTok, and I am on a lot of the controversial sides of TikTok, so, like, you have specific algorithms that stick to you. It's quite funny, because people make jokes, like, apparently TikTok thinks I'm a lesbian introvert with ADHD. That's fine, that's fine, that's fine. But you know, I find a lot of things fascinating that happen on TikTok and social media, which I send to you often. And I know we're gonna have a fun episode coming up, correct? I think we've already recorded with Bridget, talking about social media and some of the new things that are happening that kind of seem to be going in reverse of where we, we being the millennials, or for me, elder millennials (I don't want to hear that again), began. So Facebook was, what, our first social media presence, and it is quickly becoming kind of a boomer thing somehow, and the rest of us have slowly, quietly quit Facebook. I think we still have accounts, but we rarely post. But then we moved on to Twitter, and that was a big thing, trying to keep up with that. Oh no, MySpace, I forgot. I keep forgetting about MySpace. Did you have MySpace?

Annie: No, but I like hearing about MySpace. I wasn't too young; I was just a late adopter to everything. I think when I was in high school everyone got it, but I never got it.

Samantha: Okay. Yeah, so there are so many things, and because of that, I feel like I want to have a Samantha's Corner, Samantha's rant, that's not specifically like the happy hours where we talk about emotions.
We're gonna talk about things that pop up where I'm like, what is this, and let's talk about it. And I know there are articles out there about this, and this one has been on our plate for a while. I was like, I need to do this. And it's call-out culture on social media. I'm going to kind of go in between call-out culture, cancel culture, and accountability, because these three things kind of start blurring lines, but for now we'll just call it call-out culture. And Annie, have you seen any of it? I know people send you TikToks. Have you seen any that would be call-out videos, you think?

Annie: No. Uh, no. I know people, and again, we've talked about cancel culture before. There is a line, and it gets messy. I've had friends who experienced it where the story wasn't complete, and that can be scary. But I don't think anybody's ever sent me a TikTok call-out video.

Samantha: I will. That's an assignment I'm going to take on. And this definition comes out of the dictionary from Cambridge.org, and it says call-out culture is "a way of behaving in a society or group in which people are often criticized in public, for example, on social media, for their words or actions, or asked to explain them." And one of the things that has happened since the pandemic, since the protests for the Black Lives Matter movement, is a lot of people are coming on here with heavy opinions, saying the wrong things, doing the wrong things. Anti-vaxxers are coming out with misinformation, racist tirades, you know, all these things that have happened. And with the age of social media, cameras on phones, they can get caught. And then there are all the sleuths that have come through, all the TikTok detectives or the Instagram detectives, that go through to find and get to these people.
Of course, again, this kind of veers into: is that cancel culture? It's kind of in between. And just as a reminder, cancel culture is defined by Merriam-Webster as "the practice or tendency of engaging in mass canceling as a way of expressing disapproval and exerting social pressure." So there have been tons. I can name four specific people whose entire content is literally going through and finding who they think is problematic, calling them out, and then giving out public information. Now, have we talked about doxxing before, Annie?

Annie: Yes. I don't think we've ever done a whole episode on it, but it comes up in, literally, I think, every technology episode we do.

Samantha: So this is where we're gonna also talk about doxxing, which is defined, also by Merriam-Webster, as "to publicly identify or publish private information about someone, especially as a form of punishment or revenge." Now, I'm not saying that they are doxxing. Most of these sites will get public records and then post them up. They will call out their jobs, they will call out their family, and they will call out their other social media accounts, with their full name. There's one woman whose whole TikTok is just having people say "can you find me?" to see if she can find them, and she can; with the minimal little amount of information they put out there, she can figure out who they are, and she'll do it. It's almost like a party trick. It's kind of interesting and, I'm like, also very terrifying. Of course, we're pretty public, so it's not something too new to us, I guess. But it's very interesting, because I was the first one to be like, yes, yes, go get them. These racist jerks need to be called out. Yes, if they are serving children, like teachers, or nurses, and they're spreading misinformation or talking about how they hate a group of people, that's really problematic, and we need to have this conversation, and they need consequences. But it's become such a point of entertainment.
I'm starting to wonder: is this a good thing? And a lot of these people have controversial backgrounds themselves, or have made statements themselves that put them in a bad light, and so you kind of start wondering, is this a way of getting justice, I guess? So, Annie, can you think of a time where you've seen a person get justice? I don't know how else to say this.

Annie: I think we've talked about this in our cancel culture episode. I would have said maybe Kevin Spacey, and now he's, like, speaking in public again. And, right, Harvey Weinstein, I guess, went to jail, but it took so much to do it, and I'm not sure; the social media part was a big part of it, but it wasn't the only part of it. I also want, I guess, to caveat: I think my answer is "not really," but I do want to caveat that I'm not on social media that much. Because it's one of those really fine lines. Like we said, it can be really great at calling attention to people in power, like, hey, we can't forget what has happened here. But if nothing happens outside of that, I don't know. It's complicated. But I would say my answer, with all those caveats, is not really.

Samantha: Right. So, I know. And we're going to talk a bit about businesses too, because they are not immune to this. But a Guardian article that talked about call-out culture said some people feel that callouts are an excuse for petty drama, which is what I'm saying, and a way to stir up gossip more than to promote social justice. And that is exactly one of the things I feel: yes, there's vindication, but at the same time, it feels gross that we are all celebrating it. We shouldn't be sitting here saying, yes, we caught them, we got them fired. In that story, we know the whole event that happened in New York with the bird watcher and the woman with the dog.
She sued her company, and I think in part because she felt like that was, you know, done incorrectly. What she did was racist and wrong, and I won't doubt that this could have completely harmed that individual. But also, the victim said he did not want that for her. He did not want all this conversation. What he wanted was just to be able to peacefully have an interaction and move on. But this was a problem to him. So I feel like we also need to listen to the victims. Right? He's the one saying: I didn't want all of that for her. Yes, I wanted justice, but not all of this. This is unnecessary. And in that same article, they said a frequently cited problem with callouts is that it's all too easy to get carried away and overpunish people, turning alleged perpetrators of upsetting acts into victims themselves. And I thought that was interesting, because again, people are jerks, and we know things happen. But we've also seen things like people's writing from when they were seventeen, fifteen, because the newer generations have grown up with social media, which we didn't, which I didn't, and they are being punished now for it. I will say, don't be ignorant and think that people cannot find your past. That's always something. But also, I will say people change. I changed. I changed from thinking that affirmative action was racist tokenism, and I was gonna help hold up the white supremacy, of course I didn't know that then, by saying "I can earn my own," which is not the truth, this is not the truth, and not what it actually did. But because I was so young and really under the influence of my family, who believed that way, I thought that too. Would I have said something if I had Twitter at that point in time? I don't know.
Because making controversial statements can also be a thing that makes you rise up, right?

Annie: Yeah, and we've discussed that a lot of times too, with Bridget, or even in our recent YouTube episode about the way the algorithm works, because you can get more traction by making these kinds of really incendiary claims, and it just catches on. And I think that's one of my biggest anxieties around all of this. Because, as we said in that cancel culture episode, when it's used correctly, all for it, all great. And it's one of those things that people, especially conservatives, like to call a witch hunt or whatever, and that's not what's happening. But when it becomes, like, I know there have been a couple of instances where somebody has a similar username to someone else, and then you, as the person who just gloms on, are like, oh yeah, we're mad at this person because they did a terrible thing, and you don't do the research, you don't take that step, you just do it. And then it turns out that wasn't who you were mad at; it was a different person. Now this person is dealing with all of these things that maybe they had nothing to do with, this whole conversation. So that's what kind of makes me nervous, because I think a lot of social media is set up to reward these kinds of, yeah, controversial, incendiary things.

Samantha: Right, and it brings out a mob mentality, for sure. And there have been several instances when they go after private people, tag the wrong person, and that person gets flooded with harassment repeatedly. And it is such an unfortunate event, and we've seen it constantly, because people don't do background checks, and they try to be detectives by saying this leads here, to here, to here, and this is this person, obviously. And it kind of becomes, again, a show for them. Trying to undo that, it's hard to undo that.
Now, people do move on to the next person, but it's hard to undo a lot of those things. One of the articles that I read was an interview with scholar and feminist Loretta Ross, Professor Ross, and she says it's the tendency, which is unfortunate, for people to want to publicly shame and humiliate people, and it's based on what they say, or what they look like, or what they wear, or who they're hanging out with, or who they agree or disagree with. It's attaching labels to people without really doing any kind of nuance, without understanding that even if you disagree with someone, you shouldn't want to attack their humanity, call them a toxic person, or things like that. It's usually done most damagingly over the internet, because social media amplifies and makes all callouts basically go viral immediately. And again, this is kind of that thing. I can think of five people who were all fighting with each other, and they've all had to step away from their accounts because they got something so wrong, or they were so offensive in some things, that they had to stop and say, uh... but they were still too deep into their content to stop altogether. There are so many things. And then, I don't know if you remember this video, I think I've talked about it before, where a girl gets into a wreck with a very fancy vehicle, and all you see is the girl screaming at him, and it looks like she had hit him. But then she gets surveillance camera, CCTV, footage, which shows he actually cut her off and he hit her. But because he released that video, people started attacking her and immediately calling her job, because that was all public information, and she was like, what the hell? And then, because of that, it got really, really complicated in the court case for that wreck, or the insurance case, I guess.
But we see stuff like that, and half the time it's half videos, and it moves on. Now, let me put this caveat here. Videos like police brutality videos: for those, there is no "but what did they do?" We won't stand for that. No police, and no people who think they're an authority, have the right to beat, to abuse, or to kill anyone, point blank. That should not be a thing. I won't talk about that; we're not going to talk about the current issues that have happened. Again, even if they did something wrong, outside of, like, truly almost killing someone, like with a gun, with intent, a guilty person should not die either. A person who commits a theft should not have to die. A person who is asking questions and trying to figure out why they're being detained should not be hurt and abused. So that part, we're not talking about that issue. What we're talking about is the small accounts of: she wrote this, let's cancel this person; she said this, let's cancel this person. Racist tirades are not good; that needs to be called out. But again, we do need to talk about growth. You can't fix racists, but you can fix situations, and you can talk about why they're angry.

Annie: Mm hmm. I mean, I think that quote you just read, I think that's one of the keys: social media, because of the limitations on what you can communicate, can be very limited in terms of nuance, and I do think nuance is important. And you know, there are some things where, yeah, absolutely, well, I got the nuance and they're still canceled. But I feel like a lot of us don't even take that step; we just see, like, oh, everyone else is doing it, I'm gonna get on board, or else they're gonna think that I'm bad because I didn't say anything. And it just builds, and it snowballs out. And especially because we have so much coming out on AI.
We've talked about deepfakes before. It's going to get more and more complicated in the future for us to know what is reality, what is the truth of what we're seeing online. And that was something the older generation struggled with, because they didn't grow up with social media, they didn't grow up on the internet. Now, you and I, especially because this is kind of a burgeoning technology, we're going to have to be vigilant about that, because the technology is so good. And, I mean, there have been accounts of, you know, revenge porn targeted at women, especially, like, journalists or something like that. And, number one, a lot of that shouldn't even be a fireable offense, which is a whole other situation. But anyway, it's like, once people make these videos and they're sending them to, like, your family, it's so hard to be like, hey, no, I know it looks right, but... anyway.

Samantha: Which is a concern with the power of social media: there's the power to shut down businesses. You know, I kind of alluded to that. A Forbes article about callouts actually posted this: in a recent study conducted by WhoIsHostingThis.com, it was noted that it can take dozens of glowing reviews to reverse the damage of a single scathing complaint, especially if the complaint goes viral. More importantly, more people are now publicly calling out companies on social media. Now, I will say, I feel like oftentimes, with the big corporations, you can't get their attention until you do call them out. Airbnb is a prime example of that. Amazon is a prime example of that. All the airports, apparently, like, all of them; there's not a single one that's not been called out. I feel like they've all been called out, and that's seemingly the only way you can get to them, because you can't find any satisfaction through doing it individually, quietly.
And the same study found that fifty-one percent of the respondents they surveyed said they had called out a company, and it isn't just the young crowd being so vocal on social media. In fact, forty-four percent of baby boomers said they had made complaints about a company, and men were more likely to voice their concerns, which we should talk about, because y'all have this habit of calling out Karens, because we do see the videos, but apparently men make most of these complaints. Anyway. But yes, I've seen so many. And don't get me wrong, these businesses need to be called out. If they refuse to serve someone because of race, call them out; they don't deserve that business. If they kick you out because of one thing or another, all these things. But it was also used by anti-vaxxers against businesses that refused to give service to people who didn't wear masks inside during the pandemic. So this is where it also flips.

Annie: Mm hmm. Yeah, I mean, that's the thing. Because, and I don't think it's a surprise to anyone, I doubt it is if you're listening: we're pretty liberal. I feel like so many of the things we've talked about have the potential to do these good things, but also the potential to be weaponized in the other way. We've talked about the kind of co-opting of terms like "woke" and things like that by conservative people who are now using them, not in the way they were meant, taking those ideas and then changing them and turning themselves into the victim here. So, like, the whole idea about masks: you know, I'm the victim, I can't do what I want. Or you can think about things like businesses who refused to make cakes for gay couples or whatever, where it's like, they want to do whatever they want, unless it doesn't agree with my value set. And it is, it's murky.
I keep going back to the word nuance, because there is a lot of nuance in this conversation. But yeah, again, if you just don't do your research, if you're just like, oh, look at this one-star review, I will never go there, and then the one-star review is maybe, oh, they made me put on a mask, and maybe you agree with that, and you're like, actually, I really like that; I would like to go there, then.

Samantha: Right, and that's exactly right. There are so many things, and at the very beginning we talked about when someone comes on social media and says, this person did this to me, and then you have people all over the world, not country, all over the world, going to their side and just crashing their reviews and making stuff up. They made stuff up. People would be like, I saw a roach, this person tried to kick me out, this person tried to serve me a severed foot, stuff like that. That wasn't... okay, I made that one up. I'm sorry, but that will be my review for somebody. I'm just kidding. But they made things up, so you don't know exactly what's happening. Don't get me wrong, again, if they're racist, if they're not serving people. But there were a lot of times where the person who was trying to get something was the offender, and we did not know that until way later. There are so many things like that that we don't quite understand. Now, again, I don't want to go back to "this is this side of the story, that side of the story." There are definitely situations like that, but there are definitely situations where, no, that was injustice and that was unfair. And we believe anybody who's being sexually harassed, anybody; we know this. So take this with that in mind.
And by the way, again, the Forbes article says that the survey also found that fifty-two percent of those who said they call out companies did so to raise awareness, and said it was so that others could avoid similar problems. So the motives were high-minded, the survey noted, suggesting there's more than just a refund or a discount in mind. And again, I've definitely seen those people who were like, they've tried that route, they got no answer, so what they're gonna do is publicly call them out.

Annie: I think, again, this is one of those "I don't know how I feel" moments. You know, like those people who call out racist and sexist acts and people: I'm glad, I guess, but I really hate that we rely on these people, because when they do this, it's all gloating, and it all feels petty. And I'm like, you're literally getting views on the back of someone else's suffering. Is that okay? I mean, that's going back to what you said about the incident in New York. Who are we doing this for? Who are you calling out for? And I think it's really easy to tell yourself, well, I'm doing God's work or whatever. But yeah, I mean, we've really monetized, and not like we're making money, but the whole currency of internet performative action, getting those views and getting those followers, I think people can lose sight of who it should be for. If you're doing it for high-minded things, or something other than a refund and a discount, that's not necessarily wrong. But if it's just kind of a performative thing that you're doing, or you're not doing the research, then that's not the same.
Samantha: Yeah. Well, Annie, thank you for going down this rabbit hole with me, because I've got so many more questions that I'll be bringing up in my, now, Sam's Corner segment. Okay, but "Why, Internet?" But "My Internet"?

Annie: Well, we do have a lot of questions about that, and we will not run out of content.

Samantha: Awesome.

Annie: Well, thanks for bringing this topic. I look forward to the next one. And listeners, if you have any, like, "Why, Internet?" topics we should discuss, or any thoughts about this, you can email us at momstuff@iheartmedia.com. You can find us on Twitter @MomStuffPodcast, or on Instagram and TikTok @StuffMomNeverToldYou. Thanks as always to our super producer, Christina. Thank you, Christina. And thanks to you for listening. Stuff Mom Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.