1 00:00:09,840 --> 00:00:12,080 Speaker 1: A little while back, I called one of my friends 2 00:00:12,160 --> 00:00:14,920 Speaker 1: up just to say hey, and when they picked up, 3 00:00:15,400 --> 00:00:18,200 Speaker 1: I could tell they were freaking out, like their voice 4 00:00:18,320 --> 00:00:21,640 Speaker 1: was shaking. And I asked what was wrong, and they said, oh, 5 00:00:21,800 --> 00:00:24,599 Speaker 1: I thought you called me because you saw. And I 6 00:00:24,640 --> 00:00:27,120 Speaker 1: said, saw what? What are you talking about? And they said, well, 7 00:00:27,200 --> 00:00:29,360 Speaker 1: it's all over the internet, have you seen it? And 8 00:00:29,440 --> 00:00:31,880 Speaker 1: I still didn't get it. So it turns out this 9 00:00:31,920 --> 00:00:34,320 Speaker 1: friend of mine had posted a TikTok of themselves talking 10 00:00:34,360 --> 00:00:37,720 Speaker 1: about something honestly pretty innocuous, but some small group of 11 00:00:37,720 --> 00:00:40,320 Speaker 1: people decided that they didn't like what was being said, 12 00:00:40,680 --> 00:00:43,199 Speaker 1: and they were being harassed online. I mean, it was 13 00:00:43,440 --> 00:00:45,960 Speaker 1: really bad. The post was full of hate comments, there 14 00:00:45,960 --> 00:00:48,920 Speaker 1: were hate messages in the inbox, they'd gotten doxxed, and 15 00:00:49,080 --> 00:00:53,040 Speaker 1: someone had found their mother's email address and was sending 16 00:00:53,040 --> 00:00:55,840 Speaker 1: the family hate mail. I had no idea this was 17 00:00:55,880 --> 00:01:00,000 Speaker 1: going on, and it really bothered me, not just because 18 00:01:00,320 --> 00:01:03,520 Speaker 1: people were harassing my friend, but because this was the 19 00:01:03,560 --> 00:01:07,040 Speaker 1: third friend that this was happening to that month, that 20 00:01:07,120 --> 00:01:07,680 Speaker 1: I knew of. 
21 00:01:08,440 --> 00:01:12,480 Speaker 2: I would say most people do deal with online harassment 22 00:01:12,520 --> 00:01:16,240 Speaker 2: and don't necessarily talk about it or share or express it. 23 00:01:17,200 --> 00:01:20,080 Speaker 1: Romi Galley is a security professional, and he's been working 24 00:01:20,080 --> 00:01:23,520 Speaker 1: on keeping journalists around the globe safe for over a decade. 25 00:01:24,040 --> 00:01:28,560 Speaker 2: It's sort of like an invisible epidemic of sorts. Everyone is 26 00:01:28,600 --> 00:01:32,640 Speaker 2: so accessible today that sometimes things can just blow up 27 00:01:32,680 --> 00:01:38,080 Speaker 2: out of proportion, and because you're only experiencing it through 28 00:01:38,160 --> 00:01:41,520 Speaker 2: a screen, people aren't actually fully aware of what it 29 00:01:41,600 --> 00:01:44,240 Speaker 2: is that you're dealing with, so it's very isolating. 30 00:01:44,560 --> 00:01:47,039 Speaker 1: Romi actually started his career as a combat medic in 31 00:01:47,040 --> 00:01:50,080 Speaker 1: the US Marine Corps, and he served in Afghanistan. His 32 00:01:50,200 --> 00:01:53,200 Speaker 1: unit responded to everything from backup requests from other units 33 00:01:53,200 --> 00:01:55,720 Speaker 1: to dealing with suicide bombings. When I met him in 34 00:01:55,760 --> 00:01:59,160 Speaker 1: twenty eighteen, he was my coworker at Vice on the security team. 35 00:01:59,360 --> 00:02:01,240 Speaker 1: At first, I just knew him as the guy who helped 36 00:02:01,320 --> 00:02:03,760 Speaker 1: organize training sessions for people who were going to go 37 00:02:03,840 --> 00:02:07,000 Speaker 1: report on dangerous conflict zones, and he'd tell you how 38 00:02:07,000 --> 00:02:09,560 Speaker 1: to stay safe out there. 
But as time went on, 39 00:02:09,760 --> 00:02:12,040 Speaker 1: even those of us who weren't traveling to war zones 40 00:02:12,120 --> 00:02:14,600 Speaker 1: realized that we also needed Romi's help. 41 00:02:15,160 --> 00:02:18,800 Speaker 2: The reality is danger can be present in very different ways, 42 00:02:18,800 --> 00:02:21,239 Speaker 2: and today I would say, actually, it's much more commonly 43 00:02:21,360 --> 00:02:25,399 Speaker 2: in the online sphere, because online harassment incidents are much 44 00:02:25,440 --> 00:02:28,720 Speaker 2: more common than, let's say, like a field incident. The 45 00:02:28,800 --> 00:02:32,239 Speaker 2: digital and online world are always there, and you 46 00:02:32,360 --> 00:02:36,520 Speaker 2: could argue that when your digital hygiene isn't 47 00:02:36,600 --> 00:02:40,480 Speaker 2: up to par, you're sort of unwillingly placing yourself in 48 00:02:40,520 --> 00:02:42,320 Speaker 2: a vulnerable situation. 49 00:02:44,000 --> 00:02:46,240 Speaker 1: Four in ten adults in the US have dealt with 50 00:02:46,320 --> 00:02:49,320 Speaker 1: some kind of online harassment, and that data is from 51 00:02:49,400 --> 00:02:52,120 Speaker 1: almost a decade ago. So what do you do when 52 00:02:52,120 --> 00:02:54,480 Speaker 1: it happens to you or to someone you care about? 53 00:02:54,760 --> 00:03:01,160 Speaker 1: And is there a way to prepare for it? From 54 00:03:01,200 --> 00:03:13,160 Speaker 1: Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm 55 00:03:13,200 --> 00:03:13,920 Speaker 1: Dexter Thomas. 56 00:03:15,600 --> 00:03:53,000 Speaker 3: I'm sorry, I'm sorry, Starr, goodbye. 57 00:03:57,000 --> 00:03:59,440 Speaker 1: How would you define online harassment? 
58 00:03:59,800 --> 00:04:01,400 Speaker 2: I mean, I think we know it when we see 59 00:04:01,400 --> 00:04:08,040 Speaker 2: it. Generally, it's like unwelcome engagement or interactions in the 60 00:04:08,120 --> 00:04:14,760 Speaker 2: online world that are intended to inflict harm, troll, intimidate. 61 00:04:15,040 --> 00:04:16,919 Speaker 1: For this episode, we're going to use the phrasing that 62 00:04:16,960 --> 00:04:20,679 Speaker 1: a human rights organization called PEN America uses. They define 63 00:04:20,680 --> 00:04:24,760 Speaker 1: online harassment as, quote, pervasive or severe targeting of an 64 00:04:24,760 --> 00:04:29,120 Speaker 1: individual or group online through harmful behavior. It's kind of 65 00:04:29,160 --> 00:04:31,840 Speaker 1: a broad definition, but that's because the kinds of things 66 00:04:31,880 --> 00:04:35,440 Speaker 1: that can happen to you online are pretty varied. 67 00:04:35,480 --> 00:04:39,520 Speaker 2: The range could be something as general as, this individual account seems 68 00:04:39,560 --> 00:04:42,320 Speaker 2: to pay a lot of attention to my posts, so 69 00:04:42,400 --> 00:04:45,760 Speaker 2: it could almost be something that's stalker-esque. Could be 70 00:04:45,800 --> 00:04:48,839 Speaker 2: someone who's just like a reply guy, you know, someone 71 00:04:48,880 --> 00:04:51,760 Speaker 2: who's always got something to say, all the way to 72 00:04:52,560 --> 00:04:56,279 Speaker 2: threats that are intended to essentially force people to stop 73 00:04:56,440 --> 00:05:00,640 Speaker 2: what they're doing or go offline. Sometimes it's like bot farms, 74 00:05:00,680 --> 00:05:02,960 Speaker 2: people that get paid to harass people. 75 00:05:04,320 --> 00:05:07,640 Speaker 1: Online harassment is a very old problem. It's probably almost 76 00:05:07,680 --> 00:05:10,559 Speaker 1: as old as the Internet itself. 
The first mainstream media 77 00:05:10,640 --> 00:05:14,200 Speaker 1: use of the word cyberbullying was over two decades ago. 78 00:05:14,400 --> 00:05:18,960 Speaker 1: But it looks like it's intensifying. That statistic that I 79 00:05:19,000 --> 00:05:21,440 Speaker 1: mentioned in the beginning of the episode, that in twenty seventeen, 80 00:05:21,520 --> 00:05:24,480 Speaker 1: close to half of US adults had experienced some form 81 00:05:24,520 --> 00:05:27,479 Speaker 1: of online harassment. When they did the same survey a 82 00:05:27,480 --> 00:05:30,919 Speaker 1: few years later, the ratios were about the same, but 83 00:05:31,200 --> 00:05:34,840 Speaker 1: the severity had increased. There were major percentage increases in 84 00:05:34,880 --> 00:05:39,160 Speaker 1: people reporting things like physical threats, sexual harassment, and stalking, 85 00:05:39,560 --> 00:05:44,320 Speaker 1: and in long, sustained harassment campaigns. But regardless of the 86 00:05:44,360 --> 00:05:46,640 Speaker 1: scale, this stuff can affect you in real life. 87 00:05:46,480 --> 00:05:51,240 Speaker 2: The stress response is very real, right, and like 88 00:05:51,760 --> 00:05:55,920 Speaker 2: this is I think where people can really underappreciate what 89 00:05:56,000 --> 00:05:58,760 Speaker 2: the impact of that is. It could be very destructive. 90 00:05:58,800 --> 00:06:02,720 Speaker 2: There are people that like leave industries, and today there 91 00:06:02,720 --> 00:06:05,240 Speaker 2: are a lot more instances and cases of like young 92 00:06:05,279 --> 00:06:08,280 Speaker 2: adults not just like going off of social media platforms 93 00:06:08,320 --> 00:06:13,920 Speaker 2: because they're harassed online or bullied, but like taking permanent measures. 
94 00:06:14,160 --> 00:06:16,719 Speaker 1: One study of kids age ten to thirteen found that 95 00:06:16,800 --> 00:06:21,560 Speaker 1: being targeted by cyberbullying was associated with suicidality. That includes 96 00:06:21,560 --> 00:06:25,440 Speaker 1: everything from suicidal thoughts up to actual attempts, with odds 97 00:06:25,480 --> 00:06:28,480 Speaker 1: more than four times higher. So the feelings that we 98 00:06:28,520 --> 00:06:32,160 Speaker 1: get when we're being abused online are very real, and 99 00:06:32,480 --> 00:06:35,520 Speaker 1: online harassment doesn't always just stay online. 100 00:06:36,240 --> 00:06:40,320 Speaker 2: What's always really concerned me is when online harassment shifts 101 00:06:40,320 --> 00:06:43,560 Speaker 2: over into doxxing, and people's addresses are posted, or 102 00:06:43,640 --> 00:06:46,960 Speaker 2: you know, someone takes a photo or shows a Google 103 00:06:47,040 --> 00:06:50,320 Speaker 2: Maps, like, street photo of their home. It's very, very 104 00:06:50,360 --> 00:06:55,120 Speaker 2: unsettling, because you could have someone that's initially like, I 105 00:06:55,200 --> 00:06:57,200 Speaker 2: hate you, you suck, and then it could be like, 106 00:06:57,680 --> 00:07:00,920 Speaker 2: you deserve to be harmed, you know, all the way 107 00:07:00,960 --> 00:07:03,640 Speaker 2: to, like, I've done research on you. I know where 108 00:07:03,680 --> 00:07:06,920 Speaker 2: you live, I know who your family members are, I 109 00:07:06,960 --> 00:07:09,600 Speaker 2: know you frequent these spaces, so like I know where 110 00:07:09,640 --> 00:07:11,160 Speaker 2: you go to school, I know where you go to work, 111 00:07:11,520 --> 00:07:12,720 Speaker 2: I know where you like to hang out. 112 00:07:13,160 --> 00:07:16,400 Speaker 1: Yeah, that's when it gets really scary. Yeah, but also 113 00:07:16,680 --> 00:07:20,480 Speaker 1: that sort of thing is getting easier to do. 
Like, 114 00:07:20,560 --> 00:07:24,440 Speaker 1: if I wanted to find where somebody lived, say, even 115 00:07:24,520 --> 00:07:26,280 Speaker 1: five years ago, I think it was a little bit 116 00:07:26,320 --> 00:07:29,960 Speaker 1: more difficult and I'd need a little bit more technical skill. 117 00:07:30,720 --> 00:07:32,880 Speaker 1: It's easier to do that now, which is to say 118 00:07:32,880 --> 00:07:36,920 Speaker 1: that I actually don't need as much motivation as I 119 00:07:37,000 --> 00:07:39,600 Speaker 1: used to have to have in order to find somebody's 120 00:07:39,640 --> 00:07:43,680 Speaker 1: information and dox them or send them a bunch of messages. 121 00:07:44,400 --> 00:07:47,640 Speaker 2: Yeah, like a simple backdrop of, like, the window that 122 00:07:47,760 --> 00:07:50,680 Speaker 2: looks out of your apartment building could be just an 123 00:07:50,720 --> 00:07:53,520 Speaker 2: easy way to, like, get a general guess as to 124 00:07:53,560 --> 00:07:56,600 Speaker 2: where you live. Right now, you've also got AI tools 125 00:07:56,680 --> 00:07:59,280 Speaker 2: and such that also make these things much easier today. 126 00:07:59,520 --> 00:08:02,320 Speaker 1: Yeah, totally right. I mean, you could get a picture 127 00:08:02,360 --> 00:08:04,840 Speaker 1: of somebody and there's a window in the background. You 128 00:08:04,880 --> 00:08:08,480 Speaker 1: could ask ChatGPT, hey, where is this? And probably, 129 00:08:08,640 --> 00:08:10,200 Speaker 1: you know, it'll give you a few guesses, and the 130 00:08:10,280 --> 00:08:11,520 Speaker 1: guesses might be correct. 131 00:08:12,080 --> 00:08:15,320 Speaker 2: Also, like, the reality, I don't want to fear monger 132 00:08:15,360 --> 00:08:19,760 Speaker 2: about this topic itself. 
Most instances of online harassment are, 133 00:08:20,080 --> 00:08:23,080 Speaker 2: you know, typically what you're going to see is an 134 00:08:23,160 --> 00:08:25,960 Speaker 2: uptick for a couple of days of harassment, and it generally 135 00:08:26,040 --> 00:08:29,240 Speaker 2: tends to die down. Most cases of online harassment don't 136 00:08:29,680 --> 00:08:32,080 Speaker 2: shift into physical risk. 137 00:08:33,440 --> 00:08:36,559 Speaker 1: But some of it does. In one survey, four percent 138 00:08:36,559 --> 00:08:39,640 Speaker 1: of Americans say that they've been personally doxxed, and sixteen 139 00:08:39,679 --> 00:08:42,120 Speaker 1: percent know a friend or family member who's been doxxed. 140 00:08:42,360 --> 00:08:45,240 Speaker 1: Doxxing used to feel like this fringe thing, but now 141 00:08:45,280 --> 00:08:47,240 Speaker 1: it's just something that some people do when they don't 142 00:08:47,320 --> 00:08:50,120 Speaker 1: like someone's opinion, and it's becoming more common for these 143 00:08:50,160 --> 00:08:53,480 Speaker 1: things to be organized. After Charlie Kirk was killed, an 144 00:08:53,480 --> 00:08:57,880 Speaker 1: anonymous website popped up called Exposed Charlie's Murderers that asked 145 00:08:58,000 --> 00:09:01,160 Speaker 1: people to submit personal information of anyone that they thought 146 00:09:01,320 --> 00:09:05,360 Speaker 1: wasn't being respectful enough about Charlie Kirk. The possible implication 147 00:09:05,480 --> 00:09:08,200 Speaker 1: here is that, hey, you don't have to do anything 148 00:09:08,200 --> 00:09:10,960 Speaker 1: with that information, but if someone else with a hotter 149 00:09:11,040 --> 00:09:14,959 Speaker 1: temper happens to see it and they live near that address, well, 150 00:09:15,679 --> 00:09:18,760 Speaker 1: not your problem. 
Vice President JD Vance did a guest 151 00:09:18,760 --> 00:09:22,240 Speaker 1: spot on Charlie Kirk's podcast, and, well, he didn't exactly 152 00:09:22,280 --> 00:09:25,640 Speaker 1: condemn the idea of using someone's personal information to punish 153 00:09:25,679 --> 00:09:28,520 Speaker 1: them for an opinion that he didn't like. So when 154 00:09:28,559 --> 00:09:32,320 Speaker 1: you see someone celebrating Charlie's murder, call them out, and hell, 155 00:09:32,480 --> 00:09:35,720 Speaker 1: call their employer. We don't believe in political violence, but 156 00:09:35,760 --> 00:09:39,559 Speaker 1: we do believe in civility. Again, doxxing is like an 157 00:09:39,600 --> 00:09:43,600 Speaker 1: informal, ad hoc collaboration with strangers that can lead to 158 00:09:43,640 --> 00:09:46,720 Speaker 1: potential violence. One guy puts the information out there, and 159 00:09:46,760 --> 00:09:49,600 Speaker 1: then someone else takes that information and does whatever the 160 00:09:49,640 --> 00:09:52,400 Speaker 1: first guy didn't have the nerve to do. Like swatting. 161 00:09:52,679 --> 00:09:55,680 Speaker 4: Alarming warning from the FBI this afternoon about the emerging 162 00:09:55,760 --> 00:09:59,640 Speaker 4: crime of swatting. A swatting incident is when someone targets 163 00:09:59,640 --> 00:10:02,040 Speaker 4: a victim by making a nine one one call 164 00:10:02,200 --> 00:10:05,640 Speaker 4: falsely claiming there's an emergency at the victim's home, and 165 00:10:05,679 --> 00:10:07,599 Speaker 4: the incidents can be dangerous. 166 00:10:07,840 --> 00:10:11,440 Speaker 5: The FBI and some former federal investigators warn this is 167 00:10:11,520 --> 00:10:15,720 Speaker 5: a danger to neighborhoods when there are swattings, if something 168 00:10:15,760 --> 00:10:18,320 Speaker 5: goes wrong when a false call is made and SWAT 169 00:10:18,320 --> 00:10:21,440 Speaker 5: teams show up at somebody's door, unknown to that person. 
170 00:10:21,480 --> 00:10:24,440 Speaker 5: In fact, one former FBI agent told CBS News, it 171 00:10:24,480 --> 00:10:25,360 Speaker 5: could turn deadly. 172 00:10:26,040 --> 00:10:29,439 Speaker 1: Swatting, unfortunately, has been common going back well over a decade, 173 00:10:29,600 --> 00:10:33,040 Speaker 1: so some police departments are better prepared for these fake calls, 174 00:10:33,520 --> 00:10:37,240 Speaker 1: but not all of them. So doxxing, harassment. These are 175 00:10:37,240 --> 00:10:39,920 Speaker 1: not new concepts, but something that is new in this 176 00:10:39,960 --> 00:10:43,560 Speaker 1: equation is AI. Just like it can make mundane tasks 177 00:10:43,679 --> 00:10:47,480 Speaker 1: like sorting files or writing emails easier, AI can make 178 00:10:47,559 --> 00:10:50,680 Speaker 1: harassing people a lot easier than it's ever been. 179 00:10:51,360 --> 00:10:55,000 Speaker 2: There are, like, certain ways that AI can be used that 180 00:10:55,440 --> 00:10:58,640 Speaker 2: are really important to consider, like deepfakes, something 181 00:10:58,640 --> 00:11:01,920 Speaker 2: of the type where an AI tool takes someone's face and, 182 00:11:02,040 --> 00:11:05,319 Speaker 2: like, superimposes it onto, like, an inappropriate picture or something. 183 00:11:05,480 --> 00:11:09,120 Speaker 2: One trend that I would say can be in line 184 00:11:09,160 --> 00:11:14,559 Speaker 2: with online harassment is, like, online extortion, specifically used to target 185 00:11:14,679 --> 00:11:18,640 Speaker 2: children and young adults. 
There are, like, scam centers that essentially 186 00:11:18,679 --> 00:11:21,880 Speaker 2: look someone up, find either an embarrassing photo of someone, 187 00:11:22,000 --> 00:11:24,920 Speaker 2: or take a photo of someone, superimpose it, make it 188 00:11:24,920 --> 00:11:27,560 Speaker 2: look like something else, and then they reach out and say, hey, 189 00:11:27,960 --> 00:11:31,280 Speaker 2: I have this really embarrassing thing about you, or I 190 00:11:31,360 --> 00:11:35,520 Speaker 2: hijacked your account and found these personal photos or something 191 00:11:35,559 --> 00:11:38,679 Speaker 2: of the sort. I'm gonna post them online unless you 192 00:11:38,720 --> 00:11:41,000 Speaker 2: give me X amount of money. 193 00:11:41,520 --> 00:11:44,200 Speaker 1: Deepfake revenge porn has been a problem for years, 194 00:11:44,360 --> 00:11:46,760 Speaker 1: and for a long time there was no real legal 195 00:11:46,800 --> 00:11:50,080 Speaker 1: recourse to address it in the US. But some things 196 00:11:50,120 --> 00:11:53,000 Speaker 1: are changing. Just this past May, a new law now 197 00:11:53,040 --> 00:11:56,760 Speaker 1: requires platforms to take down non-consensual intimate images, whether 198 00:11:56,840 --> 00:11:59,920 Speaker 1: they're real or AI-generated, if the person in them 199 00:12:00,160 --> 00:12:04,360 Speaker 1: requests it. But there are limitations here. That law can't necessarily 200 00:12:04,360 --> 00:12:06,760 Speaker 1: do much about deepfakes hosted on a service that's 201 00:12:06,800 --> 00:12:09,920 Speaker 1: outside of the US, or prevent people from, say, sending 202 00:12:09,960 --> 00:12:12,400 Speaker 1: deepfake nude pictures of you to your friends to 203 00:12:12,520 --> 00:12:15,360 Speaker 1: harass you faster than the services can take them down. 
204 00:12:15,920 --> 00:12:20,800 Speaker 2: I would say I've lost faith in social media platforms 205 00:12:20,880 --> 00:12:24,480 Speaker 2: being able to actually protect people. Social media platforms, I 206 00:12:24,520 --> 00:12:27,000 Speaker 2: would argue, are there to make money off of us, 207 00:12:27,080 --> 00:12:29,760 Speaker 2: you know, learn about our habits and sell that information 208 00:12:29,840 --> 00:12:32,640 Speaker 2: to advertisers, right? Like, it's a business at the end 209 00:12:32,679 --> 00:12:33,080 Speaker 2: of the day. 210 00:12:35,040 --> 00:12:38,720 Speaker 1: So if you can't really rely on platforms or the law, 211 00:12:39,200 --> 00:12:43,040 Speaker 1: what can you do if someone is harassing you online? 212 00:12:43,280 --> 00:12:57,120 Speaker 1: That's after the break. Let's say you wake up, look 213 00:12:57,160 --> 00:12:59,920 Speaker 1: at your phone, you see all these notifications, and you 214 00:13:00,120 --> 00:13:04,000 Speaker 1: realize there's a bunch of people harassing you. What do 215 00:13:04,040 --> 00:13:04,360 Speaker 1: you do? 216 00:13:05,240 --> 00:13:08,920 Speaker 2: So first and foremost, take a breath, right? Like, take 217 00:13:08,960 --> 00:13:11,760 Speaker 2: a moment. Am I safe? Am I good right now? 218 00:13:11,960 --> 00:13:16,720 Speaker 2: Because, like, online harassment isn't just digital, right? Like, immediately, 219 00:13:16,760 --> 00:13:19,680 Speaker 2: once you're being attacked, once someone is holding a magnifying 220 00:13:19,679 --> 00:13:24,240 Speaker 2: glass to you, it's also, like, mentally very taxing and 221 00:13:24,320 --> 00:13:27,839 Speaker 2: can be extremely draining. 
So it's important to, like, try 222 00:13:27,920 --> 00:13:30,559 Speaker 2: and pace yourself. As much as we're immediately going to 223 00:13:30,600 --> 00:13:33,920 Speaker 2: go into the sort of fight, flight, or freeze response, 224 00:13:34,679 --> 00:13:37,320 Speaker 2: be mindful that this has a mental impact on you. 225 00:13:37,559 --> 00:13:40,640 Speaker 2: But when it comes to, like, okay, I'm seeing these messages, 226 00:13:40,679 --> 00:13:42,360 Speaker 2: I don't know what's happening, what's going on? The first 227 00:13:42,400 --> 00:13:45,000 Speaker 2: thing I would personally do is, like, get a read 228 00:13:45,080 --> 00:13:48,520 Speaker 2: on the situation. So that's, like, going online to see 229 00:13:48,559 --> 00:13:51,320 Speaker 2: the types of messages you got, getting a sense of 230 00:13:51,360 --> 00:13:54,600 Speaker 2: how large this spans. Is this just on, like, 231 00:13:54,920 --> 00:13:59,120 Speaker 2: my Insta, my Snapchat, my TikTok, or is it all 232 00:13:59,160 --> 00:14:02,640 Speaker 2: the way into my work, my professional accounts? Are loved ones 233 00:14:02,720 --> 00:14:06,160 Speaker 2: being impacted? You could, on one hand, just be dealing 234 00:14:06,200 --> 00:14:09,120 Speaker 2: with, like, a post or something that blows up, goes 235 00:14:09,240 --> 00:14:12,640 Speaker 2: viral, all the way to, I'm getting emails that are 236 00:14:12,679 --> 00:14:15,720 Speaker 2: saying my account is, like, open on another browser or 237 00:14:15,840 --> 00:14:19,280 Speaker 2: laptop or phone. So yeah, initially, what you want to 238 00:14:19,320 --> 00:14:22,040 Speaker 2: do is, like, get a sense of how big the 239 00:14:22,080 --> 00:14:25,840 Speaker 2: footprint of the harassment is, and then through that, obviously, 240 00:14:26,000 --> 00:14:27,880 Speaker 2: you should be able to also get a sense of, 241 00:14:28,000 --> 00:14:31,440 Speaker 2: are these people that are just disagreeing with me? 
Are 242 00:14:31,440 --> 00:14:33,760 Speaker 2: they just trying to sort of insult me, make me 243 00:14:33,800 --> 00:14:36,480 Speaker 2: feel bad? Or is there, like, anything in there that 244 00:14:36,640 --> 00:14:39,840 Speaker 2: is truly concerning, right, like we talked about with doxxing? 245 00:14:40,280 --> 00:14:42,520 Speaker 2: And then along with that, right, like, I think some 246 00:14:42,600 --> 00:14:45,200 Speaker 2: things you have to consider. You know, am I going 247 00:14:45,280 --> 00:14:48,040 Speaker 2: to put my phone on silent? It's, like, blowing up nonstop. 248 00:14:48,320 --> 00:14:50,120 Speaker 2: Do I need to lie low? Do I need to 249 00:14:50,120 --> 00:14:52,600 Speaker 2: stop posting online if there's a lot more attention on 250 00:14:52,640 --> 00:14:54,400 Speaker 2: me all of a sudden? Do I need to lock 251 00:14:54,440 --> 00:14:57,560 Speaker 2: down my personal accounts that might have just been compromised? 252 00:14:57,800 --> 00:15:00,520 Speaker 2: Are authorities helpful in this situation? Are they going to 253 00:15:00,600 --> 00:15:01,280 Speaker 2: make things worse? 254 00:15:02,760 --> 00:15:06,160 Speaker 1: So step one is to step back. Remember when I 255 00:15:06,160 --> 00:15:07,960 Speaker 1: said that my friend had told me that it's all 256 00:15:07,960 --> 00:15:10,800 Speaker 1: over the internet? I had not seen any of this 257 00:15:10,840 --> 00:15:13,960 Speaker 1: stuff, because the two of us run in very different 258 00:15:14,000 --> 00:15:18,520 Speaker 1: online circles. It was not all over the internet, but 259 00:15:18,600 --> 00:15:21,040 Speaker 1: for them it was. And this is the same thing 260 00:15:21,120 --> 00:15:23,440 Speaker 1: with anyone else who's ever experienced this that I've talked 261 00:15:23,440 --> 00:15:26,760 Speaker 1: to about this. 
It really does feel like everyone in 262 00:15:26,800 --> 00:15:29,560 Speaker 1: the world is focused on you, like you can't even 263 00:15:29,600 --> 00:15:33,520 Speaker 1: go outside. This isn't you being irrational. I mean, yes, 264 00:15:33,640 --> 00:15:36,760 Speaker 1: of course, there is a logical difference in danger between 265 00:15:36,800 --> 00:15:38,240 Speaker 1: you sitting on the couch and getting a bunch of 266 00:15:38,280 --> 00:15:40,880 Speaker 1: angry messages and you sitting on the couch and somebody 267 00:15:40,920 --> 00:15:44,800 Speaker 1: walking in with a knife. But your brain might interpret 268 00:15:44,920 --> 00:15:48,240 Speaker 1: both as, yo, someone just entered your space and you 269 00:15:48,280 --> 00:15:50,960 Speaker 1: are now in danger. So those feelings are not fake, 270 00:15:51,200 --> 00:15:54,480 Speaker 1: they're real, and we should respect them. But realistically, if 271 00:15:54,480 --> 00:15:56,920 Speaker 1: you can take a breath in that moment, it is 272 00:15:57,120 --> 00:15:59,800 Speaker 1: only a very tiny corner of the Internet that is 273 00:15:59,840 --> 00:16:03,920 Speaker 1: interested in you, and it is almost certainly temporary. Sometimes 274 00:16:04,000 --> 00:16:06,520 Speaker 1: it feels like you have to be glued to your phone, 275 00:16:06,680 --> 00:16:10,120 Speaker 1: watching every comment, every message as it comes in. But 276 00:16:10,240 --> 00:16:13,360 Speaker 1: maybe walking away for a bit might help. 277 00:16:14,200 --> 00:16:16,360 Speaker 2: Typically, you'd want to do that after you make sure, 278 00:16:16,480 --> 00:16:19,720 Speaker 2: like, you changed your passwords, or, like, you checked and 279 00:16:19,760 --> 00:16:22,640 Speaker 2: made sure, like, you're not doxxed, it's just people being mean. 280 00:16:22,920 --> 00:16:25,400 Speaker 2: So, like, maybe then you take a breather, go spend 281 00:16:25,480 --> 00:16:28,760 Speaker 2: time with friends. 
From the sort of mental health, like, 282 00:16:28,840 --> 00:16:33,880 Speaker 2: mental hygiene perspective, endless doomscrolling is not necessarily good 283 00:16:33,960 --> 00:16:37,720 Speaker 2: for us, and especially not the type of constant, oh, 284 00:16:37,960 --> 00:16:40,040 Speaker 2: what's being said about me now? That type of thing. 285 00:16:40,360 --> 00:16:42,840 Speaker 2: One thing that I would also especially recommend when you 286 00:16:42,880 --> 00:16:45,360 Speaker 2: see these types of things is, I think there's always, 287 00:16:45,360 --> 00:16:48,080 Speaker 2: like, an instinct to just delete, delete, you know, like, 288 00:16:48,400 --> 00:16:50,840 Speaker 2: get rid of it, clear it all. But the issue 289 00:16:50,880 --> 00:16:53,520 Speaker 2: with that is that you're not then able to sort 290 00:16:53,560 --> 00:16:55,800 Speaker 2: of monitor or get a sense of the way things 291 00:16:55,800 --> 00:16:59,440 Speaker 2: are changing. And then also, like, from a legal perspective 292 00:17:00,040 --> 00:17:02,960 Speaker 2: of holding people accountable, it's also important to, like, not 293 00:17:03,200 --> 00:17:07,080 Speaker 2: get rid of that information. So don't just, like, delete 294 00:17:07,119 --> 00:17:10,199 Speaker 2: everything all of a sudden, because sometimes there's really, like, 295 00:17:10,280 --> 00:17:13,720 Speaker 2: important information in there that you might overlook. 
There are 296 00:17:13,840 --> 00:17:17,919 Speaker 2: ways that you can also limit the chatter and 297 00:17:17,960 --> 00:17:20,680 Speaker 2: the notifications and things like that, and, like, there are 298 00:17:20,720 --> 00:17:24,440 Speaker 2: ways that you could set filters or restrictions. Some platforms 299 00:17:24,720 --> 00:17:29,399 Speaker 2: have privacy settings that allow you to either block, which 300 00:17:29,440 --> 00:17:32,320 Speaker 2: is like completely out of sight; muting, meaning like you 301 00:17:32,320 --> 00:17:37,320 Speaker 2: don't constantly get notifications; or even restriction, like 302 00:17:37,400 --> 00:17:40,760 Speaker 2: some social media platforms have the ability to restrict someone, 303 00:17:40,840 --> 00:17:43,119 Speaker 2: and you can also set it to where, like, you 304 00:17:43,160 --> 00:17:45,040 Speaker 2: can see it only if you want to and are in the 305 00:17:45,040 --> 00:17:46,680 Speaker 2: headspace to see and deal with it. 306 00:17:47,000 --> 00:17:50,400 Speaker 1: Shout out to the mute function. Muting can be really useful. 307 00:17:50,680 --> 00:17:52,919 Speaker 1: You can still use Instagram or whatever to talk to 308 00:17:52,960 --> 00:17:55,320 Speaker 1: your friends without having to see accounts or posts or 309 00:17:55,359 --> 00:17:58,240 Speaker 1: notifications that are stressing you out. But in case you 310 00:17:58,320 --> 00:18:00,719 Speaker 1: do need to go back and show those posts to someone, 311 00:18:00,960 --> 00:18:05,159 Speaker 1: like maybe, say, worst case scenario, the police, it's still available. 312 00:18:05,600 --> 00:18:07,480 Speaker 1: On Instagram, you can mute a person so that you 313 00:18:07,520 --> 00:18:10,240 Speaker 1: don't see their posts or their stories. On Twitter, you 314 00:18:10,280 --> 00:18:14,240 Speaker 1: can mute accounts or even specific words or hashtags. Facebook, 315 00:18:14,280 --> 00:18:18,520 Speaker 1: Bluesky, TikTok. 
Most platforms have similar options, just like this. 316 00:18:18,960 --> 00:18:22,040 Speaker 2: Beyond the point of, like, taking a breath, like, pacing yourself: 317 00:18:22,119 --> 00:18:24,879 Speaker 2: like, use the resources and people around you, like 318 00:18:24,920 --> 00:18:28,160 Speaker 2: your family and your loved ones. You know, online harassment, 319 00:18:28,240 --> 00:18:31,080 Speaker 2: that type of thing can, like, feel very embarrassing, and 320 00:18:31,320 --> 00:18:33,959 Speaker 2: so because you're embarrassed, you don't talk about it. So 321 00:18:34,000 --> 00:18:36,720 Speaker 2: then when you don't talk about it, you're, like, isolating yourself. 322 00:18:37,000 --> 00:18:39,640 Speaker 2: So yeah, like, leaning into your social groups, whether that's, 323 00:18:39,680 --> 00:18:41,679 Speaker 2: like, family, loved ones, colleagues. 324 00:18:43,560 --> 00:18:46,080 Speaker 1: That statistic that I mentioned at the beginning, that almost 325 00:18:46,080 --> 00:18:48,840 Speaker 1: half of all American adults have experienced some kind of 326 00:18:48,880 --> 00:18:52,879 Speaker 1: online harassment. It sounds really bad, but there's actually, I 327 00:18:52,880 --> 00:18:55,520 Speaker 1: think, a silver lining. It means that you're not alone. 328 00:18:56,240 --> 00:18:58,399 Speaker 1: But this is where a lot of people get stuck, 329 00:18:58,600 --> 00:19:02,080 Speaker 1: by self-isolating. It is extremely important to talk to 330 00:19:02,119 --> 00:19:04,359 Speaker 1: people about what you're going through. Do not hide this. 331 00:19:05,040 --> 00:19:07,439 Speaker 1: People in your life might be able to help you out. 
332 00:19:07,640 --> 00:19:09,440 Speaker 1: Maybe your boss can let you work from home if 333 00:19:09,440 --> 00:19:11,560 Speaker 1: you need to, or maybe your friends can monitor your 334 00:19:11,560 --> 00:19:14,120 Speaker 1: social media for you while you take a quick break 335 00:19:14,160 --> 00:19:14,479 Speaker 1: from it. 336 00:19:15,119 --> 00:19:18,399 Speaker 2: I've had friends that reach out to me because we 337 00:19:18,440 --> 00:19:21,320 Speaker 2: work in the same industry and they're like, I'm expecting 338 00:19:21,359 --> 00:19:25,199 Speaker 2: some emails that are really mean and hurtful. Could you 339 00:19:25,240 --> 00:19:27,159 Speaker 2: just take a look at these really quickly, or like, 340 00:19:27,280 --> 00:19:29,119 Speaker 2: could you sit with me while I read through some 341 00:19:29,160 --> 00:19:32,119 Speaker 2: of these rather than like looking at it alone? They 342 00:19:32,160 --> 00:19:34,600 Speaker 2: could sit next to someone and like kind of laugh 343 00:19:34,640 --> 00:19:36,919 Speaker 2: about it, you know, and just kind of more so 344 00:19:37,040 --> 00:19:39,240 Speaker 2: put it in its place, in that sense. 345 00:19:39,240 --> 00:19:43,359 Speaker 1: I think a lot of times there's the desire to jump 346 00:19:43,400 --> 00:19:46,920 Speaker 1: in and respond to people. What do you think about that? 347 00:19:47,880 --> 00:19:51,600 Speaker 2: Yeah, I would say it definitely depends, right, because sometimes, 348 00:19:51,920 --> 00:19:55,920 Speaker 2: especially when you feel like you're overwhelmed by negative comments 349 00:19:56,000 --> 00:19:59,520 Speaker 2: or people kind of putting you in a negative light out of context, 350 00:19:59,600 --> 00:20:03,159 Speaker 2: it is important to set the record straight, say something. 351 00:20:03,880 --> 00:20:06,080 Speaker 2: But it very much depends, right.
I like to say 352 00:20:06,119 --> 00:20:08,960 Speaker 2: first and foremost, like, don't feed the trolls. In a 353 00:20:09,040 --> 00:20:12,160 Speaker 2: lot of instances, someone is looking to like get under 354 00:20:12,160 --> 00:20:15,800 Speaker 2: your skin. In certain instances, like, your responding is actually 355 00:20:15,960 --> 00:20:20,000 Speaker 2: encouraging people to harass you a bit more, and with online harassment, 356 00:20:20,240 --> 00:20:24,280 Speaker 2: you're having something harmful or damaging being done to you. 357 00:20:24,640 --> 00:20:27,399 Speaker 2: So it's also important to take power. But like the 358 00:20:27,440 --> 00:20:30,879 Speaker 2: point I argue is like taking power in a constructive 359 00:20:30,880 --> 00:20:34,240 Speaker 2: and helpful way rather than just engaging with a troll 360 00:20:34,359 --> 00:20:35,520 Speaker 2: or something of the sort. 361 00:20:35,800 --> 00:20:38,560 Speaker 1: If you're a friend who's watching someone deal with harassment, 362 00:20:38,600 --> 00:20:42,439 Speaker 1: the same thing applies. It might not be helpful to 363 00:20:42,480 --> 00:20:44,639 Speaker 1: try to argue in the comments on someone's behalf, and 364 00:20:44,680 --> 00:20:46,880 Speaker 1: it actually might be dangerous for you to do that. 365 00:20:47,240 --> 00:20:49,920 Speaker 1: Really, you can just ask the person, hey, yo, is 366 00:20:49,960 --> 00:20:52,480 Speaker 1: there anything I can do for you? And it might 367 00:20:52,520 --> 00:20:54,160 Speaker 1: be to just go over and hang out or bring 368 00:20:54,200 --> 00:20:57,119 Speaker 1: them dinner. I mean, seriously, it sounds kind of touchy feely, 369 00:20:57,280 --> 00:21:00,000 Speaker 1: but if you want to fight back against the keyboard warriors, 370 00:21:00,560 --> 00:21:03,480 Speaker 1: one of your best weapons is probably a well placed 371 00:21:03,520 --> 00:21:07,800 Speaker 1: pizza order.
So those are some concrete steps you can 372 00:21:07,840 --> 00:21:10,520 Speaker 1: take if you're dealing with some sudden hate comments or 373 00:21:10,520 --> 00:21:12,439 Speaker 1: some hate mail. But the best thing you can do 374 00:21:12,560 --> 00:21:16,960 Speaker 1: to protect yourself happens way before the harassment ever starts. 375 00:21:17,000 --> 00:21:18,760 Speaker 2: One of the most important things, like the best 376 00:21:18,800 --> 00:21:21,800 Speaker 2: way to deal with online harassment, is by initially preparing 377 00:21:21,840 --> 00:21:23,760 Speaker 2: for it, so like you can kind of front load 378 00:21:23,800 --> 00:21:25,959 Speaker 2: a lot of the work, and then ultimately when it 379 00:21:26,000 --> 00:21:28,880 Speaker 2: does come to harassment, you're set up. You're kind 380 00:21:28,880 --> 00:21:32,399 Speaker 2: of ready for it. You've established and set yourself in 381 00:21:32,440 --> 00:21:34,199 Speaker 2: a way where when it comes to the range of 382 00:21:34,359 --> 00:21:37,320 Speaker 2: what the actual level of disruption is like, it's going 383 00:21:37,400 --> 00:21:38,040 Speaker 2: to be minimal. 384 00:21:38,440 --> 00:21:40,760 Speaker 1: We'll get into how to do that after the break. 385 00:21:49,840 --> 00:21:53,320 Speaker 1: If somebody has ten minutes to do something that could 386 00:21:53,320 --> 00:21:56,600 Speaker 1: help lessen the damage if somebody does decide to bother 387 00:21:56,680 --> 00:21:59,680 Speaker 1: them in the future, what could they do? What would 388 00:21:59,720 --> 00:22:02,000 Speaker 1: you say? The first thing to do is take 389 00:22:01,800 --> 00:22:04,359 Speaker 2: a look at the privacy settings on your social media 390 00:22:04,400 --> 00:22:07,240 Speaker 2: profiles and set them in a way where you have 391 00:22:07,320 --> 00:22:10,600 Speaker 2: a little bit more control of like who follows you, 392 00:22:11,119 --> 00:22:13,359 Speaker 2: who can tag you, who can comment on your posts, 393 00:22:13,359 --> 00:22:15,800 Speaker 2: and things of that type. That's a very easy one, 394 00:22:15,960 --> 00:22:19,119 Speaker 2: you know. Look at the settings for messaging. Is my 395 00:22:19,320 --> 00:22:22,480 Speaker 2: phone number readily shared? If you pull up my profile, 396 00:22:22,800 --> 00:22:25,240 Speaker 2: can anyone DM me? Or like, do I have to 397 00:22:25,240 --> 00:22:27,719 Speaker 2: approve it? You know, like that type of thing. And 398 00:22:27,800 --> 00:22:29,800 Speaker 2: so that way you can kind of cut out a 399 00:22:29,800 --> 00:22:32,320 Speaker 2: lot of noise or reduce like what someone can actually 400 00:22:32,400 --> 00:22:34,560 Speaker 2: do to like your social media platforms. 401 00:22:36,080 --> 00:22:40,200 Speaker 1: Social media apps don't exactly optimize for you protecting your account. 402 00:22:40,240 --> 00:22:42,760 Speaker 1: I mean, why would they? If you did, then it 403 00:22:42,800 --> 00:22:45,240 Speaker 1: could reduce how many people you interact with, and that 404 00:22:45,280 --> 00:22:48,479 Speaker 1: could potentially mean less time spent on their app. So 405 00:22:48,680 --> 00:22:51,600 Speaker 1: they don't always make these settings as easy to find 406 00:22:51,640 --> 00:22:54,960 Speaker 1: as they probably could.
But in general, there are a 407 00:22:54,960 --> 00:22:56,720 Speaker 1: lot of settings that you can use to give you 408 00:22:56,760 --> 00:23:00,200 Speaker 1: more granular control over who can do what with your information, 409 00:23:00,520 --> 00:23:04,000 Speaker 1: and they're usually pretty easy to manage once you find them. Seriously, 410 00:23:04,040 --> 00:23:05,800 Speaker 1: you can do all this stuff on your phone right 411 00:23:05,840 --> 00:23:07,920 Speaker 1: now as you listen to the rest of this episode. 412 00:23:08,119 --> 00:23:10,040 Speaker 1: You can probably figure it out on your own, but 413 00:23:10,200 --> 00:23:12,120 Speaker 1: I'll also put some links in the show notes that'll walk 414 00:23:12,160 --> 00:23:15,159 Speaker 1: you through securing an account for Instagram, for TikTok, and 415 00:23:15,200 --> 00:23:15,560 Speaker 1: so on. 416 00:23:17,040 --> 00:23:20,560 Speaker 2: Just by toggling that switch in your privacy settings, you 417 00:23:20,680 --> 00:23:24,399 Speaker 2: already made yourself like a non-target for a scammer. 418 00:23:24,560 --> 00:23:27,000 Speaker 2: You know, you talk about the ten minutes. Something that 419 00:23:27,040 --> 00:23:30,159 Speaker 2: takes one second could be the difference between like you 420 00:23:30,280 --> 00:23:34,159 Speaker 2: being targeted or not, because like a criminal or a 421 00:23:34,200 --> 00:23:37,880 Speaker 2: harasser is always kind of looking for the easiest target 422 00:23:38,000 --> 00:23:39,119 Speaker 2: or low hanging fruit. 423 00:23:39,640 --> 00:23:41,679 Speaker 1: Maybe the best way to do this is, just for 424 00:23:41,720 --> 00:23:45,680 Speaker 1: a second, pretend that you are the opps. Put yourself 425 00:23:45,680 --> 00:23:47,639 Speaker 1: in the mind of someone who wants to find you 426 00:23:48,000 --> 00:23:50,480 Speaker 1: or hurt you. How would you go about it?
427 00:23:51,119 --> 00:23:54,640 Speaker 2: Imagine going to your social media profile and like look 428 00:23:54,720 --> 00:23:56,960 Speaker 2: at it from the perspective of someone who might harass 429 00:23:57,040 --> 00:23:59,400 Speaker 2: you and being like, is there anything that people can 430 00:24:00,040 --> 00:24:02,520 Speaker 2: see here or use here that could be like used against me? 431 00:24:03,040 --> 00:24:07,440 Speaker 1: Googling yourself. Yeah, I've heard this advice, like, google yourself, 432 00:24:08,160 --> 00:24:13,280 Speaker 1: see what you can find on just public services, and 433 00:24:13,320 --> 00:24:16,920 Speaker 1: if you can find your own phone number, maybe there's 434 00:24:16,960 --> 00:24:18,040 Speaker 1: something you can do about that. 435 00:24:18,720 --> 00:24:21,439 Speaker 2: Yeah, and I think that really can vary because, like, 436 00:24:21,520 --> 00:24:24,840 Speaker 2: for example, in the US, privacy laws are really terrible. 437 00:24:24,880 --> 00:24:27,280 Speaker 2: Like you could, you know, if you registered to vote, 438 00:24:27,400 --> 00:24:30,240 Speaker 2: there's probably an address for you out there, right, like, 439 00:24:30,320 --> 00:24:33,320 Speaker 2: or a phone number that was like scraped online, whereas 440 00:24:33,400 --> 00:24:35,359 Speaker 2: like if you live in Europe or somewhere else, like 441 00:24:35,400 --> 00:24:37,720 Speaker 2: it might be more difficult to get that information. And 442 00:24:37,800 --> 00:24:40,840 Speaker 2: so yeah, it's almost like a self-doxing exercise, looking 443 00:24:40,880 --> 00:24:43,440 Speaker 2: at it from the perspective of, when people look me up, 444 00:24:43,840 --> 00:24:45,760 Speaker 2: how can they find me? What can they see about me?
445 00:24:46,080 --> 00:24:49,560 Speaker 1: What do you think about the services that will go 446 00:24:49,600 --> 00:24:54,240 Speaker 1: and delete your information from those kinds of online repositories, 447 00:24:54,280 --> 00:24:55,640 Speaker 1: those collections of information? 448 00:24:56,200 --> 00:24:58,760 Speaker 2: I mean, especially if you work in an industry where 449 00:24:58,800 --> 00:25:01,920 Speaker 2: you're prone to have, like, a target on you, I would highly 450 00:25:01,920 --> 00:25:04,240 Speaker 2: recommend considering some of those services. 451 00:25:04,480 --> 00:25:07,160 Speaker 1: There's a lot of these services. There's DeleteMe, Optery, 452 00:25:07,400 --> 00:25:11,280 Speaker 1: and Incogni, basically services that will remove your personal data 453 00:25:11,440 --> 00:25:14,600 Speaker 1: for you from these databases. That includes stuff like your address, 454 00:25:14,680 --> 00:25:17,240 Speaker 1: your email, your phone number, all that stuff that might 455 00:25:17,240 --> 00:25:20,600 Speaker 1: be floating out there online. These services do cost money, 456 00:25:20,680 --> 00:25:23,320 Speaker 1: but one thing you could try is asking your employer 457 00:25:23,359 --> 00:25:26,000 Speaker 1: if they'll pay for it for you, and if they don't, 458 00:25:26,280 --> 00:25:28,600 Speaker 1: I mean, send them the episode and make the case 459 00:25:28,680 --> 00:25:30,560 Speaker 1: that maybe they should. 460 00:25:30,840 --> 00:25:34,080 Speaker 2: A lot of, like, the things that services like 461 00:25:34,119 --> 00:25:37,240 Speaker 2: DeleteMe do are things you can do for free, 462 00:25:37,400 --> 00:25:40,080 Speaker 2: to make it more accessible. If you can't dish out one 463 00:25:40,160 --> 00:25:43,440 Speaker 2: hundred dollars, you could request that information be taken offline, 464 00:25:43,800 --> 00:25:45,879 Speaker 2: but it's like you have to do it individually.
It 465 00:25:45,920 --> 00:25:47,000 Speaker 2: just takes a lot more work. 466 00:25:47,720 --> 00:25:49,520 Speaker 1: And look, I get it. Maybe your boss doesn't want 467 00:25:49,560 --> 00:25:52,199 Speaker 1: to play ball, or maybe money's just tight, or you 468 00:25:52,240 --> 00:25:54,560 Speaker 1: want to go the DIY route. So I'll put a 469 00:25:54,560 --> 00:25:56,119 Speaker 1: link in the show notes to the site that the 470 00:25:56,119 --> 00:25:58,960 Speaker 1: Freedom of the Press Foundation put together. It's called, you 471 00:25:59,040 --> 00:26:02,399 Speaker 1: can't make this up, it's called the Big Ass Data 472 00:26:02,400 --> 00:26:04,959 Speaker 1: Broker Opt-Out List, and it will show you how 473 00:26:04,960 --> 00:26:08,320 Speaker 1: to send those takedown requests on your own. You can 474 00:26:08,359 --> 00:26:11,240 Speaker 1: also go through old social media posts and delete anything 475 00:26:11,240 --> 00:26:14,440 Speaker 1: with personal information. Like, let's say you once posted something 476 00:26:14,480 --> 00:26:17,520 Speaker 1: about moving into your new apartment, or, god forbid, those 477 00:26:17,560 --> 00:26:21,520 Speaker 1: pictures that people keep posting holding their keys outside in 478 00:26:21,520 --> 00:26:23,639 Speaker 1: front of the house that they just bought. Please do 479 00:26:23,720 --> 00:26:27,760 Speaker 1: not do that. That is prime doxing material. And there's 480 00:26:27,800 --> 00:26:30,639 Speaker 1: absolutely no reason to keep that online.
And if you 481 00:26:30,720 --> 00:26:32,480 Speaker 1: use Twitter, and let's say you don't want to have 482 00:26:32,560 --> 00:26:34,720 Speaker 1: somebody dig something up you said about Trump back in 483 00:26:34,720 --> 00:26:37,679 Speaker 1: twenty sixteen, you can use the advanced search feature and 484 00:26:37,760 --> 00:26:41,080 Speaker 1: look for keywords. Or, a past guest of the show, 485 00:26:41,240 --> 00:26:45,600 Speaker 1: Micah Lee, helped make this tool called Cyd dot social 486 00:26:45,920 --> 00:26:47,880 Speaker 1: that will go back through your old tweets and help 487 00:26:47,920 --> 00:26:50,879 Speaker 1: you decide what you want to take down. And in 488 00:26:50,920 --> 00:26:52,760 Speaker 1: the future you can just be a little bit more 489 00:26:52,760 --> 00:26:57,080 Speaker 1: discerning about what you share online. It is okay to 490 00:26:57,240 --> 00:27:00,720 Speaker 1: lie to the computer. So for example, that field in 491 00:27:00,760 --> 00:27:02,640 Speaker 1: your social media account where it tells you to put 492 00:27:02,640 --> 00:27:06,320 Speaker 1: your location? Lie. Like, I don't use Twitter anymore, but 493 00:27:06,320 --> 00:27:08,600 Speaker 1: if you look at my account, it says that I'm 494 00:27:08,640 --> 00:27:11,359 Speaker 1: in Tokyo right now, and just between me and you, 495 00:27:12,119 --> 00:27:13,520 Speaker 1: I'm not in Tokyo right now. 496 00:27:14,240 --> 00:27:17,240 Speaker 2: A lot of it is just like creating the habits 497 00:27:17,320 --> 00:27:20,520 Speaker 2: to make it harder for you to be a target 498 00:27:20,640 --> 00:27:23,800 Speaker 2: or like less vulnerable. So like being much more mindful 499 00:27:23,800 --> 00:27:26,959 Speaker 2: about posting about loved ones.
You know, am I posting 500 00:27:27,000 --> 00:27:30,080 Speaker 2: anything that like directly shows where I live or like 501 00:27:30,160 --> 00:27:32,520 Speaker 2: places I like to hang out or spend time at 502 00:27:32,560 --> 00:27:35,240 Speaker 2: pretty routinely? If you do want to post and share 503 00:27:35,280 --> 00:27:39,240 Speaker 2: something online, just remember that like when you post or 504 00:27:39,359 --> 00:27:42,920 Speaker 2: share something, it's ultimately like always there. 505 00:27:42,560 --> 00:27:46,040 Speaker 1: Nobody's telling you that you can't post stuff online 506 00:27:46,160 --> 00:27:48,399 Speaker 1: if you really want to. There's always the option of 507 00:27:48,440 --> 00:27:51,760 Speaker 1: making a finsta, a fake Instagram account or TikTok account 508 00:27:51,800 --> 00:27:55,680 Speaker 1: or whatever, basically making another account on any app aside 509 00:27:55,720 --> 00:27:58,239 Speaker 1: from the one that is more public facing. And on 510 00:27:58,240 --> 00:27:59,880 Speaker 1: that one you can be a little bit more personal, 511 00:28:00,000 --> 00:28:02,280 Speaker 1: and you can only give that account out to family 512 00:28:02,359 --> 00:28:06,399 Speaker 1: and close friends. And this separation can extend beyond social media. 513 00:28:06,960 --> 00:28:09,520 Speaker 2: You can even make a Google Voice number or any 514 00:28:09,560 --> 00:28:11,439 Speaker 2: other type of like service where you can get a 515 00:28:11,440 --> 00:28:15,679 Speaker 2: free digital number that you can sort of proverbially like 516 00:28:15,920 --> 00:28:18,240 Speaker 2: plug and unplug. So that way, like if you do 517 00:28:18,320 --> 00:28:20,159 Speaker 2: get harassed and you need some like quiet time, 518 00:28:20,320 --> 00:28:21,280 Speaker 2: you can do that. 519 00:28:21,760 --> 00:28:24,320 Speaker 1: I do this.
I have a normal number for friends 520 00:28:24,359 --> 00:28:26,560 Speaker 1: and family, and I have another number that I give 521 00:28:26,600 --> 00:28:29,480 Speaker 1: out for work use. And there's free services for stuff 522 00:28:29,520 --> 00:28:32,680 Speaker 1: like this, like Google Voice is probably the most common. Also, 523 00:28:32,760 --> 00:28:36,280 Speaker 1: you don't have to have two physical phones. Whichever number 524 00:28:36,320 --> 00:28:39,160 Speaker 1: someone calls, they both go to the same phone. And 525 00:28:39,240 --> 00:28:41,080 Speaker 1: the cool thing about this is, if I need to 526 00:28:41,120 --> 00:28:43,840 Speaker 1: deactivate that public number for a while, I can do that, 527 00:28:43,960 --> 00:28:45,800 Speaker 1: or if it really got bad, I just throw it 528 00:28:45,800 --> 00:28:47,520 Speaker 1: out and get a new one. And if you really 529 00:28:47,560 --> 00:28:49,920 Speaker 1: want to get secure, you can use Signal and turn 530 00:28:49,960 --> 00:28:53,120 Speaker 1: on the option to hide your phone number. So let's 531 00:28:53,120 --> 00:28:55,240 Speaker 1: talk about passwords and things like that. What do you 532 00:28:55,280 --> 00:28:56,120 Speaker 1: recommend people do? 533 00:28:57,080 --> 00:29:01,200 Speaker 2: Yeah, I mean, first and foremost, go to kill switch's 534 00:29:01,280 --> 00:29:05,520 Speaker 2: episode on password managers. No, like, seriously though, like, a 535 00:29:05,560 --> 00:29:08,360 Speaker 2: password manager is really great, that way like your accounts 536 00:29:08,360 --> 00:29:09,640 Speaker 2: are more locked down. 537 00:29:10,560 --> 00:29:12,160 Speaker 1: If you want to check out our episode about that, 538 00:29:12,200 --> 00:29:13,920 Speaker 1: there's a link in the show notes for that one, 539 00:29:14,000 --> 00:29:16,800 Speaker 1: but here's a short version.
First, make sure that your 540 00:29:16,800 --> 00:29:19,640 Speaker 1: email account has a good password that isn't used in 541 00:29:19,800 --> 00:29:22,840 Speaker 1: any other login, and then also get a password manager, 542 00:29:22,880 --> 00:29:25,239 Speaker 1: and some of those can be free. On top of that, 543 00:29:25,400 --> 00:29:28,480 Speaker 1: whenever you can, use two-factor authentication or a passkey. 544 00:29:29,440 --> 00:29:33,120 Speaker 1: So all this stuff, setting up a new account, doxing yourself, 545 00:29:33,400 --> 00:29:36,320 Speaker 1: deleting all your personal information, and taking back control over 546 00:29:36,320 --> 00:29:38,920 Speaker 1: the information that is already out there, it might feel 547 00:29:38,920 --> 00:29:41,360 Speaker 1: a little bit overwhelming. But you don't have to do 548 00:29:41,400 --> 00:29:44,320 Speaker 1: all this stuff at once. Romi recommends doing it a 549 00:29:44,320 --> 00:29:45,320 Speaker 1: little bit at a time. 550 00:29:46,760 --> 00:29:49,120 Speaker 2: If you could take like five minutes to do one 551 00:29:49,160 --> 00:29:52,400 Speaker 2: small thing like once a week for a few weeks, 552 00:29:52,720 --> 00:29:54,959 Speaker 2: you could be really locked down in the long term. 553 00:29:55,320 --> 00:29:58,920 Speaker 2: You don't have to spend an hour, you know, like 554 00:29:59,120 --> 00:30:01,720 Speaker 2: five hours at a time, just do it in little bits, 555 00:30:01,960 --> 00:30:05,240 Speaker 2: and like that makes it a lot more manageable. It's 556 00:30:05,360 --> 00:30:08,320 Speaker 2: way better to like do those things at a leisurely 557 00:30:08,440 --> 00:30:11,440 Speaker 2: pace where you're not like pressured or stressed, like before 558 00:30:11,520 --> 00:30:13,880 Speaker 2: a situation comes up. You do it in little bits.
559 00:30:14,200 --> 00:30:17,040 Speaker 2: Let's say it hits the fan, right? Like, you're gonna 560 00:30:17,040 --> 00:30:20,440 Speaker 2: be under way more stress and pressure. You might already 561 00:30:20,480 --> 00:30:23,560 Speaker 2: have information that's been compromised, and now all of a sudden, 562 00:30:24,040 --> 00:30:26,520 Speaker 2: not only are you trying to like get the basics done, 563 00:30:26,520 --> 00:30:30,080 Speaker 2: but then like do way more to like keep an 564 00:30:30,080 --> 00:30:32,920 Speaker 2: eye on what the harassment looks like. So it's like 565 00:30:33,080 --> 00:30:35,400 Speaker 2: way, way more work to do. It's better to like 566 00:30:35,520 --> 00:30:38,360 Speaker 2: do some of that stuff very naturally over time at 567 00:30:38,440 --> 00:30:41,280 Speaker 2: your own pace, and then also just like have 568 00:30:41,320 --> 00:30:43,800 Speaker 2: a plan, like think about, when I'm under stress, like 569 00:30:44,040 --> 00:30:45,680 Speaker 2: what is it that I can do to take care 570 00:30:45,720 --> 00:30:49,280 Speaker 2: of myself on an individual level? And like who can 571 00:30:49,320 --> 00:30:51,040 Speaker 2: I reach out to if a problem comes up? 572 00:30:52,720 --> 00:30:55,200 Speaker 1: All the things that I mentioned this episode, all those tools, 573 00:30:55,240 --> 00:30:58,040 Speaker 1: all those guides, they're in the show notes, in the description. 574 00:30:58,480 --> 00:31:00,880 Speaker 1: But if you just look at one, I'm just gonna 575 00:31:00,880 --> 00:31:06,920 Speaker 1: tell you this: Online Harassment Field Manual dot pen dot org. 576 00:31:06,920 --> 00:31:09,800 Speaker 1: Imma say it one more time. Online Harassment Field 577 00:31:09,840 --> 00:31:13,560 Speaker 1: Manual dot pen dot org. That site has a lot 578 00:31:13,600 --> 00:31:15,840 Speaker 1: more guides that can work for just about everyone in 579 00:31:16,000 --> 00:31:19,000 Speaker 1: just about every situation you can imagine, whether you're being 580 00:31:19,040 --> 00:31:21,680 Speaker 1: targeted right now or you're worried about it happening to 581 00:31:21,760 --> 00:31:24,800 Speaker 1: you or someone else in the future. And it covers 582 00:31:24,800 --> 00:31:27,040 Speaker 1: all the stuff that we've covered and a lot more. 583 00:31:28,120 --> 00:31:31,080 Speaker 2: Most people don't have to go, like, you know, one 584 00:31:31,200 --> 00:31:35,120 Speaker 2: hundred percent DEFCON, I'm gonna lock everything down, you're 585 00:31:35,160 --> 00:31:37,240 Speaker 2: not gonna be able to reach me. Most people don't 586 00:31:37,280 --> 00:31:39,880 Speaker 2: need that. But I think you just have to think about, 587 00:31:40,000 --> 00:31:44,000 Speaker 2: like, for yourself, are you more vulnerable because of the 588 00:31:44,080 --> 00:31:45,280 Speaker 2: work you do, who you are? 589 00:31:45,920 --> 00:31:46,080 Speaker 2: It's 590 00:31:46,080 --> 00:31:49,880 Speaker 2: very reassuring, so many people have dealt with this 591 00:31:49,960 --> 00:31:52,360 Speaker 2: type of thing before, and like most people get through 592 00:31:52,400 --> 00:31:56,360 Speaker 2: it just fine. For anyone that's harassed, like, know that 593 00:31:56,480 --> 00:31:59,680 Speaker 2: you're not alone. Like, a lot of people are ready 594 00:32:00,400 --> 00:32:03,160 Speaker 2: to step in and support you and not judge you. 595 00:32:03,320 --> 00:32:06,480 Speaker 2: So it's like critical that you don't isolate yourself. 596 00:32:11,120 --> 00:32:12,960 Speaker 1: Shout out to Romi, shout out to the Freedom of 597 00:32:12,960 --> 00:32:15,520 Speaker 1: the Press Foundation, and shout out to everyone who's put 598 00:32:15,560 --> 00:32:18,880 Speaker 1: out materials to help us be safer online, just in our 599 00:32:18,920 --> 00:32:23,800 Speaker 1: everyday lives. And Happy New Year.
Thank you so much 600 00:32:23,800 --> 00:32:26,480 Speaker 1: for listening to another episode of kill switch. You want 601 00:32:26,520 --> 00:32:28,560 Speaker 1: to talk to us, you can email us at kill 602 00:32:28,600 --> 00:32:33,200 Speaker 1: switch at Kaleidoscope dot NYC, or we're also on Instagram 603 00:32:33,240 --> 00:32:36,400 Speaker 1: at kill switch pod. If you dug this one, and 604 00:32:36,480 --> 00:32:39,320 Speaker 1: hopefully you did, wherever you're listening to these podcasts, you know, 605 00:32:39,480 --> 00:32:42,040 Speaker 1: leave us a review. It helps other people find the show, 606 00:32:42,120 --> 00:32:45,040 Speaker 1: which in turn helps us keep doing our thing. Kill 607 00:32:45,040 --> 00:32:48,200 Speaker 1: switch is hosted by me, Dexter Thomas. It's produced by 608 00:32:48,200 --> 00:32:52,080 Speaker 1: Shena Ozaki, Darluck Potts, and Julia Nutter. Our theme 609 00:32:52,120 --> 00:32:55,000 Speaker 1: song is by me and Kyle Murdoch, and Kyle also 610 00:32:55,040 --> 00:32:59,080 Speaker 1: mixed the show. From Kaleidoscope, our executive producers are Oz Woloshyn, 611 00:32:59,200 --> 00:33:02,160 Speaker 1: Mangesh Hattikudur, and Kate Osborne. From 612 00:33:02,200 --> 00:33:05,920 Speaker 1: iHeart, our executive producers are Katrina Norvell and Nikki Ettore. 613 00:33:06,720 --> 00:33:08,400 Speaker 1: Catch you on the next one. 614 00:33:10,280 --> 00:33:10,800 Speaker 3: Goodbye.