Speaker 1: Cool Zone Media. Hello, and welcome back to It Could Happen Here, a show that is no longer hypothetical now that it is happening here. I'm your occasional host, Molly Conger, and today I just want to talk to you a little bit about your online security. It's a hot topic right now for obvious reasons, and this won't be a comprehensive overview of the subject by any means. I'm sure there will be more episodes in the future covering specific angles on this in more depth, but today I just want to touch on some basics, especially for people who may be asking themselves some of these questions for the first time. This is more of a mental framework and a pep talk. The main message here is: don't freak out. I'm not saying the situation isn't serious or your concerns aren't real. It's very serious. But freaking out is not going to do you any good. And if you're looking for complicated, high-tech solutions to the very real anxiety that you're feeling right now, this episode doesn't have them. That's not what I have for you today. And I know a lot of people have really specific concerns about apps they might be using to track their menstrual cycles or fertility, and we're not going to touch on that today, because I think it's a topic that deserves its own episode, an episode where I talk to an actual expert. So I'm hoping to get that out next month.

So what are we talking about? The answer is pretty simple: calming down and shutting up. That's right, it's only Thursday when this airs, but it is always Shut the Fuck Up Friday in our hearts, because the main source of the risks you can do something about is your own mouth. Because here's the thing: I'm not an expert on digital security. I'm not a computer programmer or a hacker. I had to call our producer Daniel one time because I went to record an episode and my little recording device said no, and I almost cried. And it turned out I accidentally slid the little tab on the data card that locks it.
Speaker 1: I don't know. But what I do know a lot about is how to exploit someone else's lack of digital security. If you're a listener to my show Weird Little Guys, you know that I kind of have a knack for finding out everything there is to know about a guy. So what I can offer you is a sort of reverse-engineered guide to staying safe online, from someone like me, but evil.

I like to tell people that you should be thinking of your digital security kind of like your health. People are going to have different risk factors, different vulnerabilities, different concerns, different goals. If you're undocumented or on a student or work visa, the risks and possible consequences for you are very different. If you're queer or trans, or a person of color, your risk profile looks different. If you're economically dependent on family members whose politics don't align with yours, your risk profile is different. If you have a criminal record, if you work in a field where your political activity is a significant threat to your continued employment, if you're running for office, if you have a security clearance, if you have children or vulnerable family, these are all different vulnerabilities, and you're going to have specific concerns that are unique to you. And this isn't meant to address those specific risk scenarios. But just like people who may have different risk factors when it comes to their health, everyone can benefit from the basics. You know, no matter who you are, you have to wash your hands.

And when it comes to digital security, a lot of people want to jump right to the exciting, complicated technical fixes. You know, they want the Kim Kardashian full-body MRI equivalent of being safe online. People want to talk about buying burner phones and getting a Faraday bag and evading high-tech surveillance, but they're not washing their hands. People love to say they're going to buy a burner phone.
Speaker 1: But if you go to Walmart and you buy a burner phone, and you put your credit card into the machine that is recording a video of your face, and then you take that phone home and turn it on inside your house next to your real phone, you've done nothing but waste your time and money. So we're not talking about solutions like that. What we are talking about is boring, unsexy, basic stuff that everybody can and should be doing before they jump into the deep end, if you choose to go that route. Because I'm not saying you shouldn't worry about more advanced threats; I'm just saying you have to start here.

So before you can figure out how to mitigate a risk, you have to nail down what that risk actually is. What is the outcome that you're hoping to avoid? There's a lot of anxiety right now about unknowable possibilities, and it's really easy to get overwhelmed by the what-ifs of a worst-case scenario, and then you just end up feeling really helpless. And look, yeah, there are some potential threats here that I don't have the tools to help you address. But that doesn't mean you shouldn't be taking the steps that are within your control right now. You have to fight off that feeling of helplessness.

So what we're talking about here is threat modeling. I gave a little workshop a few months ago about digital security, and the first thing I asked the group was: what is the bad thing that you're worried will happen? And most people's answer to that was that they're worried about getting doxxed. Okay, that's fair, that's a valid fear. But what do you mean by that? What specifically is the piece of information you are worried someone will discover? Is it your name, your address, where you work? Is it connecting two pieces of your online identity that you thought were separate? Doxxing can mean a lot of things to different people in different contexts, and it can happen in degrees, right? Like, you know my full legal name.
Speaker 1: I'm doxxed, to whatever extent that means anything. But this could still happen to me. Someone could still discover a piece of information about me that I wish they didn't have. And most people can't become completely anonymous. I can't help you do that, and honestly, I don't think that should be most people's goal. Don't disappear. I'm not telling you you should disappear. This is just about figuring out what makes sense for you and what you can do to navigate the landscape that you've chosen to operate in.

So, what is the actual negative outcome, specifically, that is making you feel afraid? What is the concrete thing that you are thinking about when you experience that fear? People's answers tend to be that they're worried about getting harassed, they're worried about their physical safety, they're worried about negative fallout at work or at school. People's fears tend to be about things like getting arrested, getting sued, getting fired, getting hurt, and getting embarrassed. And so the next question is: can you identify the potential sources for the kinds of harm you're worried about? You can sort these into a few primary categories. The state can harm you. That's the police, the government. You can get charged with a crime. Institutions can harm you. If you're a student, you can get in trouble at school. If you have some kind of professional license, people could file complaints against you. Politicians and organized political groups can harm you. You know, Marjorie Taylor Greene might tweet your TikTok video, or Canary Mission might do a blog post about where you work. And right-wing groups can harm you. You might get targeted harassment from some Nazi Telegram channel. In the worst-case scenario, maybe you're physically threatened or attacked by an extremist group. You could get swatted. And then there's just this sort of wild card of random strangers and Internet mobs, and the way they factor into and exacerbate all of the above scenarios.
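To make that sorting exercise concrete, here is a minimal sketch of what a written-down threat model can look like, just a worksheet in code form. Every threat, likelihood, and mitigation below is an invented example, not anyone's real risk profile.

```python
from dataclasses import dataclass

# A minimal threat-modeling worksheet: name the concrete bad outcome,
# the source of harm, your honest guess at likelihood, and a mitigation
# that is actually within your control. All entries are made-up examples.

@dataclass
class Threat:
    outcome: str      # the concrete bad thing you're afraid of
    source: str       # state / institution / political group / random mob
    likelihood: str   # "high", "medium", or "low"
    mitigation: str   # a step you can take today

my_threats = [
    Threat("fired over old posts", "institution", "medium",
           "set old accounts to private; separate work and personal handles"),
    Threat("harassed by a Telegram channel", "right-wing group", "low",
           "opt out of data-broker sites; stop reusing usernames"),
    Threat("swatted", "random stranger", "low",
           "get your home address off people-search sites"),
]

# Work the likely stuff first instead of freezing on the worst case.
priority = {"high": 0, "medium": 1, "low": 2}
for t in sorted(my_threats, key=lambda t: priority[t.likelihood]):
    print(f"[{t.likelihood:>6}] {t.outcome} ({t.source}) -> {t.mitigation}")
```

The point of writing it down is the sorting itself: once the fears are rows on a page instead of a cloud of dread, the ones you can actually act on become obvious.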
Speaker 1: When it comes to harm from the state, that's beyond what we're talking about with this digital hand-washing metaphor. A lot of the prevention steps you can take today are still going to help you. They're still worth taking. But at the end of the day, if the government wants to know who runs a Twitter account, who drove to a protest, who supported a movement, who donated money, that's beyond the basics. Most of what I have direct experience with are just these basic measures that you can take today to make it a little bit harder for the average weird little guy to get into your business. It'll stop the average online troll, it'll slow down a decent sleuth, but it's not the kind of stuff that stops a guy with a warrant.

Think of protecting your online identity like being inside your house. If you have no curtains, someone walking down the street can see you, even if they didn't go out of their way to look. If you're putting everything out there with no thought to digital security, somebody could dox you without even trying, just like they would be able to see in through your windows from the street. Somebody who is a little more curious about you might walk into your yard. But if you put up a fence, maybe that person will decide this isn't really worth my time. Somebody who loves peeping in windows and really wants to see you, he's gonna hop your fence, right? But the average troll will see these barriers and they'll get bored. But again, curtains, a fence, a locked door, a guard dog, these don't stop a guy with a warrant. So we're talking about just putting up barriers that slow down and discourage the average low- to mid-level weirdo. In short: delete your Facebook, set your accounts to private, use Signal, put a passcode on your phone, say less, and try to do something about the data brokers. Let's break these down one at a time.
Speaker 1: I'm sure it's been talked about on this show before, but I tell everyone in my life: download Signal. Download Signal. It's free, put it on your phone. It's just an encrypted messaging app, and I use it by default, pretty much exclusively, in place of regular texting, just because it's easier for me to have everything in one place. It doesn't collect or store your metadata, it doesn't back up to the cloud, and you can set all of your conversations to automatically disappear at whatever time interval you choose. You don't need text messages from a year ago. You don't. Those can never help you, they can only hurt you. Just let them go.

And turn off the biometric unlock on your phone. Whether that's a fingerprint or Face ID, turn it off. Turn it off. Set a passcode. If you get arrested and you have your phone on you, they can use your finger or your face to unlock it without a warrant. But if you have a passcode, you're a little bit safer. So set a passcode that's at least six digits long, longer if you can bear it. I know.
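For a sense of why the extra digits matter, here is some back-of-the-envelope arithmetic. The guess rate is an arbitrary assumption, and real phones throttle attempts and can wipe after repeated failures, so treat this as illustration, not a forensic model.

```python
# Rough brute-force arithmetic for numeric passcodes. The guess rate is
# an assumed number for illustration only; real devices rate-limit
# attempts, which is part of why a passcode buys you time.

GUESSES_PER_SECOND = 10  # assumption, not a measured figure

for digits in (4, 6, 8, 10):
    combos = 10 ** digits
    days = combos / GUESSES_PER_SECOND / 86_400
    print(f"{digits}-digit PIN: {combos:>14,} combinations, "
          f"~{days:,.1f} days to try them all")
```

Each pair of digits multiplies the search space by a hundred, which is the whole argument for "longer if you can bear it."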
Speaker 1: But when it comes to social media, you have some choices. You may look at your own threat model and say, well, I don't care if everyone can see what I've posted, and that's okay, right? We all have different goals and vulnerabilities. And if you're a very public organizer, then yeah, you need public social media. But if you've been using Facebook for twenty years, you probably weren't always very careful about what was on there. And there are privacy settings now where you can retroactively set all of your old posts to a new privacy setting. You should do that. Start there if you haven't done that. But that still leaves a lot of digital debris. If you've changed your display name to something more private in recent years, something that isn't your current legal name, old posts that other people made about you still have your old name in them. So if they tagged you ten years ago, that old name is still a link to your current profile, and you can't control the content that your friends and family posted years ago.

And on the flip side, if in the end you decide you don't care what's on your Facebook about you, when you're doing your threat modeling, consider the people close to you. Because when I'm working at this from the other side, a lot of times I'll find that, you know, the guy that I'm looking for has done a pretty good job cleaning up his own digital presence, but his wife, his mom, his sister, someone in his life has not. So if there's someone in your life who maybe is at greater risk than you are, don't be their weak spot. And if you're in a position to do so, talk to the people in your life about this. Have these conversations about what are our risks, what are our goals. Let's do a digital hygiene check together. Because you can build an impenetrable digital fortress around yourself, but if your aunt Kathy is live-streaming your baby shower, that didn't do you much good. And now that more people are talking about these kinds of concerns, you can try approaching the subject with people in your life who may not have been receptive to it a year ago. Show your mom how to set her Facebook to private. Take the time to explain to your less political siblings why they should think about the ways in which their social media use might expose someone they care about. Don't just scold them or say it's reckless that you're doing this. Talk about why.

So when it comes to social media, I'm saying delete your Facebook as a sort of shorthand for the general cleanup of the stuff that you've left online for the last twenty years. Cleaning up your online presence is the number one thing you can do right now to thwart the bizarro universe version of me who is trying to collect every piece of information about you.
Speaker 1: Because even if you're careful today, even if you're so smart about it now and you're not putting anything online that puts you at risk, you weren't always that careful. We're all guilty of it. People who've been doing this for a long time, people who know better, we're all guilty of being a little messy online. It's okay, there's no shame that you didn't know before. Don't feel silly, don't feel guilty. Just start cleaning it up today.

And so, to figure out what exactly you might have been leaving out in the open, one thing you can try is doxxing yourself. Or do it with a friend, right? Try doxxing each other. So start with a completely clean cache, delete your cookies, whatever. Open an incognito browser, start with a blank slate, and just google yourself. Google your name, your address, your phone number. Google the usernames that you currently use on various sites. But google the username you used in high school. Google your old AIM handle. Google the email address you made when you were twelve. What comes up? And is that information you want everybody to have? Probably not.
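If it helps to be systematic about it, here is a small sketch that just prints out the searches to run in that incognito window. Every identifier in it is a placeholder; swap in your own legal name, old handles, and ancient email addresses.

```python
from urllib.parse import quote_plus

# Generate the Google searches for a self-dox session. All identifiers
# below are placeholders, not real accounts.

IDENTIFIERS = [
    '"Jane Q. Example"',           # full legal name, quoted for exact match
    '"xXjaneXx2004"',              # the username you used in high school
    '"jane.example@hotmail.com"',  # the email you made when you were twelve
]

EXTRA_TERMS = ["", "address", "phone", "myspace", "forum"]

for ident in IDENTIFIERS:
    for term in EXTRA_TERMS:
        query = f"{ident} {term}".strip()
        print("https://www.google.com/search?q=" + quote_plus(query))
```

The value is in being exhaustive: the handle you forgot about is exactly the one a stranger will remember to check.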
Speaker 1: Start by deleting accounts you don't use anymore. Just wipe those bad boys right out. You don't need those. A lot of people have no idea that the ghost of their old MySpace page still exists online. I've actually used that one fairly recently to confirm details about a person's close associates and family members. They hadn't logged into MySpace since twenty ten, but your Top Eight lives forever. So delete or set to private any account that you don't use, don't need, or that just doesn't need to be public-facing. Log into every social media site, every forum, every online store where you've ever created an account, and just look at what's visible. Your online reviews may contain information about where you live. Your profile on some forum you posted on in twenty twelve probably has your birthday on it. And if you're an active Pinterest user, your Pinterest boards are probably revealing a lot more information about you than you realize: information about your family, your interests, your plans for the future. People will make Pinterest boards with names like Jaden's Second Birthday, and now I know that you have a son named Jaden whose second birthday party you were planning last July. That's a real example.

So set these things to private. Change your profile picture to something that isn't your face. Look at your username. Did you have to put some numbers at the end of that because the one you wanted was taken? Are those numbers your birthday? And vary your usernames a little bit. Unless you have some kind of professional reason for using a personal brand across every platform, don't use the same username everywhere. Keep separate areas of your life separate. Don't make it any easier than it needs to be to connect these different pieces of your digital footprint into one picture of who you are.
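To see how little effort it takes to connect a reused handle across platforms, here is a sketch of the kind of check anyone can script. The profile-URL patterns are common conventions that change over time, and plenty of sites block automated requests or return misleading status codes, so this demonstrates the principle rather than being a reliable tool.

```python
import urllib.error
import urllib.request

# Check whether the same handle appears to exist on a few sites. The
# URL patterns are assumptions based on common conventions.

SITES = {
    "github": "https://github.com/{}",
    "reddit": "https://www.reddit.com/user/{}",
}

def seems_to_exist(url_pattern: str, handle: str) -> bool:
    req = urllib.request.Request(
        url_pattern.format(handle),
        headers={"User-Agent": "Mozilla/5.0"},  # some sites reject bare clients
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        return err.code != 404  # a 404 usually means no such profile
    except urllib.error.URLError:
        return False  # network trouble; treat as unknown

handle = "some_old_handle"  # placeholder
for site, pattern in SITES.items():
    print(f"{handle} on {site}: {seems_to_exist(pattern, handle)}")
```

Two accounts sharing one distinctive username is often the first dot somebody connects, long before they need anything clever.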
Speaker 1: Because again, I'm not talking about becoming completely anonymous online. A lot of people need to exist online as the person that they are. You have a LinkedIn, you do public-facing organizing. I'm not saying you need to disappear from online. But if you have accounts that you don't want connected back to your true identity, if there are pieces of you that exist that you don't want side by side, don't connect them. So if you anonymously run a social media account for an activist group, don't use it to follow your own real account. Don't like your boyfriend's posts when you're logged into your anarchist shitposting account. If you don't want it connected to you, don't create overlap. If you post a screenshot from one social media platform onto another, a screenshot of a tweet on your Instagram, whatever, be mindful of what's in that image. Is there a thumbnail of your own profile picture in there? Does the screenshot show that you interacted with that post? Because a filled-in heart on an Instagram screenshot is something I have used as a building block for a dox. And maybe you've never posted anything identifiable on Twitter, but have you posted a link to your Twitter account on Reddit? Or are you in a big Discord where you shared one of your own posts with your friends, like, hey, look at this banger tweet, I'm going viral? And I say both of those specifically because both of those are specific mistakes that I have seen people make that were, for me, a crucial link between two accounts that connected the dots to figure out who they were.

Use two-factor authentication, use a password manager, use complex passwords, never recycle a password. Check databases like Have I Been Pwned. See what's been leaked about you. And some of that data is out of your control now. It's out there and you can't call it back. But you can change all of your passwords today. You can download a password manager and change all of your passwords today, and all of your passwords should be something different from one another. I'm going to say it again: change all your passwords. Stop using your dog's name as your password for everything. It was hard, but I did it, okay?
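Have I Been Pwned's companion Pwned Passwords service can be queried without an account using a k-anonymity scheme: only the first five characters of the password's SHA-1 hash are sent, and the matching happens locally, so the password itself never leaves your machine. Here is a minimal sketch of that check, alongside the kind of random password a password manager generates for you.

```python
import hashlib
import secrets
import string
import urllib.request

def pwned_count(password: str) -> int:
    """Return how many known breaches a password appears in.

    Uses the Pwned Passwords k-anonymity range API: only the first five
    hex characters of the SHA-1 hash are transmitted.
    """
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "digital-hygiene-sketch"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read().decode("utf-8")
    for line in body.splitlines():  # each line is HASH_SUFFIX:COUNT
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

def random_password(length: int = 20) -> str:
    """The boring thing a password manager does: long, random, unique."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(pwned_count("password123"))      # a depressingly large number
print(pwned_count(random_password()))  # almost certainly 0
```

Run it on an old favorite password and the number that comes back is usually all the motivation anyone needs to switch to a manager.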
Speaker 1: And when you're doing this digital hygiene check, you know, you're googling yourself, you're checking these breach databases, one of the things you're going to find is your address, your email address, and your phone number, and your parents' names and your parents' address. All of these pieces of what you thought were personal, private information, they are bought and sold by data brokers, and these data brokers put them online on sites that people can pay to access. Sites like PeopleFinders, TruePeopleSearch, Whitepages. There's hundreds of them now. By law, all of these sites have to have a link on them somewhere where you can ask them to delete your information. Some of them make it kind of hard, and it may take weeks for them to actually honor the request, and you may have to follow up. But theoretically, if they're operating legally, you do have the ability to manually clean up how much of your personal information comes up from these data brokers. But I'll be honest with you, it's whack-a-mole. You could spend one afternoon a week for the rest of your life making opt-out requests and following up on them and checking back to make sure it's really gone. You can do that. I used to do that. But there are also services that will do it for you for a fee. I think there may be an episode in the pipeline examining that particular ecosystem in some more detail, so I won't go into the pros and cons of different services that exist. But if that's something you're interested in paying for, do some research about it before you put your money down.
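If you do go the manual route, the follow-up is the part people drop. Here is a bare-bones sketch of tracking it in a plain CSV; the file name, column layout, and four-week interval are all arbitrary choices for illustration, not anything the brokers require.

```python
import csv
from datetime import date, timedelta

# Track data-broker opt-out requests in a CSV with columns
# broker,requested,verified and surface the ones due for a recheck.

REQUESTS_FILE = "optouts.csv"        # assumed file name and layout
FOLLOW_UP_AFTER = timedelta(weeks=4)  # arbitrary follow-up window

def due_for_followup() -> list[str]:
    due = []
    with open(REQUESTS_FILE, newline="") as f:
        for row in csv.DictReader(f):
            if row["verified"].strip():
                continue  # removal already confirmed
            requested = date.fromisoformat(row["requested"])
            if date.today() - requested >= FOLLOW_UP_AFTER:
                due.append(row["broker"])
    return due

for broker in due_for_followup():
    print(f"Follow up: check whether {broker} actually removed your listing")
```

A spreadsheet does the same job; the point is just that an opt-out request without a scheduled recheck tends to quietly come back.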
Speaker 1: But at the end of the day, I just want you to remember: you can't solve this whole problem. That might sound like a defeatist message, but I think it's healthy. I'm not saying it's hopeless. I'm saying you have to spend your energy where it counts. People ask me all the time, you know, are you worried about this or that specific threat? And the answer is, yeah, probably. Yeah, of course I'm worried. But you can't let that fear overwhelm you. You know, if I get fixated on the existence of threats that are outside of my control, I'll just freak out, and that makes me less capable of focusing on mitigating the threats that are within my control. So don't put blinders on, don't lie to yourself, you know, be realistic. But don't wear yourself out worrying about things that are so far out of your control that all you have is fear. So today, right now, take a deep breath, delete your MySpace account, and talk to your mom about setting all her old Facebook pictures to private.

It Could Happen Here is a production of Cool Zone Media. For more podcasts from Cool Zone Media, visit our website coolzonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can now find sources for It Could Happen Here listed directly in episode descriptions. Thanks for listening.