Speaker 1: Welcome to It Could Happen Here, a podcast about things falling apart and how to deal with that, and hopefully take care of yourself and your people. Today we have a returning guest, Karl Kasarda from InRange TV. Now, Karl, every time you and I have chatted on a show together, it has been about firearms, which is obviously your passion and specialty, well, one of your specialties. But today we're not talking about guns at all. I mean, maybe here and there, but today we're talking about the thing that has been your career for most of your working life, fair to say. You want to walk through your background here? Because we're going to be talking about information security and the future of the threats that are coming over the next few years of our lives. Obviously, this year in particular, there have been a bunch of stories about Russian attacks on digital infrastructure, and vice versa. And that's pretty much been on everybody's back burner since we got the internet, usually by way of questionable films with Sandra Bullock. I think that was The Net, right?

Speaker 2: The Net, yes, exactly.

Speaker 1: Yes, where they somehow hacked a car in 1998 or something.

Speaker 2: You've got to do that when you're flying through cyberspace with your VR helmet on and your gloves, right?

Speaker 1: Yeah. But you want to walk everyone through what your actual background in this industry is, first?

Speaker 2: Yeah, totally. So if anyone watches InRange, or has watched it for a long time, you'll see this reflected in some of my content, because I do deal with some of this intermittently on the channel, and it's definitely influenced how I approach my work there with social media and all that.
But way back when, I was one of those kids in the hacker space. I grew up trying to make computers and technology do what they weren't designed to do, and learning to make them do things they shouldn't have done, for my own interests or for others around me. Not in any really negative way, just a deep curiosity about how this stuff works, and being part of the early online community. We're talking pre-internet, where you had an acoustic-coupler modem and you would dial in.

Speaker 1: Like WarGames.

Speaker 2: Yeah, literally plug your handset into the thing. I was on boards like that way back when.

Speaker 1: We never should have gone past those days. Doing things wirelessly was such a mistake. I'm so pissed off that when I sit down to research, I'm not jacking into a gigantic box. Shadowrun promised me that I was going to be using one hand to shoot at the approaching corporate security guards and have the other hand on the keyboard I wear around my neck, the one I plug into the wall to hack buildings.

Speaker 2: Well, hey, maybe someday we'll have neurological implants or wetware implants, brought to us by Monsanto.

Speaker 1: And they'll just get shut off while we're in our own rooms, right? From their mouth to God's ears, Karl.

Speaker 2: Absolutely, who doesn't want that? Who doesn't want my neural tissue tied directly to a corporation? Fuck yes. But anyway, I grew up in that space, and back then it naturally turned into a career. It wasn't like now; nowadays you pretty much have to go get a bunch of certificates and a college degree to even start looking at an infosec career. But back then, if you kind of had skillz, with a z at the end, you could get a job.
And I ended up doing help desk at this one company, they noticed that that's where my interests were, and over a couple of years I ended up becoming their information security architect. That turned into a multiple-decade career, pretty much culminating in working at a Tier 1 internet backbone provider, doing subsea fiber-optic routing and networking, DDoS mitigation, and botnet control, search and destroy. So it turned into a really wide career: not only the backbone internet where I started off, but encryption, firewalls, application-layer controls, across the board for multiple corporations. It was a weird and interesting space. I don't really do that much anymore except on the side, but I've had a pretty exciting career with it.

Speaker 1: So I think probably a good place to start, just in general, because folks are always interested in this: what is your recommendation when people ask, "What should I be doing to protect myself as I force my head under the constant stream of sewer water that is social media these days?"

Speaker 2: Well, yeah, you know, the simplest thing, and everything in infosec is always controversial, just like anything. Any recommendation you make, someone's going to say, "But otherwise," or "There's a better solution," and there always is a better solution. But the realistic thing is, when you talk to the average person, the average person isn't going to sit there and hack a Linux box to have a better social media experience. That's not realistic. So the best thing anyone can do, the simplest, best thing, is to get one of the trusted password managers. There are a number of them out there. I'm not going to recommend an individual one right now, because whichever one I recommend, someone's going to go, "But there's another one." But there are a few of them out there.
Having a password manager, and having a unique, difficult, complex password for every account you log into on the internet, is the number one thing you can do as an individual to protect your interests. Because if you're logging in with the same password, "monkey," on Facebook, Twitter, and your bank account, that is a disaster waiting to happen. So the first thing you can do: a password manager, and passwords you yourself can't remember as a result. I let the password manager generate twenty-four-character-long alphanumeric crypto nonsense. You put a gun to my mouth and say, "What's your password to your bank?" I don't know. I can't give it to you. I have no idea. And that right there is the first thing any individual can do to protect themselves on the internet.

Speaker 1: That is totally sensible. I'm not great at password managers, but I never know what my passwords are and they're all different, so my life is this constant stream of needing to figure out what my password was, failing, and resetting it. But it does mean that I change passwords regularly.

Speaker 2: Right. That's what's so great about password managers: you can have passwords that no human could ever remember, and you can have unique ones per website; every website you log into can be unique. And by having them in a database that's properly encrypted with a key phrase or even dual factor, at that point you can literally just cut and paste your passwords into things, and you don't yourself know what they are. Depending on your privacy needs, you can do that locally, with local solutions, with files on your own machine. But frankly, a couple of the cloud-based solutions, as much as the cloud freaks people out, are the better option, because they'll work on your phone, they'll work on your laptop, they'll work on everything, everywhere.

Speaker 1: That makes total sense.
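For the curious, the generation step Karl describes is easy to picture. Here is a minimal sketch in Python, using only the standard library, of drawing a long random alphanumeric string from a cryptographically secure source; this is just an illustration of the idea, not how any particular password manager is implemented.

```python
import secrets
import string

# Character pool: letters and digits ("alphanumeric crypto nonsense").
# Real password managers usually let you mix in symbols as well.
ALPHABET = string.ascii_letters + string.digits

def generate_password(length: int = 24) -> str:
    """Draw a password from a cryptographically secure random source."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

if __name__ == "__main__":
    # One unique password per site, none of which you ever memorize.
    for site in ("example-bank.com", "example-social.com"):
        print(site, generate_password())
```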
Speaker 1: I think another good thing to get into while we're on this subject, we just started talking about passwords, and obviously it's important to keep those secure. One thing folks don't think about enough, especially people who are activists, who may foresee or have engaged in things that are legally questionable, is social media networking, by which I mean having social media that can be tracked across accounts: it's possible to find your other social media just by, say, having the same name on Twitter and on Instagram. Most people would be surprised at how easy it is to do that. At Bellingcat, a huge amount of the tracking, tracking Nazis, and even a ton of the work, not work I did, but my colleagues did, to dox Russian secret service agents, went like, "Oh, we found them at their boss's wedding, they're tagged in this thing on VK, and from that we were able to find their account on this other site, and from that we now have a map of everywhere they've been for the last three weeks, and we can build this social map of their entire life."

Speaker 2: Yeah. By just literally existing in modern space, you're constantly leaking some form of metadata. You are always leaking metadata, and the more of you that you allow to exist in the world, the more that's the case. So you've also got to think about what the threat is and what the risk is. The risk to an individual having a parasocial relationship with the internet, like I do as a content creator, is one thing; there's always someone who wants to delve into your private life. But that's a very different risk than a nation-state actor. Those are two different things.
And when it comes to a nation-state actor, quite honestly, unless you're real good, and I've been doing this for a long time, the individual, bluntly, is kind of fucked. As a general rule, your best security as an individual in that situation is the anonymity of the crowd.

Speaker 1: But we're also not talking about most people here. Most people who are threatened by the state in that situation are not being threatened by the federal government. But they may be attending protests and not want the Louisville police to put together that they're in an affinity group with certain people. Something you can do for that is make sure that, if you have a personal account under your name with your friends, that account isn't liking and sharing things from a political account that you have, or from the account of a group that you're a part of, or something like that. Just try to look at your digital footprint from the outside and think: is it possible to connect me, through this, to people I don't want to be publicly connected to? And the minute you've breached that connection once, it's gone forever.

Speaker 2: Right, this is forever. Yes. This is the same thing as with phones. Someone will have their regular phone, which, by the way, all these smartphones are just surveillance devices in our pockets. Let's say you go get a burner because you don't want to be connected to the device you normally use; that's one step above the regular individual level. If you ever have those two devices emanating at the same time, they're now connected, in a way that, let's say, the authorities can associate together because of triangulation: seeing a burner phone and your phone coming from the same house. You've breached all the privacy you would have had from that burner phone, for example.
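One concrete, everyday instance of the metadata leakage described above is the EXIF block embedded in photos, which can include GPS coordinates, timestamps, and device identifiers (photo location leaks come up again later with the Ukraine example). Here is a minimal sketch of stripping that metadata before posting, using the third-party Pillow library; the filenames are placeholders.

```python
# Minimal sketch: strip metadata (including GPS EXIF tags) from a photo
# before posting it. Uses the third-party Pillow library (pip install pillow).
# The filenames are placeholders.
from PIL import Image

def strip_metadata(src: str, dst: str) -> None:
    img = Image.open(src).convert("RGB")
    # Rebuild the image from raw pixel data only, so EXIF/GPS tags,
    # thumbnails, and other embedded metadata are not carried over.
    clean = Image.new("RGB", img.size)
    clean.putdata(list(img.getdata()))
    clean.save(dst)

strip_metadata("photo_original.jpg", "photo_clean.jpg")
```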
Speaker 1: Now, Karl, do you have much to say on the subject of, because I know one thing I have seen people do, people who are having conversations that they're concerned about, is put their phones in bags, in Faraday cages. And I've heard mixed things about how reliable Faraday bags and the like are for actually stopping signals. Do you have much to say on that?

Speaker 2: My experience with that is that not all bags you can just buy off the internet are made equally. So what you want to do is test it, and you can only test it to a certain degree. But the really simple test is: you put the phone in the bag and you try to dial the darn thing, or make any Wi-Fi connections to it. That's a simple test. Now, is it as good as not having the thing on you at all? Of course not; leaving it elsewhere is always the best answer. But in my opinion, a properly built Faraday box or cage or bag that you've put some testing into is a pretty reliable solution.

Speaker 1: A problem that you might encounter, or that I have heard people talk about, is, "Well, in order to have a private conversation, we drove to a specific location, left our phones off in the car, and then went for a walk." And the problem with that is that you have now both just driven to a location with those phones, and those phones are associated with each other.

Speaker 2: Right, right. So first of all, you've got to think of a world where all of this metadata is being collected at all times. These phones, and their associations in physical proximity to one another, are stored somewhere at all times. Whether or not it's going to be resourced or accessible to the powers that be when they want it, it's all there: my phone next to your phone, next to that guy's phone. Those associations all exist.
They're all talking to the same cell phone towers in the same area, giving them not only GPS coordinates but triangulation data, which, by the way, if you go way back to the hacker Kevin Mitnick, that stuff was going on back then; before they had GPS, they used triangulation data to get him. So that stuff is all still happening, and those associations occur. As for saying "I turned my phone off": how do you know it's off? With most of these modern phones, what does "off" even mean? Okay, pull the battery, maybe. But even then, I would not trust any of these devices with regard to them quote-unquote being off, especially phones with non-removable batteries. Off is more like sleep.

Speaker 1: Right. Yeah, I think one of the worst things that's happened for personal security is the end of the phone where you can remove the battery. Being unable to actually cut power to it without disassembling it is a real issue.

Speaker 2: One could argue there was a much more insidious reason they did that, or one could argue it was just a matter of design and comfort, and it's hard to say.

Speaker 1: It doesn't really matter whether it was insidious or not. The reality is kind of a "por qué no los dos" situation.

Speaker 2: Right, totally, it is. So now that we're talking about phones, here's another thing that's been near and dear, and I think you've seen some posts from me about this. Everybody really likes the convenience of things like biometrics: thumb authentication, fingerprint ID, facial identification. And here's the reality of that. We know this already, and it exists in legal space already: you can be coerced to provide biometric data against your will.
So if your phone is authenticated with a fingerprint ID or your Face ID, they can pretty much say, "You must give us your thumb to unlock this phone." Or, for that matter, frankly, they could hold the phone in front of your face, in certain circumstances even against your will, and it will unlock the device. And that is considered not a violation of your rights. Whereas, for example, if you had a long, strong password on the phone, they cannot coerce you to give that up, because that would be a violation of your rights under the Fifth Amendment, which is interesting. At the same time, one could also argue that in certain circumstances where there are a lot of cameras watching what you do, passphrases could be dangerous too, say in an airport, because all those cameras could see you plugging in your passcode. So it's a matter of if, when, and where, and what the right solution is at the right time. But I would say that if you're going to be in a place that's contentious, it is almost always better to make sure you do not allow any biometric authentication on the device.

Speaker 1: Yes. Never turn it on. Ideally you have never even turned on facial recognition on your phone at all, even if you later deactivated it. I used to be in tech journalism, right, so obviously I'm not an expert on any of this, but the worst thing, in terms of my personal comfort with devices, was when they went, "Everything's going to read faces and fingerprints now." I don't love that. But, you know, it's inevitable, right? Because it is. And I had, in the past, used a fingerprint unlock earlier in my life, and I do not have any devices that unlock that way anymore.
303 00:14:35,840 --> 00:14:37,720 Speaker 1: But you do like that it is more convenient, right, 304 00:14:37,720 --> 00:14:39,520 Speaker 1: You miss it when you need to get to your 305 00:14:39,520 --> 00:14:42,080 Speaker 1: phone quickly and you can't do it. But like, I 306 00:14:42,120 --> 00:14:43,800 Speaker 1: don't even I don't even let my phone have just 307 00:14:43,880 --> 00:14:47,320 Speaker 1: like a four phrase like password anymore. Like it's eight 308 00:14:47,400 --> 00:14:49,120 Speaker 1: characters for me. It's a little bit of a pain 309 00:14:49,160 --> 00:14:52,360 Speaker 1: in the ass, but it comes with fewer risks. And 310 00:14:52,440 --> 00:14:54,920 Speaker 1: one of the things that's challenging to every individuals they 311 00:14:54,920 --> 00:14:57,240 Speaker 1: have to look at what their threat profile is, right, 312 00:14:57,360 --> 00:15:01,240 Speaker 1: So like, for example, um, soccer mom driving her kids 313 00:15:01,240 --> 00:15:03,640 Speaker 1: to school and stuff, she might be really good, well 314 00:15:03,680 --> 00:15:06,800 Speaker 1: off with a biometric authentication on her phone, frankly, because 315 00:15:06,840 --> 00:15:08,680 Speaker 1: if she didn't use that, maybe she wouldn't even use 316 00:15:08,680 --> 00:15:10,920 Speaker 1: a proper four character pass phrase. And if she's not 317 00:15:10,960 --> 00:15:13,920 Speaker 1: concerned about being at a protest, for example, and having 318 00:15:13,960 --> 00:15:16,880 Speaker 1: some authoritarian take her phone away from her and authenticate 319 00:15:16,920 --> 00:15:19,520 Speaker 1: to it. Maybe she doesn't need to worry about that. 320 00:15:19,560 --> 00:15:21,040 Speaker 1: But for a lot of us in the world's we 321 00:15:21,080 --> 00:15:23,560 Speaker 1: live in, that's a different risk profile. Right, We've got 322 00:15:23,560 --> 00:15:26,280 Speaker 1: to think about what our risks are as individuals and 323 00:15:26,360 --> 00:15:28,560 Speaker 1: what makes sense. So if your passphrase is going to 324 00:15:28,640 --> 00:15:31,000 Speaker 1: be one, two, three, four or use a thumb print 325 00:15:31,040 --> 00:15:33,040 Speaker 1: I D. For most people, they'd be better with the 326 00:15:33,080 --> 00:15:35,800 Speaker 1: thumb print I D. But for someone like myself, no, 327 00:15:36,000 --> 00:15:39,560 Speaker 1: it's not a good idea. Yeah, and that's UM. Yeah, 328 00:15:39,720 --> 00:15:43,160 Speaker 1: I think that kind of brings us to uh, probably 329 00:15:43,200 --> 00:15:45,800 Speaker 1: the last part of this, which is um, do you 330 00:15:45,800 --> 00:15:49,440 Speaker 1: have specific advice on like VPNs UM. Obviously I recommend 331 00:15:49,480 --> 00:15:54,080 Speaker 1: everybody you signal I I justform messages in general, but 332 00:15:54,200 --> 00:15:57,280 Speaker 1: like a specially stuff that is secure. Don't if you 333 00:15:57,560 --> 00:16:00,720 Speaker 1: if you like Number one first rule of any kind 334 00:16:00,760 --> 00:16:05,680 Speaker 1: of this sort of security. Don't ever put anything on 335 00:16:05,720 --> 00:16:09,160 Speaker 1: your phone ever that's legally questionable if you can avoid it, 336 00:16:09,280 --> 00:16:12,880 Speaker 1: like conversationally, like right, do not don't send it over 337 00:16:12,960 --> 00:16:16,000 Speaker 1: a phone if it's something you would not be able 338 00:16:16,040 --> 00:16:19,680 Speaker 1: to survive. Having read D you in a courtroom. 
Speaker 1: Yeah. And I think that kind of brings us to probably the last part of this, which is: do you have specific advice on VPNs? Obviously I recommend Signal to everybody, I just prefer it for messages in general, but especially for stuff that needs to be secure. Although, number one, the first rule of any kind of this sort of security: don't ever put anything on your phone that's legally questionable if you can avoid it, even conversationally. Do not send it over a phone if it's something you would not be able to survive having read to you in a courtroom.

Speaker 2: Yeah. So, for the audience, a lot of people may not know what Signal even is. Signal is a text-messaging alternative. On your phone you've got regular texts, or if you've got an iPhone, you've got iMessage. Signal is an end-to-end encrypted solution that you install as an app. And because it's end-to-end encryption, it means that as it passes over the wire it is, in theory, not decryptable by the parties passing the data packets in the middle, so there's no man-in-the-middle decryption. For example, iMessage is encrypted, theoretically end to end, but Apple ultimately has the cryptographic keys, so, whatever they might say, there is nothing really preventing them from being the man in the middle and reading the message in transit from A to B. But if the keys are stored on your device, which is then protected with your passphrase or whatever your authentication mechanism is, and those keys are not archived or kept by some hierarchical man-in-the-middle authority, and if it's done right, which Signal does pretty well, it means that your data in transit is probably not decryptable. That's why Signal is a good solution, and it's a good one for the average person. Install the app; it works just like text messaging, but you can have a pretty good level of confidence that the data you're passing is not being decrypted or captured in transmission or along the path.

Speaker 1: So I would say: get Signal. It's your best bet. And again, as I said, you don't want to ever say anything over a phone that can get you in trouble. But also, life is life, and that's not always realistic for people in certain situations. So again, Signal is your best bet. Nothing is perfect, and again, if you're putting it on your phone, there are a number of things that could go wrong every single time you do that. But that's one of the better things that you can do.
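To make the end-to-end distinction concrete, here is a toy sketch using the Fernet primitive from the third-party "cryptography" package: the relaying party only ever handles an opaque token, and only the endpoints hold the key. This is purely an illustration of the principle, not Signal's actual protocol (Signal uses X3DH key agreement and the Double Ratchet), and it assumes the shared key was established safely between the two endpoints.

```python
# Toy illustration of the end-to-end idea: the party in the middle only
# ever sees an opaque token. NOT Signal's protocol (Signal uses X3DH plus
# the Double Ratchet); it only shows why key location matters.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # exists only on the two endpoints
sender, receiver = Fernet(key), Fernet(key)

token = sender.encrypt(b"see you at the usual spot")

# A telecom, app vendor, or Wi-Fi operator relaying the message sees this:
print(token)

# Only a holder of the key can recover the plaintext:
print(receiver.decrypt(token).decode())
```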
Speaker 2: And then, of course, we talk about VPNs. So, a VPN, and I'm just going to go with the basics here because I don't necessarily know the level of knowledge of the people listening: a VPN is a virtual private network. You connect to this virtual private network and it passes your data through an encrypted tunnel to an exit point somewhere else on the internet, in theory masking the source and origin of your requests. So, for example, let's say you were looking up something on the internet that you didn't necessarily want people to know you were looking up.

Speaker 1: Yeah, like, let's say you're researching the truth about the assassination of President John F. Kennedy by Bernard Montgomery Sanders. And you know that the NSA is looking for truth-seekers who are finding out the reality of that situation. You don't necessarily want them to know that you've become pilled.

Speaker 2: Right. So if you were to do this from your computer at home, what happens, for people who don't know how this all works, is that you would be coming from an IP address that's associated with the account you're connecting from, whether it's Verizon or Comcast or whatever. You go and search up that truth, and the NSA finds you with a keyword search for JFK and the truth. And because of that keyword search, they go to Comcast or to Verizon and say, "Hey, we are requesting you tell us who did this search." They'll send them a request, essentially a legal request for information, and then Comcast or Verizon will provide the NSA with the IP address and account of the person who did that. What a VPN does is this: you connect to the VPN service first, and the connection from your machine to the VPN service is encrypted.
Now does 408 00:19:58,280 --> 00:20:02,320 Speaker 1: the VPN service no, your IP address, yes, But when 409 00:20:02,320 --> 00:20:04,359 Speaker 1: you actually type in that information or go to the 410 00:20:04,440 --> 00:20:07,000 Speaker 1: Internet to request that data, it actually goes through the 411 00:20:07,080 --> 00:20:10,800 Speaker 1: VPNs private tunneling network and egresses from somewhere else on 412 00:20:10,840 --> 00:20:15,600 Speaker 1: the Internet, thus masking your actual I P address and 413 00:20:15,920 --> 00:20:20,800 Speaker 1: in theory your origin of source. Now that's not true, 414 00:20:21,080 --> 00:20:23,240 Speaker 1: but what that does is mean that if someone if 415 00:20:23,280 --> 00:20:25,240 Speaker 1: say the n s A I wanted to know who's 416 00:20:25,280 --> 00:20:27,880 Speaker 1: doing this truth search, they would then find an eat 417 00:20:27,960 --> 00:20:32,679 Speaker 1: IP address that actually came out of let's say Joe's 418 00:20:32,760 --> 00:20:35,119 Speaker 1: VPN service, and they would have to go to Joe's 419 00:20:35,200 --> 00:20:39,080 Speaker 1: VPN service and go, we noticed this emanated from your network? 420 00:20:39,400 --> 00:20:42,200 Speaker 1: Who did this? At that point you have to trust 421 00:20:43,040 --> 00:20:47,000 Speaker 1: Joe's VPN service to not disclose their account information about you. 422 00:20:47,920 --> 00:20:50,119 Speaker 1: So what you've done is you've changed it. We know 423 00:20:50,200 --> 00:20:53,800 Speaker 1: the telecoms will communicate with the government or whoever if 424 00:20:53,840 --> 00:20:57,080 Speaker 1: they need to, they always will have. You don't necessarily 425 00:20:57,080 --> 00:21:00,480 Speaker 1: know if Joe's VPN service will You've changed your trust 426 00:21:00,520 --> 00:21:04,399 Speaker 1: model from your telecom to your VPN service. So if 427 00:21:04,400 --> 00:21:06,280 Speaker 1: you're gonna pick a VPN, you have to do a 428 00:21:06,320 --> 00:21:08,080 Speaker 1: little bit of research to know that it's a trustworthy 429 00:21:08,080 --> 00:21:11,440 Speaker 1: resource that won't just give you up at the lightest 430 00:21:11,520 --> 00:21:15,240 Speaker 1: form of interrogation. Yeah, and none of them again, there's 431 00:21:15,280 --> 00:21:18,639 Speaker 1: nothing perfect and often like we did find out what 432 00:21:18,720 --> 00:21:21,159 Speaker 1: was it last year that one of popular VPN was 433 00:21:21,240 --> 00:21:24,480 Speaker 1: like run by the FEDS, Like it's yeah, that's not 434 00:21:24,560 --> 00:21:27,399 Speaker 1: an impossible thing. Um. I know a lot of folks, 435 00:21:27,640 --> 00:21:31,240 Speaker 1: particularly journalists, use proton Um, which is i think based 436 00:21:31,240 --> 00:21:34,480 Speaker 1: in Switzerland, and you will get given up if you 437 00:21:34,760 --> 00:21:38,199 Speaker 1: if the Swiss government is angry at you. Right, you 438 00:21:38,359 --> 00:21:42,320 Speaker 1: brought up a very good point. Uh, Services that exist 439 00:21:42,440 --> 00:21:45,760 Speaker 1: outside of the KNUS the continental US mean that they 440 00:21:45,800 --> 00:21:49,000 Speaker 1: are under different legal jurisdiction than ones that exist wholly 441 00:21:49,000 --> 00:21:52,560 Speaker 1: within the CONUS. 
So, as a result, if something from the United States government comes as a request to a Swiss company, there's a much higher chance the Swiss company will say, "We don't really care about your requests." That's worth considering. Also, think about this; it actually works in reverse, and I don't want to get too deep into it, but having worked at a Tier 1 internet backbone provider, you should know that sometimes traffic strangely gets pushed offshore and then back to the United States for analysis that would normally be, let's say, not necessarily constitutionally legal inside the United States. So there are a lot of shenanigans going on.

Speaker 1: Yeah. And again, I think Proton is generally a pretty good service; I've had no problems with it. But we should be clear here: none of these are perfect solutions. There is no perfect solution. The only perfect method of digital security is not putting things on the internet, or through the mobile networks and stuff like that. If it stays between you and someone else, that is your best bet for it not being intercepted or something. A conversation that you have in the woods, without phones anywhere near you, is the most secure kind of conversation.

Speaker 2: Let me second that on Proton. I agree it's a good service. There are others out there; we're not trying to pick one in particular or pick on anyone in particular. There are a bunch that work. Another thing you need to consider in this sort of thing is what you're dealing with. So, for example, I put up a post a while back, because there was a bunch of stuff going on in Ukraine with people posting photos that gave away their locations and had things happen. I mean, that has been happening for almost a decade in that war, as long as it's been going on. And I posted something about it.
And one of the recommendations I made there was a contentious one, but I'm going to back it up in a minute: I mentioned Tor, the onion relay. Tor was originally created as a way to deal with the quote-unquote "dark web" and to relay traffic in a way that masks its origins, very much like a VPN service. There are these onion relay nodes all over the internet, and when you connect to the onion network, your traffic bounces through three, four, five, six, seven of these nodes; you can sort of dictate what you want depending on the client you have. So let's say you connect to an onion router network node in Arizona and then you egress somewhere in France, and you've jumped through six nodes in the process. Well, one thing that's a well-known fact is that a number of these onion relay routing nodes are owned by nation-state actors, whether the United States or others. So one of the things I got taken to task for, and I want to explain this, is people saying, "Well, that's now a compromised network; that doesn't mean it's useful." Actually, it does, because what you're trying to do matters. If you're trying to mask the origin of your data source, or your upload, or your search, for a short duration of time, this will still help you. You've jumped through six nodes; they've got to relay back through six nodes to figure out the origin of the person connecting to the relay network, and that's assuming there even was a compromised node in the path. So, if you're passing data through a compromised node, does that mean the data in transit is safe? No. But is the anonymity of the origin of the poster safer for a longer duration of time? Yes. So these things get really complex, real fast.
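A toy sketch of the layered idea behind onion routing, reusing the same Fernet primitive as the earlier Signal illustration: the client wraps the message once per relay, and each relay can peel off exactly one layer, so no single node sees both who you are and what you requested. Real Tor builds circuits with its own key exchange and cell protocol; this only shows the concept.

```python
# Toy sketch of layered ("onion") encryption. Real Tor negotiates keys per
# circuit and uses its own cell format; this only shows the peeling idea.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

relays = [Fernet(Fernet.generate_key()) for _ in range(3)]  # entry, middle, exit

# The client wraps the request for the exit node first, then the middle,
# then the entry, so each relay can remove exactly one layer.
packet = b"GET /some-page HTTP/1.1"
for relay in reversed(relays):
    packet = relay.encrypt(packet)

# Each relay in path order strips its layer and forwards the remainder.
for relay in relays:
    packet = relay.decrypt(packet)

print(packet.decode())  # only the exit node ever sees the original request
```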
And this is again 511 00:25:15,880 --> 00:25:17,840 Speaker 1: one of the best things you can do, because there's 512 00:25:17,880 --> 00:25:21,240 Speaker 1: no single perfect solution, but stacking, so not just going 513 00:25:21,280 --> 00:25:23,840 Speaker 1: through tour but also tour into VPN at the same time. 514 00:25:23,920 --> 00:25:26,720 Speaker 1: And you're I think one of the better ways to 515 00:25:26,720 --> 00:25:30,639 Speaker 1: think about security is kind of the way Sebastian Younger 516 00:25:30,680 --> 00:25:34,160 Speaker 1: describes how insurgent war works, which is it's all about 517 00:25:34,160 --> 00:25:37,560 Speaker 1: creating friction for anybody trying to spy on your ship. 518 00:25:38,000 --> 00:25:40,600 Speaker 1: There's no perfect answer, but the more things you can 519 00:25:40,600 --> 00:25:42,880 Speaker 1: make be a pain in the ass, the better your 520 00:25:42,880 --> 00:25:45,359 Speaker 1: odds that you will not have an issue. Right like, 521 00:25:45,480 --> 00:25:48,880 Speaker 1: That's all you can do is make it potentially more 522 00:25:48,880 --> 00:25:53,000 Speaker 1: annoying and more difficult for for whoever might be looking 523 00:25:53,119 --> 00:25:57,040 Speaker 1: right like it. The more friction you can create, broadly speaking, 524 00:25:57,040 --> 00:26:00,000 Speaker 1: the more secure you're going to be. Absolutely Now another 525 00:26:00,040 --> 00:26:01,640 Speaker 1: thing to think about, and we're getting kind of deep 526 00:26:01,640 --> 00:26:03,199 Speaker 1: in the weeds too. This is above and on the 527 00:26:03,240 --> 00:26:06,080 Speaker 1: average person, right, the average person get a password manager 528 00:26:06,280 --> 00:26:09,439 Speaker 1: don't use your same password everywhere, and don't use biometrics 529 00:26:09,480 --> 00:26:11,800 Speaker 1: unless you're forced, like pretty much have to and move 530 00:26:11,840 --> 00:26:14,440 Speaker 1: on with your life. But once you're beyond the average person, 531 00:26:14,640 --> 00:26:16,600 Speaker 1: this is what we're talking about now. So like if 532 00:26:16,600 --> 00:26:18,200 Speaker 1: you're if you have a computer and you use it 533 00:26:18,240 --> 00:26:21,240 Speaker 1: as your normal day to day operating system, talking to 534 00:26:21,280 --> 00:26:23,680 Speaker 1: your friends, doing dot dot dot dot dot, but then 535 00:26:23,720 --> 00:26:26,600 Speaker 1: also need to do something else a little more privacy inclined, 536 00:26:27,200 --> 00:26:29,800 Speaker 1: you should not trust that device. So at that point, 537 00:26:30,080 --> 00:26:32,919 Speaker 1: your web browser may have all sorts of cookies and 538 00:26:33,040 --> 00:26:35,800 Speaker 1: metadata and storage in it that, even if you're going 539 00:26:35,840 --> 00:26:38,720 Speaker 1: through a VPN, still may be able to reveal your 540 00:26:38,760 --> 00:26:42,040 Speaker 1: identity a spell as Mac addresses and other stuff. So 541 00:26:42,080 --> 00:26:43,879 Speaker 1: if you really want to get pretty into the weeds 542 00:26:43,880 --> 00:26:46,080 Speaker 1: with this, you have to do something like use an 543 00:26:46,080 --> 00:26:50,639 Speaker 1: ephemeral operating system, install that it has no legacy data 544 00:26:50,720 --> 00:26:53,000 Speaker 1: on it. 
One example of that is a Linux-based one called Tails. You essentially use it like a live USB drive: you boot off of that only, or you use a machine dedicated to this, and you burn the OS down every time you're done. That way there's no legacy information or data that can be pulled out of your web browser, or your cookies, or your MAC address information, that can be associated with you, regardless of whether you've done everything right to mask your IP address of origin.

Speaker 1: God, that's the hot girl shit, when you're doing that kind of stuff. And again, up through most of this, I think it's been split between people going, "That's too much," and people going, "Okay, yep, this is exactly what I already am doing, or need to be doing." Probably very few people need to be concerned about that sort of thing, but, again, I worked at Bellingcat, and I had a number of colleagues who were personal enemies of the Russian state who had to do stuff like this. And here's the thing about going above and beyond: if you're a normal person, you probably don't need to be stacking a VPN, getting Signal, and all this stuff. But also, why not? There's no harm in the additional security. It is a little bit frustrating. But here's one of the things I think people don't often think about enough: you're not engaging in that kind of security stuff purely because there's a threat now, but in part because you don't know what the future is going to bring.
And one of 576 00:28:25,040 --> 00:28:27,439 Speaker 1: the things that I would point out for that is 577 00:28:27,840 --> 00:28:30,360 Speaker 1: a lot of people right now have been having for 578 00:28:30,480 --> 00:28:34,359 Speaker 1: years conversations about a thing that may soon legally be 579 00:28:34,480 --> 00:28:38,640 Speaker 1: murder on a federal level, you know, um, abortion, right? 580 00:28:39,080 --> 00:28:42,800 Speaker 1: And so it is possible that overnight an awful lot 581 00:28:42,840 --> 00:28:46,680 Speaker 1: of conversations a bunch of people have had legally will 582 00:28:46,680 --> 00:28:49,520 Speaker 1: suddenly be very illegal conversations. And then you may be 583 00:28:49,640 --> 00:28:52,840 Speaker 1: glad that you took greater care with your personal 584 00:28:52,880 --> 00:28:55,240 Speaker 1: security prior to that point. Yeah, I mean, like, so 585 00:28:55,360 --> 00:28:57,920 Speaker 1: think of the, I mean, I'm not a person that menstruates, 586 00:28:57,920 --> 00:29:00,120 Speaker 1: but a menstruation tracking app is very useful to a 587 00:29:00,120 --> 00:29:02,680 Speaker 1: lot of people who do. And those tracking apps now, 588 00:29:03,240 --> 00:29:06,680 Speaker 1: that metadata in there, at some point could be extremely 589 00:29:06,760 --> 00:29:11,840 Speaker 1: dangerous or criminalizing, incriminating excuse me, to 590 00:29:12,080 --> 00:29:15,240 Speaker 1: someone who otherwise was doing nothing more than trying to 591 00:29:15,320 --> 00:29:18,120 Speaker 1: maintain their natural health. And so that is a really 592 00:29:18,200 --> 00:29:20,640 Speaker 1: dangerous concept. So at this point, I mean, within the 593 00:29:20,720 --> 00:29:23,239 Speaker 1: United States, I hate to say this, those apps are 594 00:29:23,280 --> 00:29:26,760 Speaker 1: probably dangerous to the individual because that data could be 595 00:29:26,800 --> 00:29:30,000 Speaker 1: easily used by a government resource to, uh, to do 596 00:29:30,080 --> 00:29:33,240 Speaker 1: something bad to someone who's done nothing wrong. So I 597 00:29:33,280 --> 00:29:34,920 Speaker 1: think we should move... I mean, at this point, I 598 00:29:34,920 --> 00:29:37,000 Speaker 1: think we've covered the bases of what you could kind of 599 00:29:37,000 --> 00:29:40,240 Speaker 1: responsibly, the advice you can responsibly give someone in a podcast, 600 00:29:40,280 --> 00:29:43,080 Speaker 1: and folks should be able to... Let me 601 00:29:43,120 --> 00:29:45,560 Speaker 1: throw one thing out real quickly. So you mentioned, like, 602 00:29:45,600 --> 00:29:47,720 Speaker 1: for example, you don't necessarily have the risk 603 00:29:47,880 --> 00:29:50,520 Speaker 1: vector that requires using a VPN or Signal. But let me 604 00:29:50,520 --> 00:29:52,760 Speaker 1: say this, way back when, gosh, when I was doing 605 00:29:52,760 --> 00:29:57,120 Speaker 1: crypto work decades ago, I was... What, you mean cryptography, 606 00:29:57,160 --> 00:30:02,520 Speaker 1: and not cryptocurrency, we should specify these days... Excuse me, cryptography, encryption. Yeah, yeah, 607 00:30:02,600 --> 00:30:05,760 Speaker 1: I had the opportunity to work with Phil Zimmermann of P 608 00:30:05,920 --> 00:30:08,960 Speaker 1: GP, and actually PGP, Pretty Good Privacy, which was one 609 00:30:09,000 --> 00:30:13,040 Speaker 1: of the fundamental, uh, security projects way back when, 610 00:30:13,160 --> 00:30:15,560 Speaker 1: was actually written in response to human rights violations.
He wrote it 611 00:30:15,600 --> 00:30:19,600 Speaker 1: because people who were doing research on, like, warlords were getting 612 00:30:19,640 --> 00:30:21,760 Speaker 1: their laptops taken away, and then finding out who spoke 613 00:30:21,800 --> 00:30:24,040 Speaker 1: to them and getting people killed. So PGP was like 614 00:30:24,080 --> 00:30:27,240 Speaker 1: this human rights thing right from the beginning. And cryptography, 615 00:30:27,280 --> 00:30:29,800 Speaker 1: back when I was young and naive, I always thought 616 00:30:29,840 --> 00:30:31,720 Speaker 1: to myself, this is what we need. This is the 617 00:30:31,720 --> 00:30:35,240 Speaker 1: future: when everyone gets proper crypto, we'll blind the government, 618 00:30:35,240 --> 00:30:38,640 Speaker 1: we'll blind the corporations. We're gonna have this crypto anarchist 619 00:30:38,720 --> 00:30:41,840 Speaker 1: future where the government and corporations can't get us. And 620 00:30:41,880 --> 00:30:44,920 Speaker 1: the reality is most of that got subverted. And 621 00:30:45,000 --> 00:30:47,320 Speaker 1: the truth is cryptography is too hard for most people 622 00:30:47,360 --> 00:30:49,760 Speaker 1: to use, and as a result, we don't. But here's 623 00:30:49,760 --> 00:30:51,920 Speaker 1: what I will say. The more people that do something 624 00:30:51,960 --> 00:30:55,320 Speaker 1: simple like use Signal or use a VPN just to 625 00:30:55,320 --> 00:30:58,000 Speaker 1: browse the Internet, not because they're doing anything nefarious, 626 00:30:58,040 --> 00:31:02,080 Speaker 1: just because they're, like, privacy conscious, Yeah, because it makes 627 00:31:02,080 --> 00:31:05,000 Speaker 1: it normalized. And that means that the person that's using 628 00:31:05,000 --> 00:31:07,600 Speaker 1: it because they need to for, like, let's say, to 629 00:31:07,720 --> 00:31:10,920 Speaker 1: protect human rights doesn't stick out like a needle in 630 00:31:10,960 --> 00:31:14,080 Speaker 1: the haystack, because everybody's already doing something sane in the 631 00:31:14,080 --> 00:31:19,880 Speaker 1: first place. Normalizing proper privacy and cryptography is better for everyone. Yes, yes, 632 00:31:20,080 --> 00:31:33,280 Speaker 1: absolutely agreed. This is a nice segue, because you were 633 00:31:33,320 --> 00:31:36,760 Speaker 1: just talking about the past and how beautiful and bright 634 00:31:36,800 --> 00:31:39,600 Speaker 1: it seemed. Um, let's talk about what you see as 635 00:31:39,680 --> 00:31:44,880 Speaker 1: kind of the future of info security threats. Well, I mean, 636 00:31:44,920 --> 00:31:47,000 Speaker 1: so there's so many levels to that. First of all, 637 00:31:47,160 --> 00:31:50,000 Speaker 1: if we're talking nation state level, I personally strongly believe 638 00:31:50,040 --> 00:31:53,240 Speaker 1: that all of the big players have already compromised everyone's network.
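[Editor's note, circling back to the PGP thread before the conversation moves on: the reason PGP helped in the seized-laptop scenario described above is that a file encrypted to someone's public key is unreadable without the matching private key, no matter who ends up holding the hardware. This sketch is illustrative only; it assumes the third-party python-gnupg package, a working GnuPG install, and a recipient key already imported into the local keyring, with a placeholder fingerprint standing in for a real, verified one.]

import gnupg  # assumes: pip install python-gnupg, plus GnuPG itself installed on the system

gpg = gnupg.GPG()  # uses the default keyring (usually ~/.gnupg)

# Placeholder fingerprint; in practice this is the fingerprint of a key you have
# verified out of band and imported with gpg.import_keys(...).
RECIPIENT_FPR = "0000000000000000000000000000000000000000"

result = gpg.encrypt("interview notes, source list attached", recipients=[RECIPIENT_FPR])
if not result.ok:
    # Typical failures: key not found, or key not marked trusted (see always_trust=True).
    raise RuntimeError(result.status)

# The ASCII-armored ciphertext is what gets stored or sent; only the private key holder can read it.
print(str(result))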
639 00:31:54,080 --> 00:31:58,960 Speaker 1: Everybody's got everybody. They've got us, China's got us, we got China, 640 00:31:59,240 --> 00:32:01,440 Speaker 1: anybody right now could go in and pretty much 641 00:32:01,440 --> 00:32:04,560 Speaker 1: fuck up the grid on someone else like that. And 642 00:32:04,560 --> 00:32:08,120 Speaker 1: there's, yeah, and that's actually, at the least, that's 643 00:32:08,160 --> 00:32:11,400 Speaker 1: safer than other possibilities, like, because there is the level 644 00:32:11,400 --> 00:32:14,200 Speaker 1: of mutually assured destruction there, where it's like, yeah, man, 645 00:32:14,280 --> 00:32:16,680 Speaker 1: Russia could take down the grid, but like that wouldn't 646 00:32:16,680 --> 00:32:19,760 Speaker 1: be good for them, and vice versa. You know, Yeah, no, true. 647 00:32:19,840 --> 00:32:22,400 Speaker 1: So the reality is, though, everybody's in everybody's network, those 648 00:32:22,440 --> 00:32:25,560 Speaker 1: days are over. Um, when it comes to the individual, 649 00:32:25,640 --> 00:32:28,960 Speaker 1: and I'm gonna have... the audience, there might be 650 00:32:28,960 --> 00:32:30,800 Speaker 1: people in the audience who feel differently, and it still 651 00:32:30,800 --> 00:32:32,440 Speaker 1: doesn't mean that we don't try. So one of the 652 00:32:32,440 --> 00:32:34,440 Speaker 1: things I want to say is you're gonna hear some 653 00:32:34,560 --> 00:32:36,520 Speaker 1: skepticism here, because I've been doing this career for a 654 00:32:36,560 --> 00:32:39,640 Speaker 1: long time and I've seen things go wrong more than right, 655 00:32:39,960 --> 00:32:42,560 Speaker 1: and so in that regard, this is gonna sound kind 656 00:32:42,560 --> 00:32:44,480 Speaker 1: of cynical. But when it comes to the idea of 657 00:32:45,400 --> 00:32:49,040 Speaker 1: individual privacy, in my opinion, with the exception of when 658 00:32:49,080 --> 00:32:52,600 Speaker 1: you're taking a very active effort in something very specific 659 00:32:52,640 --> 00:32:55,000 Speaker 1: that you want to keep private because that's something you're 660 00:32:55,040 --> 00:32:59,080 Speaker 1: working on personally, the reality is individual privacy is dead 661 00:32:59,120 --> 00:33:03,880 Speaker 1: and gone, and we're just starting to smell that corpse, um. 662 00:33:04,120 --> 00:33:08,600 Speaker 1: Whether it is credit card data transactions, your cell phone history, 663 00:33:08,680 --> 00:33:11,200 Speaker 1: your phone numbers, what you've done on the internet, what 664 00:33:11,320 --> 00:33:13,840 Speaker 1: you've done on social media or not done on social media, 665 00:33:14,040 --> 00:33:16,000 Speaker 1: whether you have an account on Facebook or not, doesn't 666 00:33:16,000 --> 00:33:19,120 Speaker 1: even matter. The metadata and the trail you're leaving behind 667 00:33:19,160 --> 00:33:23,240 Speaker 1: you is all aggregated, all of it behind big data corporations, 668 00:33:23,480 --> 00:33:26,840 Speaker 1: all of it compromised, all of it searchable. Even stuff 669 00:33:26,840 --> 00:33:30,160 Speaker 1: the government has on you has been sold to large corporations.
670 00:33:30,160 --> 00:33:32,280 Speaker 1: Because I can tell you that some of the data 671 00:33:32,320 --> 00:33:34,480 Speaker 1: that they kept for, like, let's say the DMV 672 00:33:34,680 --> 00:33:37,200 Speaker 1: or MVD, they decided to sell it off 673 00:33:37,240 --> 00:33:40,080 Speaker 1: to a corporation, and they themselves access it through a 674 00:33:40,160 --> 00:33:43,360 Speaker 1: third party when doing research on you. So all of 675 00:33:43,360 --> 00:33:45,560 Speaker 1: that big data, there's a law of physics: the more 676 00:33:45,640 --> 00:33:52,320 Speaker 1: you aggregate, the more it'll get compromised. Um, geez, I'm sorry, 677 00:33:52,360 --> 00:33:54,520 Speaker 1: that's the truth. No, no, no, I mean, yeah, you're, 678 00:33:54,560 --> 00:34:01,840 Speaker 1: you're like... it's this, uh, there's this frustration, because 679 00:34:01,840 --> 00:34:07,400 Speaker 1: I can remember the days when the privacy hounds, and 680 00:34:07,400 --> 00:34:09,360 Speaker 1: I don't say that in a negative term, were like 681 00:34:09,440 --> 00:34:12,719 Speaker 1: warning everybody about, Hey, you don't want to be aggregating 682 00:34:12,719 --> 00:34:14,719 Speaker 1: all of these different social media things together. Hey, you 683 00:34:14,719 --> 00:34:16,400 Speaker 1: don't want to be using all of these services. Hey, 684 00:34:16,440 --> 00:34:19,480 Speaker 1: there's actually some like real downsides to all of what's happening. 685 00:34:19,560 --> 00:34:22,000 Speaker 1: Like, part of why things are so cheap on Amazon 686 00:34:22,160 --> 00:34:24,160 Speaker 1: is, you know, that your data there is 687 00:34:24,239 --> 00:34:27,799 Speaker 1: one of the assets that they have. And, um, those 688 00:34:27,800 --> 00:34:31,920 Speaker 1: people were absolutely right, and they lost harder than anyone 689 00:34:31,920 --> 00:34:35,400 Speaker 1: has ever lost at anything. Like... So, like when I 690 00:34:35,480 --> 00:34:38,040 Speaker 1: was back there at that company doing all that cryptography work, 691 00:34:38,440 --> 00:34:40,719 Speaker 1: we were trying to give crypto, like, to the average 692 00:34:40,840 --> 00:34:43,680 Speaker 1: general population of the Internet. I had this, like I said, 693 00:34:43,719 --> 00:34:46,160 Speaker 1: this naive view of, like, the future, that it was gonna 694 00:34:46,200 --> 00:34:47,879 Speaker 1: be this place where we're gonna have the Internet where 695 00:34:47,880 --> 00:34:50,120 Speaker 1: everyone was connected, and it was gonna be, not only 696 00:34:50,120 --> 00:34:52,680 Speaker 1: would we have personal privacy through cryptography, but we would 697 00:34:52,719 --> 00:34:55,120 Speaker 1: be able to transfer information to one another in a 698 00:34:55,160 --> 00:34:58,279 Speaker 1: way that would make the shenanigans impossible. Well, to some 699 00:34:58,360 --> 00:35:00,719 Speaker 1: degree that's been true, we've seen some of that, but 700 00:35:00,880 --> 00:35:04,760 Speaker 1: to another degree, we also have Snowden dropping the bomb, 701 00:35:04,800 --> 00:35:07,319 Speaker 1: revelations about what the government has done to the 702 00:35:07,360 --> 00:35:10,279 Speaker 1: individual and how they've broken the law with all of 703 00:35:10,320 --> 00:35:13,640 Speaker 1: our privacy and data, and what came of that? A 704 00:35:13,680 --> 00:35:18,840 Speaker 1: man in exile in Russia and pretty much fucking nothing, Yeah, right, nothing?
705 00:35:19,239 --> 00:35:22,959 Speaker 1: And, um, I was sitting at a DEF CON presentation where 706 00:35:23,040 --> 00:35:26,160 Speaker 1: General Alexander was on the screen talking about what they 707 00:35:26,160 --> 00:35:30,360 Speaker 1: weren't doing while Snowden was dropping revelations proving him to 708 00:35:30,400 --> 00:35:34,000 Speaker 1: be lying. And nothing comes of it, right, nothing really 709 00:35:34,040 --> 00:35:37,200 Speaker 1: comes of it. And one of the things that's so real: 710 00:35:37,520 --> 00:35:41,320 Speaker 1: whether it's the tribal level, your neighbors across 711 00:35:41,360 --> 00:35:45,120 Speaker 1: the street, or the internet tribe, we as a people 712 00:35:45,200 --> 00:35:48,879 Speaker 1: in the aggregate are always willing to give up our 713 00:35:49,000 --> 00:35:52,160 Speaker 1: rights to something bigger for convenience. And we've done that, 714 00:35:52,200 --> 00:35:56,120 Speaker 1: and it's called Facebook and Twitter and social media, and 715 00:35:56,200 --> 00:35:58,680 Speaker 1: in the process, what was going to be an amazing 716 00:35:58,760 --> 00:36:05,359 Speaker 1: resource has become the trap. Uh, it's such a, it's, 717 00:36:05,360 --> 00:36:08,279 Speaker 1: it's... because, you know, you know, Garrison, my 718 00:36:08,440 --> 00:36:11,720 Speaker 1: friend who is much younger than me, um, has grown 719 00:36:11,800 --> 00:36:15,919 Speaker 1: up with the Internet being what it is now, 720 00:36:16,120 --> 00:36:18,840 Speaker 1: right, like this kind of, like, nightmare trap, you 721 00:36:18,880 --> 00:36:21,240 Speaker 1: know, that's sucking us all in, this like giant 722 00:36:21,360 --> 00:36:24,800 Speaker 1: squid that has us in its tentacles. Um. And 723 00:36:24,840 --> 00:36:27,560 Speaker 1: I get, I sometimes like dissociate talking with them about 724 00:36:27,600 --> 00:36:30,920 Speaker 1: certain Internet things, because in my heart it's still the 725 00:36:30,920 --> 00:36:34,759 Speaker 1: promised land. Yeah, I wish... I guess... I 726 00:36:35,000 --> 00:36:36,560 Speaker 1: wish I felt that way. It doesn't feel that 727 00:36:36,600 --> 00:36:38,319 Speaker 1: way to me anymore, to be honest. I mean, it's 728 00:36:38,400 --> 00:36:40,920 Speaker 1: not, right? Like, and I mean that in, 729 00:36:41,000 --> 00:36:44,080 Speaker 1: like, sort of... I have this, I don't know. I've 730 00:36:44,080 --> 00:36:46,120 Speaker 1: never entirely been able to, like, let go of the 731 00:36:46,200 --> 00:36:47,960 Speaker 1: vision of, like, oh, it could have been... There's so 732 00:36:48,000 --> 00:36:50,239 Speaker 1: many things that could have been. Well, it's like, you know, 733 00:36:50,280 --> 00:36:53,560 Speaker 1: it's like all technology, anything can be weaponized, right, right, 734 00:36:53,640 --> 00:36:55,879 Speaker 1: Like an AR-15 can be used for good 735 00:36:56,200 --> 00:36:58,319 Speaker 1: or for evil. A knife can be used to make 736 00:36:58,360 --> 00:37:00,719 Speaker 1: a beautiful meal or to commit a murder. And the 737 00:37:00,760 --> 00:37:03,359 Speaker 1: Internet is technology and it has been weaponized. It's been 738 00:37:03,360 --> 00:37:06,000 Speaker 1: weaponized against us.
But at the same time, if we 739 00:37:06,080 --> 00:37:08,239 Speaker 1: just turn a blind eye to it and then not 740 00:37:08,360 --> 00:37:10,960 Speaker 1: learn how to use this technology to our advantage, we're 741 00:37:11,000 --> 00:37:13,480 Speaker 1: allowing them to do that unabated. And that's where, like, 742 00:37:13,480 --> 00:37:15,400 Speaker 1: the kind of hacker mindset comes from, which is like, 743 00:37:15,520 --> 00:37:17,600 Speaker 1: how do I make this thing do what I want it 744 00:37:17,640 --> 00:37:20,120 Speaker 1: to do for me while not letting someone else do 745 00:37:20,160 --> 00:37:23,000 Speaker 1: it for them. And unless we take control of the 746 00:37:23,040 --> 00:37:26,960 Speaker 1: technology for ourselves, like I said earlier, normalizing using Signal 747 00:37:27,280 --> 00:37:31,000 Speaker 1: and even basic VPNs and cryptography, then we're just giving 748 00:37:31,000 --> 00:37:33,239 Speaker 1: it up. We're not even making it a challenge. We're 749 00:37:33,280 --> 00:37:36,000 Speaker 1: just like, here, you go have it. And, uh, that's 750 00:37:36,000 --> 00:37:38,120 Speaker 1: something that I think is more important as a community. 751 00:37:38,160 --> 00:37:41,120 Speaker 1: Maybe as people grow up on the Internet, versus seeing 752 00:37:41,120 --> 00:37:45,160 Speaker 1: it become something, like I saw it become something, maybe either, 753 00:37:45,239 --> 00:37:48,040 Speaker 1: A, they'll just accept, which I hope isn't the case, 754 00:37:48,080 --> 00:37:50,600 Speaker 1: that the reality is privacy is dead, or maybe they'll 755 00:37:50,640 --> 00:37:54,040 Speaker 1: approach the Internet differently than, say, someone my age did. 756 00:37:54,520 --> 00:37:56,680 Speaker 1: Where, frankly, we kind of messed up and we didn't 757 00:37:56,719 --> 00:38:01,440 Speaker 1: realize that primrose path was actually a trap, and that's, 758 00:38:01,640 --> 00:38:04,239 Speaker 1: like, that was a mistake, and maybe we can kind 759 00:38:04,239 --> 00:38:07,359 Speaker 1: of, like, evolve beyond that. But like you're asking, where 760 00:38:07,400 --> 00:38:10,120 Speaker 1: is infosec going now? I don't have good 761 00:38:10,160 --> 00:38:12,040 Speaker 1: news for that. Like when I first started working in 762 00:38:12,040 --> 00:38:14,319 Speaker 1: the career, it really felt like a great thing. We 763 00:38:14,320 --> 00:38:17,080 Speaker 1: were doing important stuff. We were doing the DDoS mitigation. 764 00:38:17,160 --> 00:38:19,680 Speaker 1: We were going into hospitals and making sure that insulin 765 00:38:19,760 --> 00:38:23,600 Speaker 1: pumps weren't compromised as a DDoS host. Believe it or not, 766 00:38:23,960 --> 00:38:27,640 Speaker 1: hospitals are infosec nightmares. And we were doing stuff that 767 00:38:27,680 --> 00:38:30,560 Speaker 1: felt good. And then later in the career I realized, 768 00:38:30,920 --> 00:38:33,280 Speaker 1: wait a minute, I'm not doing anything to secure anybody's 769 00:38:33,320 --> 00:38:36,440 Speaker 1: personal information or make the Internet safer. I was just 770 00:38:36,520 --> 00:38:40,239 Speaker 1: protecting some corporate coffers. And the reality was that, for the 771 00:38:40,280 --> 00:38:43,680 Speaker 1: private information that we were supposedly protecting, the debate would 772 00:38:43,719 --> 00:38:47,360 Speaker 1: turn into cost, which was, what's more expensive, losing the 773 00:38:47,440 --> 00:38:51,040 Speaker 1: data or the lawsuit for losing the data?
Literally, those 774 00:38:51,080 --> 00:38:54,000 Speaker 1: were the conversations in corporations, and those are the conversations 775 00:38:54,040 --> 00:38:56,880 Speaker 1: that corporations have now about each and every one of 776 00:38:56,880 --> 00:39:02,080 Speaker 1: us and our personal information. Now, when you think about... 777 00:39:02,800 --> 00:39:06,360 Speaker 1: because, so, obviously I'm in a different, I was 778 00:39:06,400 --> 00:39:08,279 Speaker 1: in a different field. But when I was doing a 779 00:39:08,280 --> 00:39:10,480 Speaker 1: lot of the research on terrorism that I was doing, 780 00:39:10,920 --> 00:39:12,920 Speaker 1: I had these things that were like, sort of, 781 00:39:13,680 --> 00:39:15,719 Speaker 1: this kind of attack is going to happen, and something 782 00:39:15,719 --> 00:39:18,319 Speaker 1: I feel that very much about, like, is drones. There's going 783 00:39:18,360 --> 00:39:21,440 Speaker 1: to be like a mass killing of civilians, not in 784 00:39:21,480 --> 00:39:24,640 Speaker 1: a war zone, by a civilian weaponized drone at some 785 00:39:24,719 --> 00:39:26,799 Speaker 1: point in the not too distant future. It's going to happen. 786 00:39:26,800 --> 00:39:31,120 Speaker 1: It's going to be done. It's absolutely an inevitability. Um, 787 00:39:31,160 --> 00:39:32,920 Speaker 1: that kind of stuff. Do you, what are you, when 788 00:39:33,000 --> 00:39:35,920 Speaker 1: you think about kind of the digital equivalents of that, 789 00:39:35,960 --> 00:39:40,600 Speaker 1: like, what are you looking towards? Well, I agree with 790 00:39:40,640 --> 00:39:42,720 Speaker 1: you about the drone. Like you can see, God, yes, 791 00:39:42,960 --> 00:39:44,680 Speaker 1: you plot the, you plot the dots and you know 792 00:39:44,719 --> 00:39:46,600 Speaker 1: what's going to occur, right? It's not, it's not 793 00:39:46,680 --> 00:39:49,920 Speaker 1: possible to avoid. We unleashed that out of the cage 794 00:39:49,920 --> 00:39:53,040 Speaker 1: and it's going to happen. Um. Quite honestly, I think 795 00:39:53,080 --> 00:39:57,040 Speaker 1: we're seeing it already. We're seeing, we're seeing the level 796 00:39:57,080 --> 00:40:00,600 Speaker 1: of privacy invasion that I don't think people even know 797 00:40:00,800 --> 00:40:03,160 Speaker 1: has happened. Like, I know some of us realize it, 798 00:40:03,239 --> 00:40:05,319 Speaker 1: we talk about it, we rant about it, but like, 799 00:40:05,880 --> 00:40:09,080 Speaker 1: I don't think people realize the level of the incursion 800 00:40:09,120 --> 00:40:12,719 Speaker 1: that has occurred, to the point where all of this 801 00:40:12,840 --> 00:40:15,560 Speaker 1: data is aggregated to the point they know what toilet paper 802 00:40:15,640 --> 00:40:18,480 Speaker 1: you prefer to buy.
Like, I'm talking, like, people like 803 00:40:18,520 --> 00:40:22,640 Speaker 1: Facebook knowing that, um, or the size of the corporate 804 00:40:22,680 --> 00:40:25,600 Speaker 1: oligarchy that controls the Internet, whether it's the small list, like, 805 00:40:25,880 --> 00:40:30,480 Speaker 1: Alphabet Corp, Facebook, Apple, Microsoft's becoming a smaller player weirdly, 806 00:40:30,600 --> 00:40:34,080 Speaker 1: but when you think about those big names, they kind 807 00:40:34,120 --> 00:40:38,360 Speaker 1: of, like, control everything and every piece of data about 808 00:40:38,400 --> 00:40:41,960 Speaker 1: you and everything you move and say. That, I think, 809 00:40:42,000 --> 00:40:43,640 Speaker 1: I think, what's the end of that? I don't think 810 00:40:43,680 --> 00:40:45,799 Speaker 1: we got to the end game of that, but I 811 00:40:45,880 --> 00:40:48,480 Speaker 1: don't know how we roll it back. And that's the thing. 812 00:40:48,920 --> 00:40:52,439 Speaker 1: So what's the prediction? My prediction is it's gonna get 813 00:40:52,440 --> 00:40:54,960 Speaker 1: worse, and we're gonna get to the point where there 814 00:40:55,080 --> 00:40:59,480 Speaker 1: isn't room to move without that surveillance tracking you. And 815 00:40:59,560 --> 00:41:01,880 Speaker 1: like, so for example, you think of things like sci 816 00:41:01,920 --> 00:41:04,560 Speaker 1: fi, Minority Report, you walk to the mall and there's 817 00:41:04,640 --> 00:41:08,560 Speaker 1: facial ID happening everywhere you go, with targeted advertising at 818 00:41:08,600 --> 00:41:12,640 Speaker 1: the mall. Oh, that's coming, I guarantee that's coming, and 819 00:41:13,200 --> 00:41:16,680 Speaker 1: all of that's happening already, and that facial recognition stuff 820 00:41:16,680 --> 00:41:20,040 Speaker 1: that's going on is happening currently, now, we're just not 821 00:41:20,160 --> 00:41:23,640 Speaker 1: that aware of it happening. The cop cars driving down 822 00:41:23,640 --> 00:41:27,319 Speaker 1: the road, and every license plate is being measured with 823 00:41:27,360 --> 00:41:31,160 Speaker 1: the cameras, being OCR'd, optical character recognition, and that's 824 00:41:31,160 --> 00:41:34,120 Speaker 1: coming back, and they're tracking every car they drive by 825 00:41:34,120 --> 00:41:36,760 Speaker 1: on the highway, even though there's not a GPS unit 826 00:41:36,840 --> 00:41:43,000 Speaker 1: on your car. The ability to not be tracked will 827 00:41:43,080 --> 00:41:48,920 Speaker 1: soon be impossible. How's that? Yeah. I mean, allegedly, when 828 00:41:48,960 --> 00:41:53,400 Speaker 1: I was younger, there were, like, certain stupid petty crimes 829 00:41:53,440 --> 00:41:55,920 Speaker 1: I would commit just because, like, people will not be 830 00:41:56,000 --> 00:41:58,000 Speaker 1: able to do this in the future, and I have 831 00:41:58,040 --> 00:42:00,919 Speaker 1: a moral responsibility to steal the bulbs from in front 832 00:42:00,920 --> 00:42:02,680 Speaker 1: of this bar and throw them in my threads. Like, 833 00:42:02,760 --> 00:42:04,839 Speaker 1: one day that will be a thing that people 834 00:42:04,960 --> 00:42:07,160 Speaker 1: can't do without getting caught, and so, like, I just, 835 00:42:07,239 --> 00:42:10,800 Speaker 1: I had to.
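[Editor's note on the license plate readers mentioned above: the OCR step Carl is describing is essentially commodity software at this point. The sketch below is a rough illustration of just that step, not of any real ALPR product; it assumes the Pillow and pytesseract packages, a Tesseract install on the system, and a hypothetical image file that has already been cropped down to the plate. Production systems add plate detection, tracking, and database lookups on top of this.]

from PIL import Image   # assumes: pip install pillow pytesseract
import pytesseract      # also assumes the Tesseract OCR binary is installed and on PATH

# Hypothetical input: a photo already cropped to just the plate region.
plate = Image.open("plate_crop.jpg").convert("L")  # grayscale generally helps Tesseract

# Treat the crop as a single line of text and restrict the character set to plate characters.
text = pytesseract.image_to_string(
    plate,
    config="--psm 7 -c tessedit_char_whitelist=ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789",
)
print(text.strip())  # e.g. "ABC1234" on a clean, well-lit crop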
You know, there are like some bright spots, 836 00:42:10,840 --> 00:42:13,200 Speaker 1: because I think you're absolutely right, there's no, on, like, 837 00:42:13,320 --> 00:42:16,520 Speaker 1: a broader scale, there's no turning back the clock for 838 00:42:16,560 --> 00:42:19,000 Speaker 1: stuff like facial recognition and how fucked up it's going 839 00:42:19,040 --> 00:42:21,080 Speaker 1: to get. There are states, like where I live in Oregon, 840 00:42:21,080 --> 00:42:23,160 Speaker 1: where, like, they have passed laws that are just like, 841 00:42:23,400 --> 00:42:26,240 Speaker 1: public facial recognition is not a thing that is legal 842 00:42:26,280 --> 00:42:30,880 Speaker 1: in this state. Um. And I definitely support more attempts 843 00:42:30,960 --> 00:42:33,799 Speaker 1: like that, because, again, anything you can do to stymie 844 00:42:33,920 --> 00:42:36,080 Speaker 1: them, to reduce the spread of the grid, to 845 00:42:36,520 --> 00:42:39,280 Speaker 1: reduce the profitability of these things, even though it's, again, 846 00:42:39,719 --> 00:42:43,680 Speaker 1: overall a doomed cause. Right. Um, yeah, I don't know. 847 00:42:43,920 --> 00:42:46,319 Speaker 1: I mean, obviously I think that that's a good law, 848 00:42:46,360 --> 00:42:48,719 Speaker 1: but I don't know that laws stop corporations when the 849 00:42:48,800 --> 00:42:53,719 Speaker 1: corporations have more power than the law. Yes, of course. Um. 850 00:42:53,719 --> 00:42:56,239 Speaker 1: And it's like, I mean, obviously you can 851 00:42:56,400 --> 00:42:58,959 Speaker 1: ban it for police to use and stuff, which does 852 00:42:59,000 --> 00:43:02,040 Speaker 1: something, to the extent that, you know, they follow the law, 853 00:43:02,160 --> 00:43:04,440 Speaker 1: but, um, none of this is... I don't know. Like, 854 00:43:04,719 --> 00:43:07,680 Speaker 1: that's one of the things that makes me most 855 00:43:08,560 --> 00:43:13,200 Speaker 1: depressed about the future, is the thought that, like, the... 856 00:43:13,680 --> 00:43:16,560 Speaker 1: this is not like a major issue, I guess, 857 00:43:16,560 --> 00:43:18,560 Speaker 1: but, like, the space for kids to just, like, fuck around 858 00:43:19,640 --> 00:43:21,920 Speaker 1: and do dumb shit when they're nineteen is going to 859 00:43:21,960 --> 00:43:23,839 Speaker 1: get so much smaller. I mean, I would say, I mean, 860 00:43:23,840 --> 00:43:26,560 Speaker 1: I think the thing is, like, as a natural human being, 861 00:43:26,560 --> 00:43:29,359 Speaker 1: whether you're doing anything wrong, even if you're not doing 862 00:43:29,400 --> 00:43:32,319 Speaker 1: anything wrong, the need to feel like you have a 863 00:43:32,360 --> 00:43:36,080 Speaker 1: private space, that's your private community space, I'm not 864 00:43:36,120 --> 00:43:37,839 Speaker 1: even talking about wrong or right here, we're just talking 865 00:43:37,880 --> 00:43:41,080 Speaker 1: about just that feeling that, at this moment, this is 866 00:43:41,120 --> 00:43:44,880 Speaker 1: my space where I'm not being watched, is a natural, 867 00:43:45,040 --> 00:43:51,960 Speaker 1: healthy need of the human orgasm, or organism. Um, interesting. 868 00:43:52,080 --> 00:43:55,319 Speaker 1: Uh, yeah, but no, it's a human need. 869 00:43:55,680 --> 00:43:59,160 Speaker 1: And I think we're gonna find those spaces become smaller 870 00:43:59,200 --> 00:44:02,000 Speaker 1: and smaller.
And I think when you said, what's your prediction, 871 00:44:02,040 --> 00:44:03,520 Speaker 1: I hate to say it, but I think the prediction 872 00:44:03,600 --> 00:44:06,840 Speaker 1: is it will become impossible to not be tracked. Now, 873 00:44:07,400 --> 00:44:09,279 Speaker 1: the bright side of that, the bright side of that, 874 00:44:09,360 --> 00:44:12,600 Speaker 1: maybe, maybe there's a bright side: maybe at some point, 875 00:44:13,360 --> 00:44:16,279 Speaker 1: when that's the reality, it could somehow also affect the 876 00:44:16,320 --> 00:44:20,240 Speaker 1: people that are powerful and the people that are small, 877 00:44:20,640 --> 00:44:23,720 Speaker 1: and we all realize that humans are humans, and therefore 878 00:44:24,280 --> 00:44:27,360 Speaker 1: the failings that sometimes we have as all human beings 879 00:44:27,560 --> 00:44:29,839 Speaker 1: we just kind of acknowledge and be like, oh, yeah, 880 00:44:29,840 --> 00:44:32,759 Speaker 1: of course, that's just what people do. Like, maybe we 881 00:44:32,880 --> 00:44:35,359 Speaker 1: just realize people are people. But the idea that there's 882 00:44:35,400 --> 00:44:37,600 Speaker 1: never going to be a space to not get tracked, 883 00:44:37,960 --> 00:44:41,200 Speaker 1: I don't know, to me, I find darkly disturbing. It 884 00:44:41,360 --> 00:44:43,920 Speaker 1: is disturbing. I do think, kind of to pivot 885 00:44:43,920 --> 00:44:45,600 Speaker 1: off of what you were saying, the other aspect of 886 00:44:45,600 --> 00:44:49,440 Speaker 1: that that is more positive is that all of this stuff, 887 00:44:49,960 --> 00:44:53,799 Speaker 1: all of this surveillance shit, um, or at least not all, 888 00:44:53,800 --> 00:44:57,279 Speaker 1: but quite a bit of it, is, you know, in 889 00:44:57,320 --> 00:44:59,439 Speaker 1: a way, it's like a knife fight. There's no way 890 00:44:59,480 --> 00:45:01,719 Speaker 1: that both parties don't get cut, and, you know, the 891 00:45:01,760 --> 00:45:04,520 Speaker 1: ones wielding the knife might get cut less, but they're 892 00:45:04,520 --> 00:45:06,560 Speaker 1: still going to get cut. And part of what that 893 00:45:06,640 --> 00:45:10,560 Speaker 1: means in this situation is that the prevalence of all 894 00:45:10,560 --> 00:45:13,840 Speaker 1: of these different ways to surveil and track also allows 895 00:45:14,680 --> 00:45:18,080 Speaker 1: us to track, in the same way that, like, police, 896 00:45:18,360 --> 00:45:21,319 Speaker 1: law enforcement, watches people through their phones, but also a 897 00:45:21,320 --> 00:45:23,120 Speaker 1: hell of a lot of cops are getting filmed doing 898 00:45:23,160 --> 00:45:29,120 Speaker 1: fucked up shit now, right? Now, again, the balance of 899 00:45:29,160 --> 00:45:30,960 Speaker 1: the cuts I don't think is going to work 900 00:45:31,000 --> 00:45:32,640 Speaker 1: out in our favor, but it's not going to be 901 00:45:32,760 --> 00:45:35,080 Speaker 1: nothing on them either.
And, and you're right, I 902 00:45:35,080 --> 00:45:37,600 Speaker 1: think there are some things that we will 903 00:45:37,840 --> 00:45:40,360 Speaker 1: learn in the future about the people in power in 904 00:45:40,400 --> 00:45:42,600 Speaker 1: the world that it wouldn't have been possible for 905 00:45:42,680 --> 00:45:44,319 Speaker 1: us to learn in the past, or may not be 906 00:45:44,360 --> 00:45:48,160 Speaker 1: possible even right now. And that could be, and if 907 00:45:48,160 --> 00:45:50,040 Speaker 1: we learn that about people in power, then they can't 908 00:45:50,040 --> 00:45:53,200 Speaker 1: weaponize it as much against the people that aren't in power. Right. Yeah, yeah, 909 00:45:53,280 --> 00:45:55,600 Speaker 1: you know, one thing that I'm, because I'm thinking a 910 00:45:55,640 --> 00:45:57,839 Speaker 1: lot about the fact that a bunch of folks in 911 00:45:57,880 --> 00:46:03,000 Speaker 1: the reproductive healthcare industry have pointed out that right wingers 912 00:46:03,000 --> 00:46:07,640 Speaker 1: have started using drones, uh, to follow people home from, 913 00:46:07,640 --> 00:46:09,560 Speaker 1: like, Planned Parenthoods, and followed them to their cars 914 00:46:09,560 --> 00:46:11,680 Speaker 1: to build databases of the people who are going to 915 00:46:11,920 --> 00:46:16,080 Speaker 1: places to potentially, like, do that kind of reproductive healthcare 916 00:46:16,160 --> 00:46:19,560 Speaker 1: that these folks don't think should exist. Um. The other 917 00:46:19,600 --> 00:46:22,839 Speaker 1: side of it, though, is that, um, it is also 918 00:46:23,440 --> 00:46:27,440 Speaker 1: possible to surveil them, um, and it will be possible 919 00:46:27,480 --> 00:46:30,719 Speaker 1: to track the people doing that sort of thing, and 920 00:46:30,719 --> 00:46:32,520 Speaker 1: it will be possible to do that in terms of, 921 00:46:32,520 --> 00:46:34,560 Speaker 1: like, legal accountability, and it will be possible to do 922 00:46:34,600 --> 00:46:39,520 Speaker 1: that for the people who embrace, uh, questionably legal tactics 923 00:46:39,719 --> 00:46:44,200 Speaker 1: for frustrating those efforts, um, or illegal tactics for 924 00:46:44,200 --> 00:46:47,680 Speaker 1: frustrating those efforts. They have access to the same technology. 925 00:46:47,800 --> 00:46:50,520 Speaker 1: Um. And again, it is a knife that 926 00:46:50,800 --> 00:46:54,759 Speaker 1: will cut everybody. Um. And I guess that's better than 927 00:46:54,800 --> 00:46:58,000 Speaker 1: just one person getting cut in this situation. That's 928 00:46:58,040 --> 00:46:59,759 Speaker 1: the concern I have, right. I agree with that, like, 929 00:46:59,800 --> 00:47:02,280 Speaker 1: that technology, it's a weapon, and it's weaponized 930 00:47:02,280 --> 00:47:03,920 Speaker 1: in all directions, depending on how you use it, for 931 00:47:03,920 --> 00:47:06,160 Speaker 1: good or bad. And so this is the same place 932 00:47:06,200 --> 00:47:08,320 Speaker 1: I come to when it comes to the gun control argument. 933 00:47:08,360 --> 00:47:14,640 Speaker 1: I mean, we can do, no, no, no, the same problem, right, 934 00:47:14,880 --> 00:47:19,279 Speaker 1: because if we allow only one side to have all 935 00:47:19,320 --> 00:47:21,880 Speaker 1: of the control and power and understanding of the technology, 936 00:47:22,080 --> 00:47:24,480 Speaker 1: then we ourselves are at a huge deficit.
We 937 00:47:24,560 --> 00:47:27,960 Speaker 1: cannot defend ourselves or fight back. So when it comes 938 00:47:28,040 --> 00:47:30,640 Speaker 1: to this kind of data and technology, knowing the basic 939 00:47:30,640 --> 00:47:33,360 Speaker 1: fundamentals of what you can do to protect yourself, understanding 940 00:47:33,400 --> 00:47:36,879 Speaker 1: the reality of what the surveillance state or corporation is, 941 00:47:37,160 --> 00:47:39,160 Speaker 1: and then doing your best to not make it easy 942 00:47:39,239 --> 00:47:41,640 Speaker 1: for them is at least one step forward. But if 943 00:47:41,680 --> 00:47:44,719 Speaker 1: we don't own this technology, if we don't own the tech, 944 00:47:45,239 --> 00:47:48,000 Speaker 1: someone else will, and they will use it against us. 945 00:47:48,000 --> 00:47:50,680 Speaker 1: It's as simple as that. And, like, there's super simple 946 00:47:50,719 --> 00:47:52,120 Speaker 1: stuff, like, I was gonna bring this up, like, 947 00:47:52,680 --> 00:47:54,440 Speaker 1: you can't see video because it's a podcast, but, like, 948 00:47:54,480 --> 00:47:57,960 Speaker 1: there's these cool glasses out there, they're called Reflectacles, that 949 00:47:58,040 --> 00:48:01,319 Speaker 1: I'm showing you, Robert, and they look like regular sunglasses, but 950 00:48:01,320 --> 00:48:04,880 Speaker 1: when you put them on, they, they reflect IR 951 00:48:05,160 --> 00:48:08,879 Speaker 1: light and actually mess with cameras in a way 952 00:48:08,920 --> 00:48:11,279 Speaker 1: that it turns your face into, like, a ball of light. 953 00:48:11,520 --> 00:48:14,160 Speaker 1: So you can wear these, they're called Reflectacles, 954 00:48:14,200 --> 00:48:15,960 Speaker 1: you can wear them and just walk around the mall, 955 00:48:16,360 --> 00:48:18,480 Speaker 1: and all the cameras get blown out by, by 956 00:48:18,480 --> 00:48:21,960 Speaker 1: your glasses. Like, doing that just because you can, it's 957 00:48:22,040 --> 00:48:24,919 Speaker 1: kind of fun. That's the hot shit. That's the shit 958 00:48:25,040 --> 00:48:28,200 Speaker 1: I was promised. That at least does exist. It's 959 00:48:28,239 --> 00:48:31,160 Speaker 1: not everything I had hoped it would be in terms 960 00:48:31,160 --> 00:48:33,080 Speaker 1: of its ability, but it is, like, that kind of 961 00:48:33,080 --> 00:48:35,200 Speaker 1: stuff rules and I will be picking up a pair 962 00:48:35,239 --> 00:48:37,800 Speaker 1: of those. Um, well, we should probably close out. I 963 00:48:37,840 --> 00:48:40,080 Speaker 1: did want to note, because I mentioned this, um, I 964 00:48:40,120 --> 00:48:41,720 Speaker 1: got something a little wrong when I was talking about 965 00:48:41,719 --> 00:48:45,160 Speaker 1: the facial recognition ban. Um. It is an ordinance 966 00:48:45,160 --> 00:48:48,680 Speaker 1: in the City of Portland itself. Um, it's the first 967 00:48:48,760 --> 00:48:51,120 Speaker 1: city that has done this, and it prohibits the use 968 00:48:51,239 --> 00:48:55,400 Speaker 1: of public facial recognition technology by all private businesses in 969 00:48:55,440 --> 00:48:58,680 Speaker 1: the city. Um. So that is the scope of the 970 00:48:58,719 --> 00:49:00,920 Speaker 1: ban that exists in Portland. I recommend 971 00:49:00,920 --> 00:49:02,560 Speaker 1: looking it up. It is the kind of thing that 972 00:49:02,600 --> 00:49:06,480 Speaker 1: I would support everyone pushing forward in their city. Um.
973 00:49:06,560 --> 00:49:09,600 Speaker 1: Because again, the more holes you can make in this thing, 974 00:49:09,760 --> 00:49:11,799 Speaker 1: the better. Yeah. I don't want to put that down. 975 00:49:11,800 --> 00:49:13,520 Speaker 1: That's a good thing. But the challenge of this is, 976 00:49:13,560 --> 00:49:15,680 Speaker 1: just like I mentioned earlier, the data moves out of 977 00:49:15,719 --> 00:49:18,759 Speaker 1: those confines the minute photos, like, I 978 00:49:18,800 --> 00:49:20,920 Speaker 1: take my iPhone and scan the crowd and then put 979 00:49:20,960 --> 00:49:24,320 Speaker 1: that picture up on the internet, not under their jurisdiction, 980 00:49:24,360 --> 00:49:27,640 Speaker 1: and all that facial recognition happens on every face in 981 00:49:27,719 --> 00:49:31,440 Speaker 1: that. Yep. And that is, again... well, we'll do another 982 00:49:31,480 --> 00:49:33,480 Speaker 1: episode at some point about things that you can do 983 00:49:33,560 --> 00:49:36,399 Speaker 1: to just, like... yeah, that's a whole different bag 984 00:49:36,440 --> 00:49:38,880 Speaker 1: of tricks. Um. But this has been really useful and 985 00:49:38,920 --> 00:49:41,160 Speaker 1: really valuable. Carl, do you want to plug anything before 986 00:49:41,160 --> 00:49:43,839 Speaker 1: we roll out here? Not much, just my normal thing: 987 00:49:43,880 --> 00:49:45,520 Speaker 1: if you're interested in this kind of content, but with 988 00:49:45,560 --> 00:49:47,680 Speaker 1: a more firearms oriented thing, you can find me at 989 00:49:47,680 --> 00:49:50,800 Speaker 1: in range dot tv. But you'll also find some information 990 00:49:50,840 --> 00:49:53,680 Speaker 1: security stuff there as well. I cover that intermittently when 991 00:49:53,719 --> 00:49:56,880 Speaker 1: it applies to both topics. So if you, 992 00:49:57,480 --> 00:49:59,799 Speaker 1: even if you disagree but appreciate my approach to this, 993 00:50:00,320 --> 00:50:03,920 Speaker 1: come check me out. I appreciate it. Awesome. Check out Carl, 994 00:50:04,040 --> 00:50:07,520 Speaker 1: check out InRange TV, and continue to listen to 995 00:50:07,640 --> 00:50:13,320 Speaker 1: podcasts, because the only thing that will save us is podcasts. 996 00:50:13,760 --> 00:50:16,440 Speaker 1: Well, I think the same, right, but good for business. 997 00:50:21,239 --> 00:50:23,600 Speaker 1: It Could Happen Here is a production of Cool Zone Media. 998 00:50:23,840 --> 00:50:26,520 Speaker 1: For more podcasts from Cool Zone Media, visit our website 999 00:50:26,520 --> 00:50:28,640 Speaker 1: cool zone media dot com, or check us out on 1000 00:50:28,680 --> 00:50:31,239 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 1001 00:50:31,280 --> 00:50:34,080 Speaker 1: listen to podcasts. You can find sources for It Could 1002 00:50:34,080 --> 00:50:37,080 Speaker 1: Happen Here, updated monthly at cool zone media dot com 1003 00:50:37,120 --> 00:50:39,040 Speaker 1: slash sources. Thanks for listening.