Speaker 1: Cool Zone Media.

Speaker 2: Welcome back to It Could Happen Here. I am Robert Evans, and this is a podcast about things falling apart. Sometimes it's about how to make things not fall apart, and other times it's more about enduring it. Today is more on the endurance side of things, and we're talking about a subject that we get a lot of requests about here. We discussed this a year or so ago with one of our guests, the great Carl Pasarta. We're talking about security culture, and particularly the aspect of security culture that involves digital devices: how to communicate with your friends, affinity groups, whomever, via your phone or your computer, essentially. This is a thing where there's a huge amount of disinformation as to which apps are safe. What does it actually mean to say that an app is encrypted? How far does encryption get you? What sort of cultural things come alongside the actual physical reality of the security of the device in order to make a comprehensive security profile?
Speaker 2: We're gonna be talking about all that today, and hopefully giving you some good advice on what you can trust, because I am the furthest thing in the world from a technical expert. We have two actual experts with us today. Caroline Sinders and Cooper Quintin have both recently published a paper, alongside several other authors (Leila Wagner, Tim Bernard, Ami Meta, and Justin Hendrix), called "What Is Secure? An Analysis of Popular Messaging Apps." It's basically going over what the actual level of security is for a number of things, like Telegram (you know, Telegram's private messaging system), Facebook Messenger, Apple Messages, or iMessage I guess it's called, and obviously Signal. And kind of as a spoiler, Signal is your best bet, but that also isn't where you should end, right? I think we want to also talk about why and to what extent that's the case. But anyway, I'm going to turn things over to Caroline and Cooper now, because I have talked enough about this. Hey guys, welcome to the show.

Speaker 3: Hey, Robert, thanks so much for having us on.
Speaker 4: Yeah, thank you so much. A big fan of the podcast, so always lovely, really lovely to be here.

Speaker 5: Yeah, thank you so much.

Speaker 2: Yeah, it's really lovely to have you both again. Listeners, if you want to take a look at their paper, just google "What Is Secure? An Analysis of Popular Messaging Apps." You'll find that Tech Policy Press has a summary of it that's pretty quick. The full paper is eighty-six pages or so; I also recommend reading that, but if you wanted to give the summary a skim before you continue, that might help. But I kind of wanted to start by asking you guys: what is it that makes Signal a good option for people? Because, for most folks, I think you'd describe it as sort of security folklore, right, the stuff that you hear about security from your friends. And if you're not a technical person, you kind of just trust what the folks around you are saying. That was sort of how I got into Signal, right? I'm not a technical person, but people I knew and trusted who were, were like, this is your best option.
Speaker 5: Yeah, thank you so much.

Speaker 4: That's such a good question, and I think Cooper and I probably have similar but also very different answers to it.

Speaker 5: Cooper, I can go first if you want.

Speaker 4: One of the things I love about Signal is it's just really easy to use. It's end-to-end encrypted, it's a messaging app. There's not a lot of stuff on it, but you can do a lot with it: you can do video calls, you can send actually pretty large files like PDFs.

Speaker 5: You can drag and drop stuff.

Speaker 4: It's such a low threshold for users, because it is a messaging app, but it does so many different kinds of things. But then, related to that, it's also actually quite minimal. So in the paper, which everyone should read, and we'll probably get into this later: different apps like Telegram or Facebook's Messenger app, for example, have

Speaker 5: this thing we've been calling feature bloat.
Speaker 4: They are messaging services that actually feel a bit more like social networks if you look at the amount of stuff that's on there. And by stuff, I don't just mean stickers; I mean there are all these sort of specific and strange settings you can use to have all different kinds of messages and all different kinds of privacy settings. And not all privacy settings are really that great: because Telegram and Facebook Messenger are not encrypted by default, some of those settings can actually make you feel more secure when you're not. So kind of the beauty of Signal is that, out of the box, it's incredibly secure. It's end-to-end encrypted, and they're not holding any data about you. I believe the only data they hold is when a phone number or a profile has downloaded Signal, like when you've signed up. But again, it's incredibly easy to use. And another thing is, you know, if this was a few years ago, we'd have been looking at Wire, for example.

Speaker 5: And one of the nice things about Signal...
Speaker 4: This might be controversial to some, but it does follow modern design patterns and standards. So if you're using the iOS or Android version, there are buttons in places where you expect them to be. Signal is not perfectly designed, but it is quite usable. So for me, that's kind of what makes it really wonderful.

Speaker 2: Yeah. As much as I love it, and it's my standard messaging app, I do every now and then run into the thing where my friends will call me through Signal, which is great if you need a call to be secure, but it's not nearly as good: it drops a lot more often than a regular phone call. And I'm like, we're just trying to meet at the movie theater. It's okay if the NSA knows, right?

Speaker 4: I've definitely had that with friends, where I'm like, we're just calling to talk about your dog.

Speaker 5: It's probably fine.

Speaker 2: Yeah, the FBI can have this stuff.
Speaker 4: Yeah, please send dog pics through all messaging apps.

Speaker 3: You know, but on that note: writing usable software that is also secure is really hard, right? And as a cryptographer... well, I'm not a cryptographer, but as somebody cryptographer-adjacent: we got that wrong for a long time. Before Signal, the most-used encryption methods were probably PGP, which is a method for encrypting email, and Off-the-Record chats. And neither of those ever got to the sort of level of user base that Signal, and certainly not WhatsApp, have. That's largely because they were pretty much unusable. PGP is almost entirely unusable, even by cryptography professionals, even by computer security professionals like ourselves. OTR chat? Total pain in the butt, just a real nightmare to use. So Signal, there are still some rough edges, and we talk about some of those in our paper.
Speaker 3: But overall, I think the big innovation they've had is just remembering that what people want to do on a chat app is not encrypt things. What people want to do on a chat app is chat, right? And the second that the security sort of gets in the way of that, people will stop using it and go find something that's more usable. It seems like that's been Signal's guiding star: doing the most secure thing that you can while still being fun and usable to actually just chat on. And I think that has served them quite well.

Speaker 2: Yeah, it's so important. I think one of the things that contributes to good overall security is setting yourself up for success, which means setting yourself up with a system that can function well if you're lazy. That's one of the nice things with Signal: you don't have to worry about opting in and out and selecting a bunch of stuff.
Speaker 2: It's pretty safe, especially for a normal person's uses, right out of the box, which is huge. And kind of in the same line as that is the fact that, because Signal doesn't store metadata, you're not relying on them being committed anti-state actors or whatever, because they don't have access to the thing that, for example, Facebook will hand over to the cops if the cops just breathe in their direction.

Speaker 3: Yeah, that's exactly right, and that is the other really cool thing about Signal. As Caroline said, the only data that Signal gives over in response to a subpoena is the time that the phone number signed up for a Signal account and the last time it connected to the Signal server. And the reason we know that is because Signal publishes transparency reports with the full text and full response of any subpoena that they get. So we can actually just see in the responses that all they've given over is these two pieces of information, because that's all they have, and they've done some pretty clever things to make that be the case.
Speaker 4: Right. And that's actually so different from how other companies report on either subpoenas or any kind of weight that law enforcement puts on them. For our report, and I don't remember how much it's mentioned in the report actually, we did go through and look at Apple, Meta, and I think Google in their own transparency reports, to try to get a sense of how that would stack up in comparison to Signal's. I think in some cases they're saying they received some kind of notification, but nothing really clear or specific on what they received from law enforcement or government, just that they received one. And so that's also the really great thing about Signal: you are getting all of this information that you're not getting from other companies or platforms.

Speaker 2: Yeah. You know, I wanted to stay on the same subject, going back to how we kind of opened this: introducing the concept that y'all introduced me to.
Speaker 2: I guess I was aware of this, but not the terminology: security folklore. And I wanted to chat a little bit about the most recent example of this, something a lot of folks have probably been wondering about since we started talking about Signal. Roughly a week before y'all and I sat down to talk about this, a kind of viral info meme started coming through that was like, "Signal has a zero-day exploit," which is basically a hole that a hacker found in an app or program that can expose you: "you have to turn off link previews." Which is the thing where, when someone sends you a link to an article in Signal, you get a little preview, not dissimilar to how it is in other apps. And I think, to be fair, just based on my very limited knowledge, when I think about what are potential holes in Signal, I don't think it's unreasonable to be concerned about that specific feature. But that warning was not what it kind of seemed to be, basically, or not as accurate as I think a lot of people took it as being.
Speaker 3: I don't know.

Speaker 2: I'll turn it over to you guys. I think that's the next thing I want to talk about.

Speaker 4: I'll turn it over to Cooper, who had, uh... you have a lot of feels about that.

Speaker 3: I have so many feelings about this. I was working on this all weekend. So this copypasta, I'm calling it the Signal copypasta, which is a term from, you know, 4chan and other horrible internet places, but...

Speaker 5: The Cool Zone Media audience is probably internet enough.

Speaker 2: Yeah, I'm gonna guess a good half of the people listening at least got that message.

Speaker 3: Yeah. And first of all, if you had a zero day in Signal, which is an exploit for Signal that has not been patched by the vendor, so you can actively exploit it: there are no people in the world who would choose to quietly leak this over, you know, vague Signal texts. There are two types of people.
Speaker 3: One, you know, people like us, who would bring this to Signal immediately and get them to patch it, to protect the, you know, millions of high-risk users that use Signal. Or the type of people who would go sell this exploit to some horrible company that would use it, you know, sell it to Saudi Arabia or something, and use it to kill activists, right? And there's no in-between. There's nobody who is going to quietly leak this just for fun, with vague details. So this message set off red flags immediately. And look, I really do not like link previews. In our paper we discuss some of the issues that we have with link previews. We think that they can leak some information about your chats to the owner of the website. We think it's kind of a large attack surface, and it's not super necessary.

Speaker 4: Would you mind explaining to the audience a little bit about what we found when looking at link previews?

Speaker 3: Yeah.
Speaker 3: So, the way that link previews work on Signal and on WhatsApp is that when you send a link to somebody, the Signal app or WhatsApp goes and fetches the web page for that link, right? It downloads the content of that link, and there are some special HTML tags that describe sort of what the page is about: what the title of the page is, and an image for the page. It gets those tags, puts them all together in this little package, and sends that as part of the Signal message. So when you put a link in Signal, your phone actually goes out and gets that web page, and it gets that web page with what's called a user agent, which is a piece of text attached to the request that identifies it as being a request from Signal, and from your IP address.
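To make the mechanics Cooper is describing concrete, here is a rough Python sketch of the general link-preview pattern. This is an illustration, not Signal's or WhatsApp's actual code: the `ExampleChatApp-LinkPreview/1.0` User-Agent string is a made-up placeholder, while the `og:` meta tags are the standard Open Graph conventions most sites use for previews.

```python
import re
import urllib.request


def extract_og_tags(html: str) -> dict:
    """Pull the Open Graph tags a preview is typically built from."""
    preview = {}
    for prop in ("og:title", "og:description", "og:image"):
        match = re.search(
            rf'<meta[^>]+property="{prop}"[^>]+content="([^"]*)"', html
        )
        if match:
            preview[prop] = match.group(1)
    return preview


def fetch_preview(url: str) -> dict:
    """Fetch a page roughly the way a chat client might when you paste a link.

    The request goes out from the sender's device, and the User-Agent
    identifies the client app, so the site's logs can tie the sender's
    IP address to that app. (The User-Agent below is hypothetical.)
    """
    request = urllib.request.Request(
        url, headers={"User-Agent": "ExampleChatApp-LinkPreview/1.0"}
    )
    with urllib.request.urlopen(request) as response:
        html = response.read().decode("utf-8", errors="replace")
    return extract_og_tags(html)
```

The point of the sketch is the side channel, not the parsing: whoever runs the server sees the request, its User-Agent, and the requesting IP address in their logs.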
Speaker 3: Right. So when you put a link in, the owner of that website, whoever has the logs for that website, can know that somebody at your IP address is using Signal and sending this link over Signal. What our concern is, is that if that link is unique, then anybody else who visits that link can be inferred to be somebody that you are talking with over Signal, right? And so this can be an interesting source of intelligence for website owners, especially for big websites that can easily generate unique links with tracking parameters at the end of the URL. Like when you share an Instagram post, and at the end it's like, question mark, I-G-S-H-I-D, equals, you know, a long string of numbers and letters. Or a Twitter post, where T equals a long string of letters and numbers. That makes a unique link, and then anybody who visits that same link can be determined to be somebody that you're speaking with over Signal, and also WhatsApp.
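One practical countermeasure for the unique-link problem Cooper describes (our illustration here, not a recommendation drawn from the paper) is to strip per-share tracking parameters before pasting a link into a chat, so the URL is no longer unique to one conversation. A minimal sketch, with an assumed and deliberately non-exhaustive parameter list:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Query parameters commonly used to make a shared link unique to one share.
# This set is illustrative, not exhaustive: igshid/igsh (Instagram), t and si
# (share tokens on some platforms), fbclid/gclid (ad-click identifiers).
TRACKING_PARAMS = {"igshid", "igsh", "t", "si", "fbclid", "gclid"}


def strip_tracking(url: str) -> str:
    """Return the URL with per-share tracking parameters removed."""
    parts = urlsplit(url)
    kept = [
        (key, value)
        for key, value in parse_qsl(parts.query, keep_blank_values=True)
        if key not in TRACKING_PARAMS and not key.startswith("utm_")
    ]
    return urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment)
    )
```

For example, `strip_tracking("https://example.com/post/123?igshid=abc123&utm_source=share")` returns `"https://example.com/post/123"`, a link that looks the same for everyone who shares it.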
Speaker 3: So for that reason, we think that Signal and WhatsApp should turn link previews off by default, because we think that's an unnecessary information leak. Signal and WhatsApp's pushback on that is that link previews are a core feature that people demand, and if they were to turn off link previews by default, they're worried that people would leave the platform for less secure platforms like Telegram.

Speaker 2: Yeah. I mean, I don't want to tell them their business, because I'm sure they have data on this, but I've never thought about link previews as being a thing that I needed.

Speaker 4: Yeah, I think it's one of those things. And you know, we haven't necessarily done extensive general design research on this, right? We haven't surveyed three thousand people in the US. We haven't had a Pew Research survey across countries asking, "What are your thoughts on link previews?" But I would...
Speaker 4: I would probably argue that, because it is included in so many modern messaging apps, we now assume it's a core feature. One thing I will give Signal that I think is amazing, that other apps don't do (and this is not true of WhatsApp): pretty much every feature except for encryption is something you can toggle or turn off. So link previews were already available for people to turn off on Signal. WhatsApp does not allow that, and it seems like they're making no moves to allow that feature to be optional. But that is, I will say, one of the things that's really lovely about Signal, that is so different from modern design and modern big tech platforms, and just platforms in general: a lot of features are optional. Whereas, you know, WhatsApp and Meta's sort of stance on design is that a lot of things are not optional, that those are things users would want: why would we make foundational elements like link previews optional?
And you're just, like, gesturing wildly, 338 00:17:40,480 --> 00:17:42,560 Speaker 4: but like, you know, it's like, well, you don't know 339 00:17:42,560 --> 00:17:44,720 Speaker 4: what people want. And I mean, what's the harm in 340 00:17:44,760 --> 00:17:46,320 Speaker 4: turning off some of these things? 341 00:17:46,400 --> 00:17:46,560 Speaker 3: Right? 342 00:17:47,200 --> 00:17:50,320 Speaker 4: You know, like, maybe people don't want to receive GIFs. 343 00:17:50,400 --> 00:17:52,320 Speaker 4: I don't know, maybe they don't want to receive stickers. 344 00:17:52,359 --> 00:17:55,040 Speaker 4: Why don't you let them have that option? What's 345 00:17:55,040 --> 00:17:56,000 Speaker 4: the harm that could happen? 346 00:17:56,200 --> 00:17:56,440 Speaker 3: Yeah? 347 00:17:56,720 --> 00:17:58,520 Speaker 2: Yeah, yeah, I couldn't agree more. 348 00:17:58,560 --> 00:18:01,119 Speaker 3: Yeah. Two things I want to say. One is 349 00:18:01,240 --> 00:18:03,760 Speaker 3: that first we should acknowledge that 350 00:18:03,920 --> 00:18:05,840 Speaker 3: it turns out that there was no zero day, there 351 00:18:05,880 --> 00:18:10,040 Speaker 3: was no vulnerability. Yeah, this was absolutely just something that 352 00:18:10,040 --> 00:18:13,960 Speaker 3: spread virally out of nowhere. I'd be really interested 353 00:18:13,960 --> 00:18:16,600 Speaker 3: to find out what the origin of this copypasta 354 00:18:16,640 --> 00:18:19,119 Speaker 3: was, but I haven't been able to. 355 00:18:19,200 --> 00:18:20,639 Speaker 3: But I'm 356 00:18:20,520 --> 00:18:21,840 Speaker 5: curious about that as well, 357 00:18:21,880 --> 00:18:23,560 Speaker 4: because I was in another group thread that was like, 358 00:18:23,600 --> 00:18:25,960 Speaker 4: we really need outside auditors to look at these.
359 00:18:25,800 --> 00:18:27,840 Speaker 5: And I was like, we have a whole report that 360 00:18:27,960 --> 00:18:29,919 Speaker 5: we wrote that did look at this. 361 00:18:30,920 --> 00:18:34,840 Speaker 2: Speaking of outside auditors, I gotta pause you guys just 362 00:18:34,840 --> 00:18:37,760 Speaker 2: a second, because it is time for an ad break. 363 00:18:38,400 --> 00:18:42,600 Speaker 2: So please spend your money and then come back to 364 00:18:42,680 --> 00:18:59,080 Speaker 2: learn more. Ah, and we're back. Okay, sorry about that, Cooper, Carolyn, 365 00:18:59,640 --> 00:19:02,000 Speaker 2: you may continue as you were. 366 00:19:02,400 --> 00:19:04,480 Speaker 3: The other thing I was going to say is 367 00:19:05,080 --> 00:19:09,520 Speaker 3: that the idea that anybody would leave WhatsApp because they 368 00:19:09,560 --> 00:19:13,800 Speaker 3: stopped having link previews is completely preposterous to me. Like, 369 00:19:14,080 --> 00:19:21,600 Speaker 3: clownish. WhatsApp has over two billion users. They are, you know, 370 00:19:21,720 --> 00:19:26,240 Speaker 3: in a position to set the standard for what people 371 00:19:26,280 --> 00:19:31,440 Speaker 3: expect from a messaging app, and so they could 372 00:19:31,520 --> 00:19:35,840 Speaker 3: do things like turn on disappearing messages by default and 373 00:19:35,920 --> 00:19:38,320 Speaker 3: change that culture. They could do things like turn off 374 00:19:38,320 --> 00:19:41,440 Speaker 3: link previews by default and change that culture. Like, they 375 00:19:41,440 --> 00:19:44,159 Speaker 3: could do these things, and you know, 376 00:19:45,280 --> 00:19:50,600 Speaker 3: they would not lose enough users to even notice or 377 00:19:50,640 --> 00:19:51,159 Speaker 3: care about. 378 00:19:51,320 --> 00:19:51,439 Speaker 2: Right.
379 00:19:51,520 --> 00:19:53,760 Speaker 3: Yeah, they are the only people in 380 00:19:53,800 --> 00:19:57,080 Speaker 3: the world in a position to decide what the culture 381 00:19:57,119 --> 00:19:59,159 Speaker 3: should be, and this is what they've decided the culture 382 00:19:59,200 --> 00:19:59,600 Speaker 3: should be. 383 00:20:00,240 --> 00:20:00,680 Speaker 5: Totally. 384 00:20:01,040 --> 00:20:03,040 Speaker 4: I hate to break it to you, but if WhatsApp 385 00:20:03,320 --> 00:20:05,879 Speaker 4: just got rid of link previews, I'm just throwing my 386 00:20:05,960 --> 00:20:09,040 Speaker 4: whole phone into the garbage can, getting rid of it. 387 00:20:09,040 --> 00:20:11,159 Speaker 2: Just tossing it. Back to a landline. 388 00:20:11,560 --> 00:20:12,080 Speaker 5: Yeah, I'm just 389 00:20:12,000 --> 00:20:14,240 Speaker 4: gonna yeet it into a river. I feel like I 390 00:20:14,280 --> 00:20:17,360 Speaker 4: don't need this anymore. Actually, I'm going back to carrier pigeons. 391 00:20:17,400 --> 00:20:18,919 Speaker 5: That's how far back I'm going to go. 392 00:20:19,280 --> 00:20:21,280 Speaker 2: I mean, that does kind of lead into the 393 00:20:21,280 --> 00:20:23,480 Speaker 2: next thing I wanted to talk about, which is sort 394 00:20:23,520 --> 00:20:28,760 Speaker 2: of the other wing from security folklore, which is security nihilism.
395 00:20:29,400 --> 00:20:32,160 Speaker 2: And yeah, you introduced this when 396 00:20:32,200 --> 00:20:35,400 Speaker 2: talking about, sort of, if you do try to engage 397 00:20:35,480 --> 00:20:37,760 Speaker 2: somewhat with the technology, or if you wind up just 398 00:20:37,800 --> 00:20:39,760 Speaker 2: kind of in the position I think most lay people are, 399 00:20:39,840 --> 00:20:42,240 Speaker 2: where, you know, maybe you have some friends who know more, 400 00:20:42,440 --> 00:20:44,200 Speaker 2: or maybe you have some friends who think they know more, 401 00:20:44,240 --> 00:20:46,760 Speaker 2: and you get all these conflicting things, like, this 402 00:20:46,920 --> 00:20:49,159 Speaker 2: is safe. No, it's not. You can't trust Signal. The 403 00:20:49,160 --> 00:20:51,280 Speaker 2: feds could be running Signal, all this kind of stuff. 404 00:20:51,880 --> 00:20:54,560 Speaker 2: And to be fair, the feds have run security-focused 405 00:20:54,600 --> 00:20:57,159 Speaker 2: services before. It's not that I believe that's happening 406 00:20:57,160 --> 00:20:59,920 Speaker 2: with Signal, but I understand where paranoia 407 00:21:00,480 --> 00:21:04,239 Speaker 2: like that can enter into people's calculus, especially if 408 00:21:04,280 --> 00:21:08,560 Speaker 2: you're not technically knowledgeable, and that can lead to this 409 00:21:08,640 --> 00:21:11,959 Speaker 2: sort of state of security nihilism where you're just like, 410 00:21:12,400 --> 00:21:14,720 Speaker 2: you can't communicate at all online. There's no way to 411 00:21:14,760 --> 00:21:18,199 Speaker 2: do it securely. And obviously there's no perfect, right? You 412 00:21:18,240 --> 00:21:20,040 Speaker 2: never have it. You don't have one hundred percent 413 00:21:20,119 --> 00:21:24,760 Speaker 2: even with talking in person to somebody.
Right? There are 414 00:21:24,760 --> 00:21:28,280 Speaker 2: individuals in prison right now who, you know, somebody they 415 00:21:28,359 --> 00:21:31,240 Speaker 2: loved and trusted ratted on them. There are no one 416 00:21:31,280 --> 00:21:35,399 Speaker 2: hundred percents in this world. But that doesn't mean nihilism 417 00:21:35,480 --> 00:21:38,280 Speaker 2: is the right response to trying to figure out 418 00:21:38,400 --> 00:21:41,920 Speaker 2: how to set up your communications standards with people, right? 419 00:21:42,320 --> 00:21:44,680 Speaker 4: Totally. I mean, I think about the approach we took, 420 00:21:44,920 --> 00:21:48,399 Speaker 4: because throughout this report we were also teaching workshops to 421 00:21:49,040 --> 00:21:53,120 Speaker 4: reproductive justice activists across the US, in states where abortion 422 00:21:53,240 --> 00:21:56,680 Speaker 4: is banned. I'm from Louisiana, I live half the year there, 423 00:21:57,040 --> 00:22:01,600 Speaker 4: abortion is banned there, and we were also working 424 00:22:01,600 --> 00:22:04,000 Speaker 4: with journalists in India. So a big thing for 425 00:22:04,080 --> 00:22:07,520 Speaker 4: us was also teaching threat modeling and different kinds of 426 00:22:07,680 --> 00:22:11,359 Speaker 4: what Matt Mitchell, a security trainer and expert, calls digital hygiene. 427 00:22:11,840 --> 00:22:14,840 Speaker 4: And so a lot of this was recognizing that there 428 00:22:14,960 --> 00:22:18,000 Speaker 4: were certain practices we were picking up on, particularly with 429 00:22:18,119 --> 00:22:19,639 Speaker 4: folks we were working with.
So, like, a lot of the 430 00:22:19,720 --> 00:22:23,399 Speaker 4: reproductive justice activists we were working with are new to security, 431 00:22:23,440 --> 00:22:25,720 Speaker 4: they're new to technology, they don't have a background in tech, 432 00:22:25,840 --> 00:22:29,880 Speaker 4: and generally, you know, the American South, the American Deep 433 00:22:29,920 --> 00:22:33,520 Speaker 4: South, is super overlooked in terms of tech policy, in 434 00:22:33,640 --> 00:22:37,200 Speaker 4: terms of just, I think, a general focus when people 435 00:22:37,280 --> 00:22:40,280 Speaker 4: are talking about tech or tech literacy or tech activism, 436 00:22:40,960 --> 00:22:44,160 Speaker 4: and that is leaving really massive gaps in knowledge 437 00:22:44,240 --> 00:22:47,760 Speaker 4: for people. And so, you know, when we were working 438 00:22:47,800 --> 00:22:50,720 Speaker 4: on this, security folklore and security nihilism, they're both 439 00:22:50,840 --> 00:22:51,840 Speaker 4: actually very, 440 00:22:51,920 --> 00:22:53,639 Speaker 5: almost, I won't say like a pendulum, but they 441 00:22:53,640 --> 00:22:56,199 Speaker 5: were very connected. And so some of that was 442 00:22:56,200 --> 00:22:58,520 Speaker 4: people hearing things like, oh, I should put my phone 443 00:22:58,960 --> 00:23:02,120 Speaker 4: in a microwave when I'm having a very sensitive conversation, right? 444 00:23:02,160 --> 00:23:04,480 Speaker 4: And so that's where some of that security folklore is 445 00:23:04,520 --> 00:23:08,399 Speaker 4: coming in. It is something that is technically safe, but 446 00:23:08,480 --> 00:23:10,920 Speaker 4: it's not the thing you necessarily need 447 00:23:10,960 --> 00:23:13,760 Speaker 4: to do in that moment.
And with security nihilism, what 448 00:23:13,840 --> 00:23:15,359 Speaker 4: it kind of came down to, and this is stuff 449 00:23:15,359 --> 00:23:20,000 Speaker 4: we've seen with other groups in other circumstances. A great 450 00:23:20,040 --> 00:23:23,520 Speaker 4: example is, you know, Palestinian activists and journalists, let's say, 451 00:23:23,560 --> 00:23:26,680 Speaker 4: who are, you know, facing the threat of all different 452 00:23:26,760 --> 00:23:30,080 Speaker 4: kinds of governmental censorship and surveillance, sort of saying, 453 00:23:30,080 --> 00:23:32,560 Speaker 4: when there's this large threat sort of hanging over us, 454 00:23:33,040 --> 00:23:36,439 Speaker 4: and there's also physical surveillance, and this is true for 455 00:23:36,480 --> 00:23:39,120 Speaker 4: a lot of journalists in other countries like India as well, 456 00:23:39,240 --> 00:23:44,280 Speaker 4: for example, you know, should everything go through Signal, 457 00:23:44,359 --> 00:23:45,280 Speaker 4: or does it really matter? 458 00:23:45,400 --> 00:23:46,400 Speaker 5: Like, does it really matter? 459 00:23:46,480 --> 00:23:49,080 Speaker 4: And this is also something, again, we saw with some 460 00:23:49,080 --> 00:23:51,479 Speaker 4: reproductive justice activists as well, where it's like, if 461 00:23:51,520 --> 00:23:53,480 Speaker 4: everything is being monitored, what's safe? 462 00:23:53,680 --> 00:23:57,080 Speaker 5: Like, can I send stuff? Can I even use Google? 463 00:23:57,480 --> 00:24:01,480 Speaker 4: And part of this was, you know, teaching privacy 464 00:24:01,480 --> 00:24:04,680 Speaker 4: and security workshops, teaching things like threat modeling, which 465 00:24:04,720 --> 00:24:06,720 Speaker 4: is a framework for just assessing 466 00:24:06,800 --> 00:24:08,200 Speaker 5: what are the threats?
467 00:24:08,560 --> 00:24:11,040 Speaker 4: What are all the potential threats you could 468 00:24:11,040 --> 00:24:13,359 Speaker 4: face, kind of mapping them from the most 469 00:24:13,400 --> 00:24:16,159 Speaker 4: minor to the most major, and what you can 470 00:24:16,200 --> 00:24:18,320 Speaker 4: do about that. That's a way to try to combat 471 00:24:18,440 --> 00:24:20,840 Speaker 4: security nihilism. But I think an approach Cooper and I 472 00:24:20,880 --> 00:24:23,119 Speaker 4: are also really fond of is thinking of this like 473 00:24:23,160 --> 00:24:26,040 Speaker 4: safer sex. There are all different kinds of things you can 474 00:24:26,080 --> 00:24:29,679 Speaker 4: do that are mitigations, that are actually incredibly helpful, and we 475 00:24:29,760 --> 00:24:32,240 Speaker 4: can't look at it as a binary of safe or 476 00:24:32,400 --> 00:24:35,600 Speaker 4: not safe. It's actually much more of a gradient. 477 00:24:37,240 --> 00:24:40,080 Speaker 4: But you know, the folklore and the nihilism, I 478 00:24:40,080 --> 00:24:42,520 Speaker 4: think, come from a very similar place, which is we're 479 00:24:42,560 --> 00:24:45,960 Speaker 4: asking people, society is kind of asking or demanding, 480 00:24:46,000 --> 00:24:49,880 Speaker 4: that people be experts in something that's really hard. I 481 00:24:49,920 --> 00:24:52,480 Speaker 4: am a fairly technical person, and even then there are 482 00:24:52,560 --> 00:24:54,639 Speaker 4: some things that I find hard to sort of wrap my 483 00:24:54,680 --> 00:24:57,320 Speaker 4: head around. And I've been working in privacy and security 484 00:24:57,359 --> 00:25:00,719 Speaker 4: for quite a while. And I think, you know, 485 00:25:01,920 --> 00:25:03,879 Speaker 4: it's also really hard when you think about these apps 486 00:25:03,880 --> 00:25:06,280 Speaker 4: as a brand new person.
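To make the threat-modeling framework described above concrete: list your threats, score each for likelihood and impact, and work the list from the top down. Here is a toy sketch in Python; the threats, scores, and mitigations are invented illustrations, not taken from the report or anyone's workshop curriculum:

```python
from dataclasses import dataclass

@dataclass
class Threat:
    name: str
    likelihood: int  # 1 (rare) .. 5 (expected)
    impact: int      # 1 (minor) .. 5 (severe)
    mitigation: str

    @property
    def severity(self) -> int:
        # crude but useful: prioritize what is both likely and damaging
        return self.likelihood * self.impact

threats = [
    Threat("phone seized at a protest", 3, 5,
           "strong screen lock, disappearing messages"),
    Threat("phishing link in a group chat", 4, 3,
           "verify senders, don't tap unexpected links"),
    Threat("targeted zero-day exploit", 1, 5,
           "keep the OS updated, accept the residual risk"),
]

# Most severe first: the point is prioritizing risk, not eliminating it.
for t in sorted(threats, key=lambda t: t.severity, reverse=True):
    print(f"{t.severity:2d}  {t.name}: {t.mitigation}")
```

The numbers are judgment calls, and that's fine: the exercise turns "everything is monitored, nothing is safe" into a ranked list where the top items have concrete, doable mitigations, which is the opposite of security nihilism.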
So, like, one of 487 00:25:06,280 --> 00:25:07,840 Speaker 4: the things that popped up a lot in our research 488 00:25:07,960 --> 00:25:10,080 Speaker 4: is, why should we trust Signal? And that's actually 489 00:25:10,160 --> 00:25:13,000 Speaker 4: a great question. Like, what about Signal, in its interface 490 00:25:13,040 --> 00:25:14,120 Speaker 4: and its design, 491 00:25:14,760 --> 00:25:17,320 Speaker 5: would cause you to trust it? Some people were 492 00:25:17,359 --> 00:25:19,359 Speaker 5: like, it's a nonprofit. That's great, but I don't know 493 00:25:19,359 --> 00:25:21,640 Speaker 5: what that means. I'm like, that's actually a fantastic question. 494 00:25:21,720 --> 00:25:23,040 Speaker 5: Like, what does that mean? Right? 495 00:25:23,080 --> 00:25:26,040 Speaker 4: Like, why should you trust this? You've heard through the 496 00:25:26,040 --> 00:25:28,479 Speaker 4: grapevine that you should. And I think these are kind 497 00:25:28,520 --> 00:25:30,240 Speaker 4: of all the things that people are dealing with, because 498 00:25:30,240 --> 00:25:32,000 Speaker 4: if you sort of take a step back and just 499 00:25:32,040 --> 00:25:34,920 Speaker 4: look at software, or any different kinds of software generally, 500 00:25:35,359 --> 00:25:37,600 Speaker 4: why should you trust that it's safe and secure when 501 00:25:37,640 --> 00:25:40,480 Speaker 4: there have been so many different kinds of leaks or 502 00:25:40,520 --> 00:25:46,040 Speaker 4: breaches or things breaking? Right? Yeah. So these are, 503 00:25:46,359 --> 00:25:48,600 Speaker 4: I think, really closely tied. But I think a 504 00:25:48,640 --> 00:25:52,080 Speaker 4: big thing for us is trying to combat that security nihilism 505 00:25:52,200 --> 00:25:54,719 Speaker 4: whenever we can. There are things you 506 00:25:54,760 --> 00:25:57,480 Speaker 4: can do.
I don't want to say no matter 507 00:25:57,480 --> 00:25:59,560 Speaker 4: how great the threat, but I believe, no matter 508 00:25:59,600 --> 00:26:01,760 Speaker 4: how great the threat, there is stuff you 509 00:26:01,800 --> 00:26:02,120 Speaker 4: can do. 510 00:26:02,760 --> 00:26:04,679 Speaker 3: No matter how great the threat is, there's stuff that 511 00:26:04,720 --> 00:26:07,920 Speaker 3: you can do to make it more difficult and more 512 00:26:07,920 --> 00:26:10,679 Speaker 3: expensive for that person to attack you. Right? Like, we 513 00:26:10,760 --> 00:26:13,480 Speaker 3: all lock the doors to our house, you know, 514 00:26:13,560 --> 00:26:16,639 Speaker 3: for the most part. We 515 00:26:16,720 --> 00:26:20,320 Speaker 3: all do things to protect ourselves like that that 516 00:26:20,400 --> 00:26:22,680 Speaker 3: aren't foolproof, right? Somebody can always break a window 517 00:26:22,720 --> 00:26:24,680 Speaker 3: to get into your house. Somebody can find other 518 00:26:24,680 --> 00:26:27,040 Speaker 3: ways to get into your house. But locking the door 519 00:26:27,240 --> 00:26:30,000 Speaker 3: makes it so that somebody has to do the noisy 520 00:26:30,040 --> 00:26:33,600 Speaker 3: thing of breaking a window. It makes it so that, 521 00:26:33,640 --> 00:26:35,840 Speaker 3: you know, somebody has to spend more time and effort, 522 00:26:35,920 --> 00:26:40,000 Speaker 3: and more risk of getting caught, in getting into your house. 523 00:26:40,040 --> 00:26:42,760 Speaker 3: And that's, like, when you layer 524 00:26:42,880 --> 00:26:47,160 Speaker 3: these protections, right, the idea, you know, is that 525 00:26:47,280 --> 00:26:51,200 Speaker 3: you're making it harder. You're making there be more 526 00:26:51,240 --> 00:26:54,200 Speaker 3: friction, right, to piercing your security.
527 00:26:54,760 --> 00:26:57,280 Speaker 2: Yeah, I think that's a really good point, and 528 00:26:57,320 --> 00:26:59,720 Speaker 2: the concept of friction, you know, this is something 529 00:26:59,760 --> 00:27:02,639 Speaker 2: I've talked about. Not that these are exactly the same things, 530 00:27:02,640 --> 00:27:05,360 Speaker 2: although they're not wildly different, but when 531 00:27:05,359 --> 00:27:09,360 Speaker 2: it comes to, like, how insurgents win insurgencies, right, it's 532 00:27:09,800 --> 00:27:13,439 Speaker 2: not by carrying out these sort of great battlefield 533 00:27:13,560 --> 00:27:17,760 Speaker 2: victories that sweep the enemy from the field. It's by friction, right, 534 00:27:18,040 --> 00:27:22,199 Speaker 2: which wears down both the culture and the kind of 535 00:27:22,280 --> 00:27:27,080 Speaker 2: readiness of the opponent until they simply bounce, which is 536 00:27:27,160 --> 00:27:29,679 Speaker 2: a pretty durable and effective strategy. You can keep it up. 537 00:27:29,720 --> 00:27:35,679 Speaker 2: There's no sweeping, sudden, 538 00:27:35,800 --> 00:27:39,719 Speaker 2: ninety-minute, three-act win here. It's more a 539 00:27:39,720 --> 00:27:43,239 Speaker 2: matter of: the more difficult, the more expensive you make it, 540 00:27:43,480 --> 00:27:46,240 Speaker 2: the more you hold on to, and the more all 541 00:27:46,280 --> 00:27:48,160 Speaker 2: of us hold on to. Right.
That's the other benefit, 542 00:27:48,240 --> 00:27:50,200 Speaker 2: is like, even if you are 543 00:27:50,400 --> 00:27:53,000 Speaker 2: the most law-abiding person in the world, like myself, 544 00:27:54,600 --> 00:27:58,160 Speaker 2: having these security measures in place means that you're kind 545 00:27:58,160 --> 00:28:02,480 Speaker 2: of contributing to the overall immune system of 546 00:28:02,520 --> 00:28:05,399 Speaker 2: a kind of community of people who don't want the 547 00:28:05,520 --> 00:28:06,720 Speaker 2: NSA listening to their shit. 548 00:28:08,600 --> 00:28:12,680 Speaker 3: Yeah, exactly, exactly. And the friction thing is also 549 00:28:12,760 --> 00:28:16,800 Speaker 3: exactly what Signal does, right? Like, the threat model 550 00:28:16,840 --> 00:28:21,080 Speaker 3: for Signal is stopping the NSA or other global adversaries 551 00:28:21,119 --> 00:28:25,560 Speaker 3: from listening to all communications as they travel over the internet, right? 552 00:28:25,680 --> 00:28:28,360 Speaker 3: And when you can do that, 553 00:28:28,720 --> 00:28:30,840 Speaker 3: when you can listen to everybody's 554 00:28:30,880 --> 00:28:33,800 Speaker 3: conversations as they travel over the Internet, it's really cheap 555 00:28:33,840 --> 00:28:37,080 Speaker 3: to spy on anybody. But when you're encrypting that communication, 556 00:28:38,160 --> 00:28:41,239 Speaker 3: then the NSA, or whatever other global adversary, has to 557 00:28:41,280 --> 00:28:45,120 Speaker 3: go actually hack your phone, right? They 558 00:28:45,120 --> 00:28:48,920 Speaker 3: have to target you specifically, they have to burn resources 559 00:28:49,000 --> 00:28:53,920 Speaker 3: and, you know, burn weapons, right, zero days, to get 560 00:28:53,960 --> 00:28:56,080 Speaker 3: access to your phone.
And that's a lot more costly, 561 00:28:56,560 --> 00:28:59,240 Speaker 3: it's a lot more noisy, it's a much higher risk 562 00:28:59,280 --> 00:29:02,280 Speaker 3: of them getting caught. So it's introduced huge friction 563 00:29:02,920 --> 00:29:08,560 Speaker 3: in that area. 564 00:29:08,720 --> 00:29:10,200 Speaker 5: Go ahead, okay, go ahead. 565 00:29:10,160 --> 00:29:13,040 Speaker 3: I was going to say, I think the sort 566 00:29:13,080 --> 00:29:16,000 Speaker 3: of comparison to asymmetric warfare is exactly spot on, because 567 00:29:16,040 --> 00:29:17,840 Speaker 3: none of us are ever going to have the money 568 00:29:17,840 --> 00:29:20,160 Speaker 3: that the NSA or Mossad has. None of us 569 00:29:20,160 --> 00:29:23,760 Speaker 3: are ever going to have the total technical acumen 570 00:29:23,840 --> 00:29:26,560 Speaker 3: that the NSA or Mossad has, right? So 571 00:29:27,200 --> 00:29:28,920 Speaker 3: we have to kind of fight, 572 00:29:29,000 --> 00:29:32,080 Speaker 3: in terms of encryption, 573 00:29:32,160 --> 00:29:35,320 Speaker 3: a guerrilla war, right? We have to make things 574 00:29:35,840 --> 00:29:38,840 Speaker 3: so expensive and so annoying for them that it's not 575 00:29:38,760 --> 00:29:41,800 Speaker 5: worth it. Totally. And just to sort of build on that, 576 00:29:41,840 --> 00:29:44,000 Speaker 4: one of the things I love about Signal is, while 577 00:29:44,040 --> 00:29:48,680 Speaker 4: they're creating friction for our adversaries, it's actually so frictionless 578 00:29:48,720 --> 00:29:51,440 Speaker 4: to use as a user. And I think that's one 579 00:29:51,480 --> 00:29:54,800 Speaker 4: of the things I find just continually impressive about that. 580 00:29:55,120 --> 00:29:58,160 Speaker 4: I don't want this to turn into, like,
581 00:29:58,160 --> 00:30:01,200 Speaker 5: we're all himbos for Signal. Look, we probably are. 582 00:30:01,720 --> 00:30:03,760 Speaker 4: But, because like, that's one of the things, as 583 00:30:03,800 --> 00:30:06,200 Speaker 4: researchers, Cooper and I always have to be like, we're 584 00:30:06,280 --> 00:30:08,640 Speaker 4: not paid by Signal at all. But this 585 00:30:08,840 --> 00:30:10,640 Speaker 4: is in fact, like, one of the best things you 586 00:30:10,680 --> 00:30:12,480 Speaker 4: can use. But again, one of the things I think 587 00:30:12,520 --> 00:30:16,000 Speaker 4: is amazing is that it is so easy to use, 588 00:30:17,080 --> 00:30:21,440 Speaker 4: and it really is designed for usability, and I'm using the 589 00:30:21,520 --> 00:30:25,960 Speaker 4: term usability as a design term, meaning that 590 00:30:26,000 --> 00:30:29,920 Speaker 4: they're thinking about a common user, including those with 591 00:30:29,960 --> 00:30:33,480 Speaker 4: lower digital literacy or those that have never 592 00:30:33,680 --> 00:30:37,160 Speaker 4: used any kind of security tool, and 593 00:30:37,240 --> 00:30:40,800 Speaker 4: so they're hitting a specific threshold of usability for things 594 00:30:40,840 --> 00:30:44,120 Speaker 4: to be understandable. And again, that's incredibly hard to do well, 595 00:30:44,280 --> 00:30:46,280 Speaker 4: and they are doing it quite well.
Like, 596 00:30:46,320 --> 00:30:48,760 Speaker 4: I would argue, it's very easy and sort 597 00:30:48,760 --> 00:30:52,080 Speaker 4: of seamless for people to make the jump from WhatsApp, 598 00:30:52,240 --> 00:30:55,240 Speaker 4: or, if you're on Android, from 599 00:30:55,280 --> 00:30:59,400 Speaker 4: Google Messages, sorry Google, or, on an iPhone, 600 00:30:59,600 --> 00:31:03,200 Speaker 4: from Messages, over to Signal. It 601 00:31:03,240 --> 00:31:05,400 Speaker 4: might look slightly different. It might feel a 602 00:31:05,400 --> 00:31:07,080 Speaker 4: lot more blue, it might feel a lot more black, 603 00:31:07,120 --> 00:31:10,040 Speaker 4: depending on how yours is constructed. But for the most part, 604 00:31:10,480 --> 00:31:12,160 Speaker 4: a lot of the features are kind of where you 605 00:31:12,200 --> 00:31:15,760 Speaker 4: expect them to be, and it's not at all difficult 606 00:31:15,760 --> 00:31:17,960 Speaker 4: to get it up and running, which is not something, 607 00:31:18,120 --> 00:31:19,200 Speaker 4: as Cooper said earlier, 608 00:31:19,200 --> 00:31:21,640 Speaker 5: we could say about things like PGP. 609 00:31:22,320 --> 00:31:25,200 Speaker 2: Yeah, I wanted to kind of move on to talking 610 00:31:25,240 --> 00:31:28,040 Speaker 2: about other apps and their security, or lack of it, 611 00:31:28,080 --> 00:31:31,040 Speaker 2: and I think we should start probably by talking about Telegram, 612 00:31:31,120 --> 00:31:33,920 Speaker 2: because that's probably close to top of the list of 613 00:31:33,960 --> 00:31:38,800 Speaker 2: things people use for secure communications that are not nearly 614 00:31:38,840 --> 00:31:43,160 Speaker 2: as secure as they think.
So, yeah, I wanted to 615 00:31:43,200 --> 00:31:45,840 Speaker 2: kind of chat with you about why that is. 616 00:31:45,920 --> 00:31:48,120 Speaker 2: And specifically, one of the 617 00:31:48,160 --> 00:31:52,400 Speaker 2: things that is frustrating about Telegram is they 618 00:31:52,480 --> 00:31:54,560 Speaker 2: have, like, a secret chat or private chat, 619 00:31:54,600 --> 00:31:57,480 Speaker 2: like they have a couple of different options that don't 620 00:31:57,920 --> 00:32:01,760 Speaker 2: necessarily mean what they sound like they mean to most people. 621 00:32:01,840 --> 00:32:04,800 Speaker 4: Yeah, so that's actually one thing our report found. So 622 00:32:04,920 --> 00:32:07,520 Speaker 4: private chat and secret chat are in fact 623 00:32:07,560 --> 00:32:08,200 Speaker 5: the same thing. 624 00:32:09,240 --> 00:32:12,560 Speaker 4: They're just called slightly different things in the app, which, 625 00:32:12,640 --> 00:32:15,120 Speaker 4: again, for those listening 626 00:32:14,760 --> 00:32:17,400 Speaker 5: that don't have a background in design, that's bad design. 627 00:32:17,480 --> 00:32:21,920 Speaker 4: That's not professional. That is 628 00:32:22,040 --> 00:32:25,680 Speaker 4: a mistake. There's no reason for a feature to have 629 00:32:25,760 --> 00:32:29,720 Speaker 4: two different names inside of your software. 630 00:32:30,760 --> 00:32:33,560 Speaker 4: And so I don't know if that's an oversight on 631 00:32:33,600 --> 00:32:38,080 Speaker 4: their part. I'm assuming so. But those two things 632 00:32:38,200 --> 00:32:40,880 Speaker 4: correlate to the same feature, and so they should actually 633 00:32:40,880 --> 00:32:43,719 Speaker 4: be called the same thing. But then, even further, that 634 00:32:43,760 --> 00:32:46,720 Speaker 4: being said, what does private mean to a user?
635 00:32:46,800 --> 00:32:47,960 Speaker 5: What does secret mean? 636 00:32:49,440 --> 00:32:53,680 Speaker 4: You know, Facebook Messenger, they call their encrypted messages secure, 637 00:32:53,880 --> 00:32:55,040 Speaker 4: or no, they also call it secret. 638 00:32:55,080 --> 00:32:57,720 Speaker 5: Sorry, they also call it secret. But does that mean security? 639 00:32:57,720 --> 00:32:58,720 Speaker 5: Does that mean encrypted? 640 00:32:58,880 --> 00:33:02,640 Speaker 4: And so that's one of the 641 00:33:02,640 --> 00:33:05,880 Speaker 4: weird things, where it's like, you know, I think by 642 00:33:06,000 --> 00:33:10,280 Speaker 4: using a very sort of normalized or culturally, almost 643 00:33:10,320 --> 00:33:13,840 Speaker 4: emotional, name like private, it makes something seem like 644 00:33:14,120 --> 00:33:17,600 Speaker 4: it's actually quite safe, when in fact it's not. And 645 00:33:17,640 --> 00:33:20,760 Speaker 4: there's a variety of reasons why Telegram is 646 00:33:20,800 --> 00:33:24,680 Speaker 4: not a very secure app, which I will let Cooper 647 00:33:24,440 --> 00:33:25,040 Speaker 5: talk about more. 648 00:33:25,520 --> 00:33:28,680 Speaker 3: Yeah, I would never advise anybody to have a chat 649 00:33:29,080 --> 00:33:33,320 Speaker 3: over Telegram if they are concerned about the privacy of that chat. 650 00:33:33,360 --> 00:33:38,040 Speaker 3: So, we were talking about friction, and 651 00:33:38,760 --> 00:33:42,600 Speaker 3: the fact that encrypted chats are not the default in Telegram creates 652 00:33:42,600 --> 00:33:47,720 Speaker 3: friction for users to have an actually secure chat. 653 00:33:47,800 --> 00:33:49,920 Speaker 3: Right? You have to remember to turn it 654 00:33:49,880 --> 00:33:53,040 Speaker 4: on, and you can only turn it 655 00:33:53,080 --> 00:33:57,680 Speaker 4: on individually, per conversation.
It's not an overall setting 656 00:33:57,880 --> 00:34:01,200 Speaker 4: on Telegram or Facebook Messenger. You have to go select 657 00:34:01,200 --> 00:34:06,200 Speaker 4: the specific conversation, conversation by conversation. 658 00:34:06,880 --> 00:34:09,399 Speaker 4: And another thing our paper gets into is how those 659 00:34:09,520 --> 00:34:12,640 Speaker 4: chats don't look very different. They look almost identical to 660 00:34:13,239 --> 00:34:17,239 Speaker 4: a normal chat. So for low-vision users, or 661 00:34:17,280 --> 00:34:21,160 Speaker 4: anyone with any kind of disability, especially a vision-related 662 00:34:21,239 --> 00:34:25,960 Speaker 4: disability, it's nearly impossible 663 00:34:25,960 --> 00:34:29,759 Speaker 4: to recognize which chat you're using if you're looking at 664 00:34:29,640 --> 00:34:30,360 Speaker 5: the chat logs. 665 00:34:31,080 --> 00:34:34,680 Speaker 2: Yeah. Outside of that, like, in 666 00:34:34,760 --> 00:34:37,080 Speaker 2: terms of things that may not be options right now, 667 00:34:37,120 --> 00:34:40,840 Speaker 2: I think for basically everyone listening, Signal is a perfectly viable option. 668 00:34:40,880 --> 00:34:43,920 Speaker 2: But it's not impossible that, for example, you might wind 669 00:34:44,000 --> 00:34:46,640 Speaker 2: up in a country where, even if there's not a 670 00:34:46,680 --> 00:34:49,880 Speaker 2: specific law against it, there is a precedent established that 671 00:34:49,920 --> 00:34:52,279 Speaker 2: if you have Signal on your phone, you know, it 672 00:34:52,719 --> 00:34:55,600 Speaker 2: can be at least used as a justification for charges 673 00:34:55,640 --> 00:34:58,200 Speaker 2: that you were planning something.
Like, you know, with Atlanta, 674 00:34:58,280 --> 00:35:01,440 Speaker 2: people are getting charged because they had a lawyer's name 675 00:35:01,480 --> 00:35:04,440 Speaker 2: written on their arm, right, and so the state is saying, well, 676 00:35:04,480 --> 00:35:07,440 Speaker 2: that's evidence that they're planning to commit a crime. You know, 677 00:35:07,640 --> 00:35:09,880 Speaker 2: that doesn't mean that convictions will go through and that 678 00:35:09,960 --> 00:35:11,480 Speaker 2: kind of thing, but it may be a reason why 679 00:35:11,560 --> 00:35:15,000 Speaker 2: Signal might not be an option, or say, you know, 680 00:35:15,080 --> 00:35:17,520 Speaker 2: something comes out about it that makes it seem less secure. 681 00:35:18,000 --> 00:35:22,360 Speaker 2: What are other good or acceptable options? And I 682 00:35:22,400 --> 00:35:24,720 Speaker 2: know when we're talking about this, these are often options 683 00:35:24,719 --> 00:35:27,319 Speaker 2: that require more input and work from the user in 684 00:35:27,440 --> 00:35:30,120 Speaker 2: order to maximize their potential security. But I do think 685 00:35:30,160 --> 00:35:31,600 Speaker 2: it's good to let people kind of know what 686 00:35:31,640 --> 00:35:32,439 Speaker 2: else is out there. 687 00:35:32,840 --> 00:35:36,920 Speaker 3: Yeah, so when Signal isn't an option, WhatsApp is actually 688 00:35:37,040 --> 00:35:41,439 Speaker 3: not a bad option. So WhatsApp, it is owned by Meta, 689 00:35:41,640 --> 00:35:44,400 Speaker 3: which is, you 690 00:35:44,440 --> 00:35:48,880 Speaker 3: know, not ideal. But WhatsApp actually uses 691 00:35:48,960 --> 00:35:53,399 Speaker 3: the same encryption protocol as Signal.
So, under 692 00:35:53,400 --> 00:35:55,000 Speaker 3: the hood, the way 693 00:35:55,040 --> 00:35:57,000 Speaker 3: that the math works to hide your messages from the 694 00:35:57,120 --> 00:36:02,400 Speaker 3: NSA is exactly the same, right, and they've implemented it well. 695 00:36:02,440 --> 00:36:05,759 Speaker 3: You know, there are a few more steps, 696 00:36:05,800 --> 00:36:07,960 Speaker 3: a few more precautions that you need to take with WhatsApp, 697 00:36:08,040 --> 00:36:11,120 Speaker 3: making sure that your chats aren't backed up being 698 00:36:11,120 --> 00:36:16,399 Speaker 3: the main one. But WhatsApp is certainly good enough, right, 699 00:36:16,440 --> 00:36:20,760 Speaker 3: if your, you know, chat networks aren't using Signal, 700 00:36:20,840 --> 00:36:23,000 Speaker 3: if you're in a country where you can't use Signal, right. 701 00:36:23,120 --> 00:36:27,680 Speaker 3: Like, WhatsApp has two billion users; 702 00:36:27,719 --> 00:36:30,040 Speaker 3: you can use WhatsApp almost anywhere in the world. 703 00:36:30,080 --> 00:36:32,319 Speaker 3: And it's ubiquitous enough that it's not going to 704 00:36:32,440 --> 00:36:35,120 Speaker 3: mark you as, you know, somebody with something to hide, right. 705 00:36:35,680 --> 00:36:37,680 Speaker 3: And I don't want 706 00:36:37,719 --> 00:36:42,000 Speaker 3: to discount WhatsApp. Right, getting two billion people to 707 00:36:42,520 --> 00:36:46,840 Speaker 3: have end-to-end encrypted messaging by default overnight, basically, 708 00:36:47,360 --> 00:36:52,080 Speaker 3: was a major coup.
That was world changing, right, 709 00:36:52,320 --> 00:36:57,280 Speaker 3: and they really do deserve applause for that. Obviously, 710 00:36:57,520 --> 00:36:59,720 Speaker 3: you know, I think partly because of their scale, partly 711 00:36:59,719 --> 00:37:02,520 Speaker 3: because they're owned by Meta, right, they haven't taken all 712 00:37:02,560 --> 00:37:06,880 Speaker 3: of these same steps. Like, they do have more metadata 713 00:37:07,320 --> 00:37:11,239 Speaker 3: on their servers than Signal does. Right. But if that's 714 00:37:11,239 --> 00:37:13,080 Speaker 3: your option, that is a fine option. 715 00:37:13,880 --> 00:37:17,560 Speaker 2: Yeah, I think that's really good to know, particularly since 716 00:37:18,040 --> 00:37:21,960 Speaker 2: options are always more secure than not having any kind 717 00:37:21,960 --> 00:37:24,280 Speaker 2: of a backup plan, totally. 718 00:37:24,719 --> 00:37:28,000 Speaker 4: And if people are even slightly nervous about WhatsApp, 719 00:37:28,200 --> 00:37:31,480 Speaker 4: one of the great things: they do have disappearing messages. The downside 720 00:37:31,560 --> 00:37:35,000 Speaker 4: is the fastest disappearing message setting is only twenty-four hours. 721 00:37:35,000 --> 00:37:39,040 Speaker 4: But that's something that again you still have, and 722 00:37:39,280 --> 00:37:41,520 Speaker 4: that is an amazing feature. 723 00:37:42,280 --> 00:37:45,640 Speaker 2: Yeah, and that kind of gets into also what kind 724 00:37:45,680 --> 00:37:48,960 Speaker 2: of stuff you can do in order to maximize the 725 00:37:49,040 --> 00:37:52,200 Speaker 2: value of features like that.
Like, for example, if you're 726 00:37:52,480 --> 00:37:55,560 Speaker 2: coming back into the country, or a country, and your 727 00:37:55,600 --> 00:37:59,920 Speaker 2: phone gets confiscated by customs or whatever because security 728 00:38:00,160 --> 00:38:02,600 Speaker 2: services have some sort of eye on you for whatever reason. 729 00:38:03,000 --> 00:38:07,880 Speaker 2: If you've got, you know, thumbprint login or face 730 00:38:07,920 --> 00:38:09,920 Speaker 2: login, they're going to get into that phone, right, 731 00:38:10,320 --> 00:38:12,799 Speaker 2: and your twenty-four-hour delete thing may not have 732 00:38:12,840 --> 00:38:15,799 Speaker 2: taken care of everything. If you've got a 733 00:38:15,840 --> 00:38:20,640 Speaker 2: complicated eight-digit password and no biometrics enabled, maybe, depending 734 00:38:20,680 --> 00:38:23,040 Speaker 2: on where you are and whatnot, that'll keep your phone 735 00:38:23,080 --> 00:38:25,680 Speaker 2: locked long enough for those messages to get deleted, right. Like, 736 00:38:25,719 --> 00:38:28,680 Speaker 2: it's all about maximizing the chances that something 737 00:38:28,400 --> 00:38:32,680 Speaker 3: like that helps. Yeah, exactly. We definitely recommend that people 738 00:38:32,719 --> 00:38:35,920 Speaker 3: turn on disappearing messages. I think that's just a 739 00:38:35,960 --> 00:38:40,600 Speaker 3: good, sensible default to have. We also definitely recommend that if 740 00:38:40,640 --> 00:38:43,480 Speaker 3: you're going to be in a situation where you think 741 00:38:43,480 --> 00:38:45,400 Speaker 3: there's a higher likelihood 742 00:38:45,400 --> 00:38:47,960 Speaker 3: of you interacting with law enforcement, if you're crossing a border, 743 00:38:48,000 --> 00:38:50,759 Speaker 3: if you're going to a protest, turn off the biometric 744 00:38:50,960 --> 00:38:54,600 Speaker 3: unlock on your phone.
Certainly, especially in the US, 745 00:38:54,719 --> 00:38:57,319 Speaker 3: the case law isn't settled, but there are a lot of 746 00:38:57,360 --> 00:39:01,279 Speaker 3: state courts that have decided that police can force you 747 00:39:01,320 --> 00:39:03,560 Speaker 3: to unlock your phone with your biometrics and that that's 748 00:39:03,600 --> 00:39:06,759 Speaker 3: totally fine. So this, you know, in the 749 00:39:06,840 --> 00:39:09,120 Speaker 3: US context, is a good idea. In any context, I 750 00:39:09,120 --> 00:39:11,040 Speaker 3: think it's a good idea, if you're at heightened risk, 751 00:39:11,120 --> 00:39:12,680 Speaker 3: to turn it off, totally. 752 00:39:14,080 --> 00:39:15,560 Speaker 4: I mean, one thing we're also a big fan of 753 00:39:15,719 --> 00:39:18,000 Speaker 4: is figuring out, and this is again where 754 00:39:18,000 --> 00:39:20,759 Speaker 4: threat modeling is so key, is this a 755 00:39:20,760 --> 00:39:23,759 Speaker 4: circumstance where you need your phone? Another thing that 756 00:39:24,400 --> 00:39:26,040 Speaker 4: you can always do if you are nervous 757 00:39:26,080 --> 00:39:29,800 Speaker 4: about traveling across the border is delete Signal 758 00:39:29,840 --> 00:39:33,240 Speaker 4: and reinstall it, and everything is gone. You can delete 759 00:39:33,239 --> 00:39:36,560 Speaker 4: WhatsApp temporarily while you're crossing a border so it's not 760 00:39:37,200 --> 00:39:40,600 Speaker 4: on your phone. You know, there are things like that 761 00:39:40,640 --> 00:39:43,720 Speaker 4: you can do. If you feel comfortable wiping your phone, 762 00:39:43,760 --> 00:39:47,680 Speaker 4: that's something also you can do.
You know, these are 763 00:39:47,680 --> 00:39:50,560 Speaker 4: all, again, different things, 764 00:39:50,600 --> 00:39:52,240 Speaker 4: and I think this is one of the things our 765 00:39:52,360 --> 00:39:55,279 Speaker 4: report gets into a bit, I don't remember how much, 766 00:39:55,280 --> 00:39:57,080 Speaker 4: something that at least we've been thinking about. 767 00:39:57,080 --> 00:40:00,040 Speaker 4: Cooper and I run a little lab called Complication, and 768 00:40:01,400 --> 00:40:04,440 Speaker 4: one of the things we've been thinking about there is 769 00:40:04,520 --> 00:40:07,240 Speaker 4: just also how do we instill better, 770 00:40:07,440 --> 00:40:10,680 Speaker 4: more holistic practices, where we understand that a phone is 771 00:40:10,760 --> 00:40:15,320 Speaker 4: just one component of our safety, and so secure messaging, 772 00:40:15,360 --> 00:40:17,920 Speaker 4: encrypted messaging, is one component of that safety. So 773 00:40:18,000 --> 00:40:20,279 Speaker 4: what are other things we can do? 774 00:40:20,760 --> 00:40:21,840 Speaker 5: And some of that can 775 00:40:21,640 --> 00:40:24,680 Speaker 4: be, you know, wiping your phone if traveling, if that 776 00:40:24,719 --> 00:40:26,560 Speaker 4: makes sense for you, or if that's a thing that 777 00:40:26,640 --> 00:40:30,760 Speaker 4: makes you feel safer, or removing certain apps and then, 778 00:40:31,160 --> 00:40:33,560 Speaker 4: you know, reinstalling them later. 779 00:40:34,280 --> 00:40:37,080 Speaker 3: Yeah, and it really is holistic.
Right. 780 00:40:37,160 --> 00:40:39,640 Speaker 3: Like, a thing that people need to 781 00:40:39,680 --> 00:40:43,000 Speaker 3: keep in mind is that, you know, disappearing messages can't 782 00:40:43,040 --> 00:40:50,760 Speaker 3: stop an untrustworthy conversation partner, right? Like, if 783 00:40:50,880 --> 00:40:53,800 Speaker 3: my conversation partner is untrustworthy, they can take screenshots of 784 00:40:53,840 --> 00:40:57,160 Speaker 3: the messages, right? They can 785 00:40:57,200 --> 00:41:01,080 Speaker 3: go snitch to law enforcement about what I've told them. Right, 786 00:41:02,480 --> 00:41:06,200 Speaker 3: encrypted messaging, disappearing messages, these are not panaceas. Right, you 787 00:41:06,280 --> 00:41:09,640 Speaker 3: still have to keep up all of 788 00:41:09,680 --> 00:41:17,160 Speaker 3: the other aspects of your security as well, right? So don't 789 00:41:17,320 --> 00:41:19,680 Speaker 3: entirely rely on these technologies to save you, right? You 790 00:41:19,719 --> 00:41:21,680 Speaker 3: have to also trust the people you're working with and 791 00:41:22,040 --> 00:41:23,440 Speaker 3: build these layers of security. 792 00:41:23,480 --> 00:41:26,000 Speaker 4: Yeah, it's true. I mean, Cooper, you could leak all 793 00:41:26,000 --> 00:41:29,000 Speaker 4: of my secrets right now on this podcast and them. 794 00:41:28,920 --> 00:41:30,120 Speaker 5: That too, what a gentleman.
795 00:41:30,719 --> 00:41:33,600 Speaker 2: And that is the other thing, right, where 796 00:41:34,640 --> 00:41:37,879 Speaker 2: when it comes to what is secure, one thing 797 00:41:37,920 --> 00:41:40,520 Speaker 2: to remember is that with Signal, for all the good things 798 00:41:40,560 --> 00:41:43,120 Speaker 2: about it, nothing at all about that app stops 799 00:41:43,520 --> 00:41:45,719 Speaker 2: the recipient of a message from you from taking a 800 00:41:45,760 --> 00:41:49,520 Speaker 2: screen grab or just handing their phone over to their 801 00:41:49,520 --> 00:41:54,160 Speaker 2: friendly local federal agent, right? Which is, you know, 802 00:41:54,960 --> 00:41:56,560 Speaker 2: I'm not trying to be 803 00:41:56,640 --> 00:41:58,960 Speaker 2: a security nihilist here. I think, you know, there's no 804 00:41:59,120 --> 00:42:04,880 Speaker 2: replacing communication over phones in many situations. But if you are, 805 00:42:04,960 --> 00:42:11,040 Speaker 2: for example, going to be transferring a bunch of Plan 806 00:42:11,160 --> 00:42:15,279 Speaker 2: B pills in an area where that is prosecutable, that 807 00:42:15,320 --> 00:42:17,799 Speaker 2: probably shouldn't go on your phone in that language. Right? 808 00:42:17,880 --> 00:42:20,600 Speaker 2: Perhaps, you know, you could come up with a clever 809 00:42:20,719 --> 00:42:26,400 Speaker 2: codeword or whatever. But, you know, security 810 00:42:26,480 --> 00:42:28,640 Speaker 2: is, like you said, holistic. You should not 811 00:42:28,800 --> 00:42:31,560 Speaker 2: be looking at it as just, well, the app 812 00:42:31,640 --> 00:42:32,120 Speaker 2: is secure. 813 00:42:32,160 --> 00:42:32,879 Speaker 3: So that's enough.
814 00:42:33,760 --> 00:42:35,400 Speaker 4: I mean, one thing I also want people to sort of 815 00:42:35,400 --> 00:42:37,360 Speaker 4: think about too, because that's a really great point, Robert, 816 00:42:37,520 --> 00:42:39,799 Speaker 4: is, we do all different kinds of things every 817 00:42:39,840 --> 00:42:43,320 Speaker 4: day in our lives that could, you know, endanger us. 818 00:42:43,360 --> 00:42:45,600 Speaker 4: Like, a lot of the work I do 819 00:42:45,719 --> 00:42:47,360 Speaker 4: is with people facing all different 820 00:42:47,400 --> 00:42:50,520 Speaker 4: kinds of online harassment. So, falling in love, for example, 821 00:42:50,600 --> 00:42:52,640 Speaker 4: is a dangerous thing to do. You could have your 822 00:42:52,680 --> 00:42:56,799 Speaker 4: heart broken, or that person could hurt you. Learning how 823 00:42:56,800 --> 00:43:01,640 Speaker 4: to trust people, you know, crossing the street, deciding to 824 00:43:01,760 --> 00:43:04,440 Speaker 4: jaywalk, right, all different things we do sort of 825 00:43:04,360 --> 00:43:06,520 Speaker 5: every day actually can expose us to harm. 826 00:43:06,560 --> 00:43:08,400 Speaker 4: And so one thing I think for people listening to 827 00:43:08,440 --> 00:43:11,040 Speaker 4: keep in mind is that's the same when we have conversations.
828 00:43:11,120 --> 00:43:14,200 Speaker 4: And I think a way to avoid nihilism is just 829 00:43:14,239 --> 00:43:17,120 Speaker 4: to remember that every day we are sort of 830 00:43:17,120 --> 00:43:19,920 Speaker 4: going out there and actually being incredibly brave just by 831 00:43:19,960 --> 00:43:23,480 Speaker 4: living our everyday lives, by deciding to be in community 832 00:43:23,520 --> 00:43:28,200 Speaker 4: and have friendships and have relationships. And in my case, 833 00:43:28,200 --> 00:43:31,680 Speaker 4: I love jaywalking, and no one around me does, and 834 00:43:32,040 --> 00:43:35,920 Speaker 4: that's my choice. And I have not yet 835 00:43:36,239 --> 00:43:38,080 Speaker 4: gotten hit by a car jaywalking. 836 00:43:38,880 --> 00:43:42,080 Speaker 2: I think it's good to look at this the same way. 837 00:43:42,120 --> 00:43:46,719 Speaker 2: There's a concept that the military has sort of developed 838 00:43:46,719 --> 00:43:49,279 Speaker 2: when talking about how not to die when you're in 839 00:43:49,320 --> 00:43:52,359 Speaker 2: a gunfight or something. It's called the survivability onion, right? 840 00:43:52,640 --> 00:43:55,000 Speaker 2: And I think it's extremely useful both if you're talking 841 00:43:55,040 --> 00:43:56,759 Speaker 2: about, well, I'm going to a protest and there 842 00:43:56,800 --> 00:43:58,919 Speaker 2: will be violence there, you know, should I wear armor, 843 00:43:58,920 --> 00:44:01,880 Speaker 2: et cetera. But it's also really useful 844 00:44:01,880 --> 00:44:05,160 Speaker 2: with any kind of security. And the onion, you 845 00:44:05,200 --> 00:44:08,040 Speaker 2: envision it as an onion because the largest, outside 846 00:44:08,120 --> 00:44:10,920 Speaker 2: chunk of it is don't be seen, don't be acquired, 847 00:44:10,960 --> 00:44:13,120 Speaker 2: which means somebody actually getting you in their sights.
848 00:44:13,640 --> 00:44:17,960 Speaker 2: Don't be hit, which means being behind cover or something. 849 00:44:18,200 --> 00:44:20,640 Speaker 2: And then the very internal part of it is, 850 00:44:20,719 --> 00:44:22,800 Speaker 2: have some sort of armor in case you are shot. 851 00:44:23,160 --> 00:44:24,040 Speaker 3: But if the 852 00:44:24,080 --> 00:44:27,640 Speaker 2: armor is useful, the majority of the onion has already failed. 853 00:44:27,760 --> 00:44:27,920 Speaker 3: Right. 854 00:44:28,320 --> 00:44:32,480 Speaker 2: If encryption is useful, that is not a dissimilar sort 855 00:44:32,520 --> 00:44:35,040 Speaker 2: of situation. Right? So a degree of 856 00:44:35,840 --> 00:44:41,000 Speaker 2: canniness is super helpful in thinking about, what is 857 00:44:41,320 --> 00:44:44,799 Speaker 2: visible about me? If I'm doing something where I 858 00:44:44,880 --> 00:44:47,480 Speaker 2: know that I have to be extra concerned about the 859 00:44:47,520 --> 00:44:51,920 Speaker 2: state, seeing what is visible about me from the outside, 860 00:44:52,200 --> 00:44:53,399 Speaker 2: you know, totally. 861 00:44:53,360 --> 00:44:55,040 Speaker 4: I mean, I think that's an amazing thing to think about. Like, 862 00:44:55,480 --> 00:44:57,920 Speaker 4: where are you sending a text message? Are you 863 00:44:58,000 --> 00:45:00,879 Speaker 4: in a place in which someone can lean over? 864 00:45:00,960 --> 00:45:04,680 Speaker 4: Like, I'm the nosiest motherfucker, and all the time I'm 865 00:45:04,760 --> 00:45:07,400 Speaker 4: constantly looking around being like, what's that person 866 00:45:07,440 --> 00:45:10,240 Speaker 4: watching on an airplane?
Or if someone is sitting 867 00:45:10,239 --> 00:45:12,960 Speaker 4: next to me scrolling. So you wouldn't want to 868 00:45:12,960 --> 00:45:15,360 Speaker 4: send a sensitive text message next to me, 869 00:45:15,400 --> 00:45:18,520 Speaker 4: because I'd be like, that's interesting fodder. 870 00:45:18,600 --> 00:45:23,200 Speaker 5: That's the kind of thing to text to Cooper later, you know. 871 00:45:23,239 --> 00:45:24,919 Speaker 4: And so I think it's important to think about that. 872 00:45:25,200 --> 00:45:28,200 Speaker 4: Like, who's around you? How are 873 00:45:28,280 --> 00:45:31,400 Speaker 4: you describing something? Do you know the person you're messaging? 874 00:45:31,440 --> 00:45:33,560 Speaker 4: If you're in a group message, do you know everybody there? 875 00:45:33,600 --> 00:45:35,759 Speaker 4: Do you trust all of them? 876 00:45:36,760 --> 00:45:37,000 Speaker 5: You know? 877 00:45:37,040 --> 00:45:39,200 Speaker 4: And if you're ever nervous, this is, I 878 00:45:39,200 --> 00:45:42,279 Speaker 4: guess, the upside also to in-person conversations. You can have, 879 00:45:43,200 --> 00:45:45,440 Speaker 4: you know, a phone call or an in-person conversation 880 00:45:45,560 --> 00:45:48,640 Speaker 4: with someone. Right? If you're really not sure or you 881 00:45:48,680 --> 00:45:51,839 Speaker 4: don't feel comfortable even sending something over Signal, that might 882 00:45:51,840 --> 00:45:53,000 Speaker 4: be the time to be like, hey, do you want 883 00:45:53,000 --> 00:45:55,000 Speaker 4: to meet up and get a coffee, and then, you know, 884 00:45:56,719 --> 00:46:01,160 Speaker 4: try to find a discreet place to have a conversation. 885 00:46:02,000 --> 00:46:07,680 Speaker 2: Yeah, I do want to roll to ads real quick.
886 00:46:07,719 --> 00:46:09,799 Speaker 2: One second, and I think Cooper had something to say, 887 00:46:09,840 --> 00:46:25,880 Speaker 2: and we'll continue, but first, products. Ah, we're back. Cooper, 888 00:46:26,200 --> 00:46:28,440 Speaker 2: you look like you had something to add. 889 00:46:28,200 --> 00:46:31,080 Speaker 3: On that, nothing particularly serious, just that I think 890 00:46:31,080 --> 00:46:32,920 Speaker 3: that's really good advice for the 891 00:46:32,920 --> 00:46:36,080 Speaker 3: military and absolutely justifies the nine hundred billion dollars. 892 00:46:39,040 --> 00:46:42,799 Speaker 2: Yeah, I'm glad they put together a fucking graphic. I 893 00:46:42,840 --> 00:46:45,879 Speaker 2: wonder how many billions of dollars that cost. 894 00:46:45,920 --> 00:46:49,360 Speaker 4: I could make a graphic for hundreds of millions of dollars. 895 00:46:49,600 --> 00:46:52,160 Speaker 3: Yeah, if anybody wants to fund us for 896 00:46:52,239 --> 00:46:54,319 Speaker 3: hundreds of millions, we will do it for less, you know, 897 00:46:54,640 --> 00:46:56,239 Speaker 3: a year, hundreds of millions. 898 00:46:56,320 --> 00:46:59,000 Speaker 4: We have so many good T-shirt ideas and sticker 899 00:46:59,080 --> 00:47:03,239 Speaker 4: ideas, y'all, like so many good ones, so many unhinged 900 00:47:03,239 --> 00:47:04,880 Speaker 4: ones that the world needs to see. 901 00:47:05,280 --> 00:47:08,759 Speaker 2: Yeah. I mean, I guess just because of 902 00:47:08,800 --> 00:47:10,920 Speaker 2: the amount of time I've spent thinking about this stuff 903 00:47:11,200 --> 00:47:14,080 Speaker 2: from my old job.
There are a couple of concepts 904 00:47:14,200 --> 00:47:18,920 Speaker 2: from military planning I think about in this context, and 905 00:47:18,960 --> 00:47:21,520 Speaker 2: one of them that I also think is relevant to 906 00:47:21,520 --> 00:47:23,800 Speaker 2: what we're talking about with friction is the concept of 907 00:47:23,840 --> 00:47:26,480 Speaker 2: an OODA loop, right, which is, how do you win 908 00:47:27,360 --> 00:47:30,480 Speaker 2: in combat against an opponent? And it's by disrupting this 909 00:47:30,520 --> 00:47:32,200 Speaker 2: thing called the OODA loop. And the OODA loop is 910 00:47:32,239 --> 00:47:37,719 Speaker 2: how an adversary carries out actions in a conflict like this, right, 911 00:47:38,120 --> 00:47:42,360 Speaker 2: and the steps you have to go through are observe, orient, decide, 912 00:47:42,400 --> 00:47:44,799 Speaker 2: and act. And if you can disrupt any stage of that, 913 00:47:45,120 --> 00:47:48,280 Speaker 2: you can stop them from taking actions, right, which stops 914 00:47:48,320 --> 00:47:52,480 Speaker 2: them from being able to harm you.
And good 915 00:47:52,600 --> 00:47:56,960 Speaker 2: security is going to impact all of those things, right? 916 00:47:57,000 --> 00:47:58,719 Speaker 2: It's going to stop them from being able to see 917 00:47:58,760 --> 00:48:02,200 Speaker 2: you sometimes. If they can see you, stuff like, you know, 918 00:48:02,560 --> 00:48:04,840 Speaker 2: we were just talking earlier about 919 00:48:05,600 --> 00:48:07,760 Speaker 2: link previews, right, and how that can kind of expose 920 00:48:07,880 --> 00:48:11,319 Speaker 2: maybe who you're in communication with, potentially; well, that could 921 00:48:11,320 --> 00:48:13,719 Speaker 2: allow the state to orient themselves to you and to 922 00:48:13,840 --> 00:48:17,879 Speaker 2: your friends, right? And obviously stuff like locking down your 923 00:48:17,880 --> 00:48:21,800 Speaker 2: devices and not having unnecessary info online can stop them from 924 00:48:22,080 --> 00:48:24,320 Speaker 2: being able to decide, you know, what you're doing and 925 00:48:24,640 --> 00:48:27,480 Speaker 2: how they should respond to that. And I think that's 926 00:48:27,520 --> 00:48:30,480 Speaker 2: also good if you're not just somebody 927 00:48:30,480 --> 00:48:32,839 Speaker 2: who is concerned about your security like most people are, 928 00:48:32,880 --> 00:48:35,720 Speaker 2: because it's good to have some security. If you're actually 929 00:48:36,400 --> 00:48:40,120 Speaker 2: dealing with the state or a corporation as an adversary 930 00:48:40,200 --> 00:48:43,400 Speaker 2: in some way, it can be useful to think about 931 00:48:43,520 --> 00:48:45,440 Speaker 2: your security culture in those terms. 932 00:48:46,560 --> 00:48:50,279 Speaker 3: Yeah. Absolutely, I think that's absolutely right.
And 933 00:48:50,320 --> 00:48:52,040 Speaker 3: I think it points to, you know, 934 00:48:52,160 --> 00:48:56,719 Speaker 3: the fact that we should understand what the 935 00:48:57,080 --> 00:49:00,480 Speaker 3: mode of thinking of our adversaries is, right? Like, 936 00:49:00,640 --> 00:49:04,359 Speaker 3: you know, if your adversary is the NSA, right, 937 00:49:04,360 --> 00:49:08,160 Speaker 3: which is probably actually not the case for most people in the US, 938 00:49:08,200 --> 00:49:10,960 Speaker 3: like for most US activists, the NSA is not actually 939 00:49:11,040 --> 00:49:14,279 Speaker 3: your biggest adversary, right? Your biggest adversary is going 940 00:49:14,360 --> 00:49:17,000 Speaker 3: to be local police, right? Your biggest adversary is going 941 00:49:17,080 --> 00:49:20,080 Speaker 3: to be, you know, somebody like 942 00:49:20,120 --> 00:49:22,480 Speaker 3: your abusive partner, right? And this 943 00:49:22,520 --> 00:49:27,440 Speaker 3: is why threat modeling is important, because you need 944 00:49:27,440 --> 00:49:29,919 Speaker 3: to really think through, 945 00:49:30,040 --> 00:49:32,279 Speaker 3: like, you know, well, okay, wait, am I actually worried 946 00:49:32,280 --> 00:49:34,279 Speaker 3: about protecting myself from the NSA? Or am I more 947 00:49:34,320 --> 00:49:38,120 Speaker 3: worried about, you know, the racist police 948 00:49:38,120 --> 00:49:41,000 Speaker 3: officer that drives down my street every day? Right? And yeah, 949 00:49:41,040 --> 00:49:43,440 Speaker 3: probably it's the latter. And so you can 950 00:49:43,480 --> 00:49:47,279 Speaker 3: take a lot more useful actions, right? And 951 00:49:47,360 --> 00:49:51,239 Speaker 3: you know, you can break that 952 00:49:51,360 --> 00:49:55,600 Speaker 3: OODA loop for him once you actually know what it is.
Right? Yeah, 953 00:49:55,680 --> 00:49:58,000 Speaker 3: if you're defending yourself against the NSA, you're gonna leave 954 00:49:58,000 --> 00:50:01,879 Speaker 3: yourself wide open to the actual threat. Yeah. 955 00:50:01,880 --> 00:50:05,200 Speaker 2: Totally. I think a great example, and I don't 956 00:50:05,239 --> 00:50:07,680 Speaker 2: mean to be, like, quote unquote subtweeting somebody here, 957 00:50:07,719 --> 00:50:10,640 Speaker 2: but I've known a couple of folks like this. It's like, 958 00:50:10,640 --> 00:50:12,920 Speaker 2: if you're super paranoid, you're not putting 959 00:50:12,920 --> 00:50:16,000 Speaker 2: anything online, you're only talking with your close friends, you 960 00:50:16,120 --> 00:50:18,799 Speaker 2: use a dumb phone, you have burners, but you also 961 00:50:18,880 --> 00:50:20,560 Speaker 2: drive around with a shitload of weed in your car 962 00:50:20,600 --> 00:50:22,839 Speaker 2: in a state where that's illegal. It's like, well, 963 00:50:23,080 --> 00:50:27,239 Speaker 2: your threat modeling is not great in that situation, right? 964 00:50:27,360 --> 00:50:29,080 Speaker 2: Or, I do all that, but I carry an 965 00:50:29,120 --> 00:50:32,319 Speaker 2: illegal handgun with me wherever I go. It's like, well, 966 00:50:32,400 --> 00:50:34,880 Speaker 2: that may be more of a threat than your phone. 967 00:50:35,120 --> 00:50:36,840 Speaker 4: My partner the other day was like, what if I 968 00:50:36,840 --> 00:50:38,120 Speaker 4: got a dumb phone? I was like, what if I 969 00:50:38,160 --> 00:50:41,759 Speaker 4: divorced you? Like, what if? 970 00:50:43,239 --> 00:50:44,120 Speaker 5: They were like, what do you mean?
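To make the threat-modeling idea from this exchange concrete, here is a minimal sketch in Python of ranking adversaries by rough likelihood and impact. The adversaries, scores, and mitigations below are invented purely for illustration; they are hypothetical examples, not figures or recommendations from the paper.

```python
# Illustrative threat-modeling checklist: score each adversary by a rough
# likelihood and impact estimate, then rank. All entries are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Threat:
    adversary: str                  # who might act against you
    capability: str                 # what they can realistically do
    likelihood: int                 # rough 1-5 estimate of how likely
    impact: int                     # rough 1-5 estimate of how bad
    mitigations: list = field(default_factory=list)

    def priority(self) -> int:
        # Simple risk score: likelihood times impact.
        return self.likelihood * self.impact

def rank_threats(threats):
    """Return threats ordered from highest to lowest risk score."""
    return sorted(threats, key=lambda t: t.priority(), reverse=True)

threats = [
    Threat("NSA", "mass metadata collection", likelihood=1, impact=5,
           mitigations=["end-to-end encryption"]),
    Threat("local police", "phone seizure at a protest", likelihood=4, impact=4,
           mitigations=["disappearing messages", "disable biometric unlock"]),
    Threat("abusive partner", "reads unlocked phone", likelihood=3, impact=4,
           mitigations=["strong passcode", "app-level locks"]),
]

for t in rank_threats(threats):
    print(f"{t.adversary}: score {t.priority()}, mitigate with {t.mitigations}")
```

The scoring makes the same point Cooper makes here: the NSA lands at the bottom of the list despite its high impact, because for most people the likelihood is low, while the everyday adversaries rank highest.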
971 00:50:44,160 --> 00:50:45,640 Speaker 4: And I was like, well, I'm going to be the 972 00:50:45,640 --> 00:50:48,840 Speaker 4: one using all the maps for both of us, yeah, 973 00:50:48,880 --> 00:50:52,120 Speaker 4: and having to google all the dumb shit you want 974 00:50:52,120 --> 00:50:55,000 Speaker 4: to Google. That doesn't make sense, I'm now your weakest link, 975 00:50:55,040 --> 00:50:57,399 Speaker 4: like, go fuck yourself. But also I was like, I'm 976 00:50:57,440 --> 00:51:02,560 Speaker 4: absolutely not going to be your Google Maps bitch, 977 00:51:02,640 --> 00:51:06,239 Speaker 4: like, I'm not doing that. But I mean, 978 00:51:06,280 --> 00:51:08,560 Speaker 4: I think also, you know, to both of y'all's points, 979 00:51:08,560 --> 00:51:10,560 Speaker 4: to get serious again for a second. I mean, you know, 980 00:51:10,600 --> 00:51:15,040 Speaker 4: my threat model, for example, might be similar or 981 00:51:15,040 --> 00:51:18,680 Speaker 4: slightly different, maybe slightly less serious, than Cooper's. But, 982 00:51:18,840 --> 00:51:21,520 Speaker 4: you know, some of the 983 00:51:21,600 --> 00:51:23,560 Speaker 4: journalists in India we were working with have quite a 984 00:51:23,640 --> 00:51:26,840 Speaker 4: high threat model, right? Like, yeah, the Indian police force 985 00:51:27,200 --> 00:51:30,920 Speaker 4: are very much like the NSA.
They're very talented, they 986 00:51:30,960 --> 00:51:33,200 Speaker 4: have a lot of money and tech at their disposal, 987 00:51:33,560 --> 00:51:36,840 Speaker 4: and that might be different for some of the activists 988 00:51:36,880 --> 00:51:39,480 Speaker 4: we're working with, let's say in Louisiana or Texas, right? 989 00:51:40,719 --> 00:51:44,160 Speaker 4: But the difference is, we're still talking about, I 990 00:51:44,160 --> 00:51:48,920 Speaker 4: would argue, two brutal police forces that just have different 991 00:51:49,000 --> 00:51:51,839 Speaker 4: means at their disposal. So the Louisiana 992 00:51:52,080 --> 00:51:55,880 Speaker 4: police are a group you should totally be worried about. 993 00:51:56,120 --> 00:51:58,880 Speaker 4: They might not be able to hack your phone, but 994 00:51:59,400 --> 00:52:00,600 Speaker 4: maybe eventually they could. 995 00:52:00,960 --> 00:52:03,000 Speaker 5: But there are obviously other things to 996 00:52:03,040 --> 00:52:03,719 Speaker 5: worry about with them. 997 00:52:03,760 --> 00:52:07,000 Speaker 4: But you know, in the context of some of 998 00:52:07,040 --> 00:52:09,800 Speaker 4: the folks we're working with in the South, like 999 00:52:09,840 --> 00:52:13,160 Speaker 4: reproductive justice activists, some of the threats are probably much 1000 00:52:13,200 --> 00:52:17,160 Speaker 4: more serious. In terms of your threat model, it would be 1001 00:52:17,200 --> 00:52:20,680 Speaker 4: like a nurse, for someone who, let's say, is miscarrying 1002 00:52:20,960 --> 00:52:23,359 Speaker 4: or has sought an abortion.
And this is something Kate 1003 00:52:23,360 --> 00:52:26,719 Speaker 4: Bertash from the Digital Defense Fund, a friend of 1004 00:52:26,880 --> 00:52:29,720 Speaker 4: ours, has talked about, where like the people 1005 00:52:29,800 --> 00:52:31,799 Speaker 4: that are supposed to take care of you might be 1006 00:52:31,840 --> 00:52:35,359 Speaker 4: the ones that are actually your biggest threat, right, 1007 00:52:35,640 --> 00:52:37,960 Speaker 4: the ones that have heard you say something or you've 1008 00:52:37,960 --> 00:52:41,200 Speaker 4: confided in, for example, and that is kind 1009 00:52:41,239 --> 00:52:42,640 Speaker 4: of a horrifying thing to think about. 1010 00:52:42,680 --> 00:52:44,399 Speaker 5: But that is, that is a thing you 1011 00:52:44,400 --> 00:52:47,479 Speaker 4: have to threat model, right, is, can I trust 1012 00:52:47,520 --> 00:52:49,640 Speaker 4: this person? How am I describing, 1013 00:52:49,760 --> 00:52:50,760 Speaker 5: you know, what's happening? 1014 00:52:51,400 --> 00:52:51,640 Speaker 3: Yeah. 1015 00:52:52,200 --> 00:52:56,239 Speaker 2: Yeah, absolutely. Well, did y'all have anything else you wanted 1016 00:52:56,280 --> 00:52:59,080 Speaker 2: to make sure to get into in this conversation? There's 1017 00:52:59,080 --> 00:53:01,360 Speaker 2: so much more in the great paper you 1018 00:53:01,719 --> 00:53:05,200 Speaker 2: helped coauthor, What Is Secure? An Analysis of Popular 1019 00:53:05,239 --> 00:53:08,600 Speaker 2: Messaging Apps, in Tech Policy Press. But yeah, is 1020 00:53:08,600 --> 00:53:10,560 Speaker 2: there anything else y'all wanted to really make sure you 1021 00:53:10,640 --> 00:53:11,640 Speaker 2: hit before we roll out? 1022 00:53:12,160 --> 00:53:14,400 Speaker 4: Yeah.
Please don't use Telegram, for a variety of reasons, 1023 00:53:14,440 --> 00:53:17,560 Speaker 4: but also like it's very unclear how they respond to 1024 00:53:17,600 --> 00:53:20,200 Speaker 4: any law enforcement or government. They don't say anything, and 1025 00:53:20,200 --> 00:53:22,520 Speaker 4: it's kind of impossible to reach anyone that works there. 1026 00:53:23,160 --> 00:53:27,000 Speaker 4: Please don't use Facebook Messenger other than maybe sending memes. 1027 00:53:28,520 --> 00:53:31,160 Speaker 4: There's a lot of really gross surveillance capitalism inside of 1028 00:53:31,200 --> 00:53:35,160 Speaker 4: Facebook Messenger that the paper gets into. But effectively, Meta 1029 00:53:35,239 --> 00:53:38,720 Speaker 4: is building this weird, sprawling infrastructure inside of Facebook Messenger, 1030 00:53:38,719 --> 00:53:40,640 Speaker 4: trying to link Facebook and Instagram. 1031 00:53:41,080 --> 00:53:43,239 Speaker 5: And one of the things we noticed is 1032 00:53:43,200 --> 00:53:45,919 Speaker 4: that, like, if you've blocked someone on Instagram or muted 1033 00:53:45,920 --> 00:53:48,799 Speaker 4: them, but you haven't blocked or muted them on Facebook, 1034 00:53:49,160 --> 00:53:51,680 Speaker 4: their stories, like all those stories, are still coming 1035 00:53:51,719 --> 00:53:55,560 Speaker 4: across in Messenger, so you can still see content from 1036 00:53:55,600 --> 00:53:59,480 Speaker 4: someone because it's linking both of those profiles. 1037 00:54:00,239 --> 00:54:02,719 Speaker 4: So you know, you can see, taking like an 1038 00:54:02,760 --> 00:54:05,720 Speaker 4: online harassment lens, why that's really bad, 1039 00:54:06,320 --> 00:54:11,440 Speaker 4: that's really harmful, and could be potentially, you know, upsetting 1040 00:54:11,480 --> 00:54:13,000 Speaker 4: and triggering for folks. 1041 00:54:14,640 --> 00:54:18,040 Speaker 3: Yeah, I'll add that.
I think the major thing 1042 00:54:18,080 --> 00:54:21,320 Speaker 3: I want people to think about is that encryption 1043 00:54:21,440 --> 00:54:24,239 Speaker 3: really does work, and it works really well. And we 1044 00:54:24,280 --> 00:54:26,759 Speaker 3: can see that because a lot of countries right now 1045 00:54:27,120 --> 00:54:30,160 Speaker 3: are trying to pass laws that either weaken or ban 1046 00:54:30,320 --> 00:54:34,160 Speaker 3: encryption, right. And in fact, the UK 1047 00:54:34,480 --> 00:54:37,399 Speaker 3: did just pass such a law, the Online Safety Bill 1048 00:54:37,400 --> 00:54:40,680 Speaker 3: in the UK. And so it's really important that 1049 00:54:41,480 --> 00:54:43,799 Speaker 3: we, you know, push back against these laws and 1050 00:54:43,840 --> 00:54:48,160 Speaker 3: fight back against these laws however we can, right. 1051 00:54:48,200 --> 00:54:49,720 Speaker 3: And I'm not coming at 1052 00:54:49,560 --> 00:54:53,799 Speaker 9: this as somebody who's a big believer in, you know, 1053 00:54:53,880 --> 00:54:57,040 Speaker 9: in incrementalism and in working with governments, but I 1054 00:54:57,120 --> 00:54:59,640 Speaker 9: still think that it's really important to, 1055 00:55:00,480 --> 00:55:00,839 Speaker 9: you know, 1056 00:55:01,719 --> 00:55:04,040 Speaker 3: educate folks and push back against these laws and try 1057 00:55:04,040 --> 00:55:08,040 Speaker 3: to not let these pass, because these will be really 1058 00:55:08,080 --> 00:55:10,200 Speaker 3: bad for all of us, totally. 1059 00:55:10,400 --> 00:55:12,680 Speaker 4: And not to defend the Online Safety Bill, because I 1060 00:55:12,680 --> 00:55:14,719 Speaker 4: would never do that.
I'll go to my grave not 1061 00:55:15,280 --> 00:55:19,000 Speaker 4: speaking highly of it, only speaking critically. But at least, like, 1062 00:55:19,080 --> 00:55:24,280 Speaker 4: the pushback from encryption experts and encryption supporters like Meredith Whittaker, 1063 00:55:24,719 --> 00:55:28,919 Speaker 4: president of Signal, did lead to lawmakers in the UK, 1064 00:55:29,080 --> 00:55:33,200 Speaker 4: for example, admitting that there's no sort of feasible, safe 1065 00:55:33,239 --> 00:55:35,359 Speaker 4: way to build a backdoor, right. And that is, 1066 00:55:35,440 --> 00:55:39,719 Speaker 4: I think, also a win, because of so much pushback, 1067 00:55:39,800 --> 00:55:42,839 Speaker 4: because of so much research, because of so much criticism 1068 00:55:42,920 --> 00:55:46,160 Speaker 4: from security and privacy folks, people that are pro 1069 00:55:46,280 --> 00:55:49,920 Speaker 4: encryption, that, you know, we were able to 1070 00:55:50,000 --> 00:55:53,279 Speaker 4: walk back that part. And I do think that's a 1071 00:55:53,320 --> 00:55:58,439 Speaker 4: big deal, even if there are other issues with that bill, 1072 00:55:58,440 --> 00:56:01,480 Speaker 4: because I think it also sends a signal, pun intended, 1073 00:56:01,800 --> 00:56:07,040 Speaker 4: to other governments as well, and I think that that's 1074 00:56:07,040 --> 00:56:09,319 Speaker 4: incredibly important. But yeah, I would also say just 1075 00:56:09,400 --> 00:56:10,479 Speaker 4: use Signal whenever you can. 1076 00:56:13,000 --> 00:56:17,399 Speaker 2: But yeah, yeah. Well, all right, folks, that is going 1077 00:56:17,480 --> 00:56:22,120 Speaker 2: to be it for us here at It Could Happen Here. Yeah, 1078 00:56:22,160 --> 00:56:25,000 Speaker 2: thank you all for listening, and thank you Cooper and 1079 00:56:25,200 --> 00:56:26,680 Speaker 2: Caroline for coming on. 1080 00:56:27,880 --> 00:56:29,440 Speaker 3: Thank you for having us, yeah.
1081 00:56:29,280 --> 00:56:30,200 Speaker 5: And thank you for having us. 1082 00:56:30,400 --> 00:56:34,239 Speaker 4: You can find us on social media for now, I 1083 00:56:34,239 --> 00:56:35,759 Speaker 4: guess, until it all 1084 00:56:35,600 --> 00:56:36,320 Speaker 5: lights on fire. 1085 00:56:36,520 --> 00:56:38,360 Speaker 2: Yeah, whichever one you want to trust. 1086 00:56:40,080 --> 00:56:44,400 Speaker 3: I'm CooperQ on most social medias, Blue Sky, Mastodon, 1087 00:56:44,440 --> 00:56:45,480 Speaker 3: and Shitter. 1088 00:56:46,520 --> 00:56:49,799 Speaker 4: Yeah, I'm Caroline Sinders, my first name, last name. Our 1089 00:56:49,880 --> 00:56:54,800 Speaker 4: lab is Convocation Research and Design, Record Labs on Twitter 1090 00:56:54,840 --> 00:56:55,360 Speaker 4: at the moment. 1091 00:56:55,680 --> 00:56:57,879 Speaker 5: Hopefully we'll be getting on Blue Sky very soon. 1092 00:56:58,920 --> 00:57:01,560 Speaker 2: Yeah. Yeah, probably get back on there more 1093 00:57:01,600 --> 00:57:01,799 Speaker 3: now. 1094 00:57:01,880 --> 00:57:06,879 Speaker 2: Twitter has gotten remarkably worse, which, you know, 1095 00:57:06,880 --> 00:57:11,440 Speaker 2: back in the day on the old Something Awful forums, 1096 00:57:11,480 --> 00:57:14,320 Speaker 2: there was a thread in one of the debate forums 1097 00:57:14,360 --> 00:57:17,919 Speaker 2: about this very right wing site called Free Republic, which 1098 00:57:17,960 --> 00:57:20,440 Speaker 2: is like one of the earliest reservoirs of what became 1099 00:57:20,960 --> 00:57:24,320 Speaker 2: Trumpism, and the tagline for the thread, just kind 1100 00:57:24,360 --> 00:57:26,920 Speaker 2: of watching these people, was "there is always more and 1101 00:57:26,960 --> 00:57:30,440 Speaker 2: it is always worse." And boy, goddamn, if that hasn't 1102 00:57:30,480 --> 00:57:34,720 Speaker 2: been a continually accurate statement about the whole of social media.
1103 00:57:34,800 --> 00:57:37,640 Speaker 4: Right now, isn't it amazing to watch someone just 1104 00:57:37,720 --> 00:57:39,600 Speaker 4: light forty billion dollars on fire? 1105 00:57:39,800 --> 00:57:43,680 Speaker 2: Yeah, just, like, yeah, totally. It's 1106 00:57:43,720 --> 00:57:47,160 Speaker 2: like the nihilist in me being like, wow, Comrade Musk 1107 00:57:47,200 --> 00:57:50,040 Speaker 2: really, really taking some hits at capitalism here. 1108 00:57:54,600 --> 00:57:56,960 Speaker 1: It Could Happen Here is a production of Cool Zone Media. 1109 00:57:57,200 --> 00:57:59,880 Speaker 1: For more podcasts from Cool Zone Media, visit our website 1110 00:58:00,000 --> 00:58:02,120 Speaker 1: coolzonemedia dot com or check us out on the 1111 00:58:02,160 --> 00:58:05,680 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. 1112 00:58:06,120 --> 00:58:08,280 Speaker 1: You can find sources for It Could Happen Here, updated 1113 00:58:08,320 --> 00:58:12,360 Speaker 1: monthly at coolzonemedia dot com slash sources. Thanks for listening.