Get in touch with technology with TechStuff from HowStuffWorks.com.

Hey there, and welcome to TechStuff. I am your host, Jonathan Strickland. I'm an executive producer here at HowStuffWorks, and I love talking about technology. If you listened to the two previous episodes of TechStuff, you heard about the evolution, the birth really, of the Electronic Frontier Foundation as a result of a really bizarre and ultimately overzealous sting operation that the Secret Service ran back in the early nineties. So I want to talk more about the Electronic Frontier Foundation and exactly what they do, since I really just talked about them coming into being in the last episode. You know all about how Steve Jackson, who was a target of the Secret Service in part of their sting operation, felt that his and his employees' civil liberties had been violated and was looking for someone to help him pursue a claim against the United States government, but found that none of the civil liberties organizations in existence were really prepared to help him out, because they just weren't savvy as far as the Internet goes. And so the Electronic Frontier Foundation was born out of necessity by a group of people who felt that there definitely needed to be an organization dedicated to defending civil liberties in the Internet space, especially since they foresaw the Internet playing an increasingly important role in our lives. And that has absolutely been the case. The Internet has become instrumental to the point where some people argue that it is a fundamental necessity, kind of like food and water. You pretty much need the Internet to make a go of it today; certainly at the national level you can make a case for that. Well, we're going to talk more about what the Electronic Frontier Foundation is and what they do in this episode.
The EFF focuses on several broad areas of civil liberties, and here's how the organization itself defines them; they actually have a page where they talk about all of these different areas. We'll start with free speech. Freedom of speech typically means that the people of any given population have the right to express their opinions publicly without governmental interference, and a lot of countries around the world have this as a fundamental freedom codified in some way. Here in the United States, we have the Constitution, and freedom of speech is in the First Amendment, in the Bill of Rights, those first ten amendments to the Constitution. It is therefore a foundational right of a citizen in the United States. There are some considerations we have to make, however. Freedom of speech does not give you absolute freedom to say anything without governmental interference, as it turns out. First, the right typically protects against governmental abuse only. That's something a lot of people don't seem to understand about this concept of censorship and denying someone their First Amendment rights: the Constitution protects citizens against the government doing this. It's not meant to be an overall protection against any entity doing this under any circumstances. So, in other words, say I create a blog, and I include a comments section. But I decide that I'm going to monitor those comments, and anything left in those comments that I don't like, I decide I'm just going to go ahead and delete. That's perfectly fine. It's not the government that's restricting anyone's free speech. But if the government were forcing me to erase comments, for whatever reason, that would be a violation of free speech from a legal standpoint. So I can create a space, whether it's online or maybe I'm hosting a party, and if someone says something I don't like, I can tell them that they can't say that in my space.
Because I'm not the government, I'm not violating their First Amendment rights. This is more about them being in my space, not a fundamental right as a citizen of the country. Another consideration is that freedom of speech does have limitations. I cannot make false claims against others in an effort to harm them in some way. That would be slander or libel, for example; both of those refer to making untrue claims in an effort to cause harm to another individual. That harm might be physical, it might be financial, it might be harm to their reputation. That is not a protected form of speech. I also have no legal protection if I use my speech to incite others to violence. But this gets to be a tricky area. You might find court cases where there's an argument about whether or not someone was calling people to violence, and sometimes there are specific criteria that must be met, such as: was there a sense that the call to violence was for an immediate response, or for some predetermined time? Because if that's the case, it's absolutely not protected by freedom of speech. If it's more of a general "let's resist," then you could argue that should be protected under freedom of speech. So it's not so cut and dried, it's not black and white. A lot of these subtle variations have to be hashed out in court cases, which can go either way. A court is not automatically going to decide in favor of one side versus the other, and sometimes the decisions can actually be surprising. But speech is protected within certain limits. Now, communication online gets to be pretty complicated, because you've got Internet service providers, you've got financial services, you've got lots of different working parts to the Internet, and a government can lean on any component within that infrastructure. A government agent might come to an Internet service provider and say, hey, we're looking for any signs of criticism against this specific person.
We want to suppress that, and we're going to do it through your service. Well, that would be a form of censorship, and it would be a violation of First Amendment rights. Even though the actual silencing would seem to come from the Internet service provider, which you might be able to argue has that capacity, it would ultimately be due to government interference. Or consider a financial company saying, hey, don't allow any transactions to happen with this particular entity, because some of the stuff they do is illegal. Unless you can prove that all of those transactions are illegal, that gets super tricky, because if there are legal transactions happening, you're preventing those legal transactions from actually occurring. So it's a very complicated mess, and that's part of the reason why the Electronic Frontier Foundation gets involved in these sorts of discussions: it's to try and make sure that the rights of citizens remain protected. The EFF is also dedicated to protecting creativity and innovation, and that's their wording. That's a big playing field. Some of the ideas the EFF has are pretty simple. For example, the EFF views copyright law as being somewhat problematic, particularly when the entities that own intellectual property overstep their bounds. Copyright definitely has a place. It exists to protect a creator, or in some cases an owner of intellectual property who did not themselves create the content but owns it, from those who would reproduce that intellectual property for their own profit. But it can also lead to draconian practices and even abuse. Another example of protecting creativity and innovation comes in the form of fighting patent trolls.
So, if you don't know what a patent troll is, it's an entity that purchases patents not for the purposes of creating or implementing a particular technology, but rather to withhold that design from other entities that could actually put it to use. So it's kind of like this: let's say you've got an inventor, and the inventor has come up with a brilliant way of creating a very efficient solar panel. You pay the inventor for the rights to her patent, she hands the patent over to you, and you've got the patent in your hands, and you're like, ha ha, it's mine now. And instead of going forward and manufacturing these solar panels and trying to make a business that way, you just sit on the patent. You look around, you pay attention, and you try to see if anyone out there is making solar panels using the method described within the patent that you own. Now, patents are public documents. Anyone can read them, but you have to get the permission of the patent holder in order to use the technology, and typically this is done through licensing. Some patent trolls will license the technology or processes that are in their patents; they might charge quite a bit of money, but they'll license it. Others won't. Instead, they look for instances of someone using that technology or process and then threaten them with lawsuits, typically very hefty ones, usually in an attempt to get the other entity to settle out of court. And that's how the patent troll makes money. It's not making money by creating technology. It's not even making money by licensing technology. It's making money by waiting until someone else has tried to use that idea and then pouncing on them, which legally they are allowed to do, because they own the patent.
But ethically, you might argue, all they're really doing is withholding something that could be of value to people, and they're doing it just so that they can jump on any company that has the desire to actually make that technology. So it's a sticky situation, because on the one hand, you don't want to get away from the concept of protecting intellectual property. You want people to have a reasonable expectation of protection so that they can profit from their ideas and the work that they do. But on the other hand, you don't want entities hoarding knowledge and not letting it out unless companies are paying exorbitant fees, because that stifles innovation. So the EFF is very much against the whole patent troll approach. One huge area of interest for the EFF is privacy. This is one of those topics that some people don't really consider until the worst has already happened to them. We've been conditioned as citizens of the Internet to surrender more and more of our privacy in our online interactions, sometimes without knowingly doing so. Occasionally it's because we're sharing stuff about ourselves with our friends. Interactions on Facebook are a great example. We willingly share information about ourselves; it's part of our socialization. But that information isn't just going to our friends, it's going to Facebook, which is making very good use of that information. They're profiting off of it. That is their business; it's how they make money. And if we didn't share all this information, then Facebook would not have a huge revenue source. It certainly wouldn't be as successful as it is today. Other times we're sharing information because it's a matter of convenience. For example, Amazon has algorithms that will present you with suggested items you might be interested in, based upon your past browsing and shopping sessions.
But again, that means it's collecting information about you and about your habits in order to make those kinds of suggestions in the first place. Now, for some people that's not a big deal. Other people think it's super creepy. And in many cases there's another game going on behind the scenes. That's where the information about us gets collected into ever-growing dossiers. It might be a dossier that doesn't have your name on it, but it has all the information anyone would need to be able to identify you anyway. There's also this other kind of slippery slope where companies will say, hey, we use anonymous data. We don't connect an identity to the information we get. We want the information so that we can run analyses, but we don't want to say, oh, well, that's Bob Smith, that's clearly Bob Smith. The problem is, the more points of data you have about any individual person, the easier it is to identify who that person is; there's a toy illustration of that in just a moment. You could even argue that the person's name is the least important thing about that person, from a certain perspective. Knowing all about that person and their habits, what they want and what they don't want, what they like, what they hate: all of that is valuable information. The name is superfluous at that point, and we should still be concerned about all that other information about us. That's the Electronic Frontier Foundation's point: we need to be cognizant that all of this information about us is being shared, collected, stored, and sold. There are entire companies that exist just to buy and sell user information, and that gets a little creepy. Beyond that, there are also organizations that are actively spying on people. They're not just passively collecting information or building dossiers; there are some that are really performing surveillance on folks.
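Here's that toy illustration of the re-identification point. These records are entirely made up, and the three attributes are just stand-in examples; the idea is that a handful of fields, taken together, can single a person out with no name attached:

from collections import Counter

# Entirely made-up records: (zip code, birth date, gender), no names.
records = [
    ("30309", "1984-07-12", "F"),
    ("30309", "1990-01-03", "M"),
    ("30309", "1990-01-03", "M"),  # two people happen to collide here
    ("30318", "1984-07-12", "F"),
]

counts = Counter(records)
unique = [r for r, n in counts.items() if n == 1]
print(f"{len(unique)} of {len(records)} records are pinned down by"
      " (zip, birthdate, gender) alone, no name needed")

Add a few more columns, like purchase history or location pings, and the remaining collisions disappear entirely.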
Speaking of organizations that perform surveillance: take the NSA, for example, and I don't mean the National Scrabble Association, but I do mean the National Security Agency. Over the last few years, details about the NSA's various surveillance programs have raised lots of red flags among numerous civil liberties organizations, including the EFF. Now, another issue that the organization concerns itself with is online security, which often goes hand in hand with privacy. This goes from supporting various encryption strategies to help protect communication from prying eyes, to launching the Coders' Rights Project to protect programmers' rights as they work on the bleeding edge of technology. The EFF advocates for a secure, supportive online environment, and they also argue for transparency from the various entities that either make up the infrastructure of the Internet or use the Internet in some fundamental way, which includes holding government agencies accountable and demanding that these agencies be forthright about their online activities. That isn't always easy. It can sometimes require extraordinary efforts to get the real story out of various government agencies about what is actually going on, and sometimes it even requires legal action to force government offices to change their policies when those policies aren't aligned with constitutional protections. That's a fancy way of saying the EFF is sort of a watchdog, just making sure that governments are doing the right thing for their citizens and are not working against their citizens. And transparency is absolutely important for that, because if you have a policy that's not transparent, that does not tell the citizens, hey, this is what's really going on, then you've introduced a reason to distrust the government. And I would argue the EFF has got a very healthy distrust of government, and in many ways it's justified. In some ways it may be overzealous, but in most ways, I think, the government has given plenty of reasons why the EFF's position is not unreasonable.
You may not agree with it, but you probably would at least come to the conclusion that they have reasons for their behavior. The EFF also works to help free people who are held prisoner for speaking out online, particularly people who are journalists or activists, and they also try to raise awareness about potentially abusive policies, such as electronic media searches conducted at border crossings. You may have heard of this in the last several years, where people who are crossing the US border, whether by land, by air, or by sea, will be greeted by border officers or customs officials who say, hey, we want to get hold of your mobile device or your computer and make an electronic search of the materials to make sure there's nothing questionable on there. And the EFF steps in and says, hey, unless you've got a warrant, then you really can't do that; it would be unreasonable search and seizure to just search somebody randomly like that. So that's another one of their big positions. Of course, when the EFF first got its start, its mission wasn't quite so methodically organized. It all began with that court case Steve Jackson Games brought against the Secret Service, and that case was instrumental because it established in a United States court of law that electronic mail should receive the same protections and considerations as telephone communication, which means that law enforcement does not have the authority to intercept or otherwise read electronic communication without first securing a warrant from a judge. And that was just the beginning for the EFF. The next big case they became involved in was Bernstein versus the US Department of Justice. You might have heard the phrase "the pen is mightier than the sword." Well, Daniel J. Bernstein's pen must have been a real whopper.
The US government identified Bernstein's work as so dangerous that he was told he needed to secure an arms dealer license to talk about it or to publish his work outside the United States. "What?" you might be saying. I know I did. Well, let's backtrack a bit. In fact, we're going to backtrack a lot, because that's what I do on this show. We're going to talk about cryptography. Cryptography branches off of an ancient strategy of hiding messages in some way so that the person or persons you are contacting are the only ones who get the information you're actually sending. And there are lots of different ways to do this. There's steganography, which involves hiding data within some other non-secret format, such as inside the code of an MP3 file or inside an image. You could hide it with tiny little microdots on microfilm, which would be very difficult to detect unless you knew what you were looking for. Or there's encryption, in which you take a message in one format, apply some sort of process to the message to transform it in some way, and then send that transformed message to the recipient, who presumably has a method for reversing the process and gaining access to the original message. A simple version of this is just a cipher, where you have a set of rules, and the simplest of all ciphers would be just shifting all the letters down by a slot, so A is B, B is C, C is D, that kind of thing, and then writing it all out. Because the person you're sending it to knows the rules, they can reverse that and decipher the message. Obviously, cryptography tends to be a little more complicated than that, because people are onto us. They figured out the whole A-is-B, B-is-C thing, the clever, clever people.
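That letter-shifting scheme is only a few lines of code. Here's a minimal sketch in Python; the shift amount and the sample message are just examples for illustration:

def shift(text, amount):
    # A toy version of the "shift every letter down a slot" cipher:
    # A becomes B, B becomes C, C becomes D, and so on.
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            result.append(chr((ord(ch) - base + amount) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return "".join(result)

message = shift("attack at dawn", 1)   # 'buubdl bu ebxo'
print(shift(message, -1))              # the recipient reverses the rule

Shifting by one slot gives you the A-is-B, B-is-C cipher described above; shifting by three gives you the classic Caesar cipher.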
According to the site TechTarget, cryptography has four objectives. The first is secrecy, which means only the person with the proper key, as in the intended recipient, can read the original message. The second is integrity, which means the message cannot be changed in any way between the sender and the recipient without being detected. Next is non-repudiation, which means that there's no plausible deniability on the part of the sender. This is to remove the possibility that the sender could leave a recipient out to hang by denying they ever intended to create or send the message. And fourth is authentication, which means that both the sender and the recipient can be assured of each other's identities. Now, that does not mean that all cryptographic systems meet all four of those objectives, but a good cryptographic system should meet at least most of them, and an ideal system would meet all four criteria. With such a system, you can be sure that any message you sent would be secure until it got to the right person, that only that right person would be able to decrypt it, and further, that both of you could be sure the message came from and went to the right people. Cryptography is used for all sorts of communication, but it's clearly of critical importance if you want to send someone information that other people absolutely should not see. For this reason, it's a big part of diplomacy, espionage, financial operations, and other supersensitive activities. And the United States, like many other countries, places a very high value on cryptographic strategies, which is why the US government decided to include cryptographic systems in its documentation for the International Traffic in Arms Regulations. This is a document that effectively defined cryptography as a weapon itself. Under Category XIII of Title 22, Chapter I, Subchapter M of the Code of Federal Regulations, you'll see the title Auxiliary Military Equipment. And here's an excerpt from that section.
"Information Security systems and equipment, cryptographic devices, software, and components specifically designed or modified therefor, including: cryptographic (including key management) systems, equipment, assemblies, modules, integrated circuits, components, or software with the capability of maintaining secrecy or confidentiality of information or information systems." Now, the reason I read the whole list is to show that the United States government defined cryptographic systems as a weapon under the official United States Munitions List, and as such, only a licensed authority would be allowed to export such a weapon. In the case of cryptographic systems, that meant you would need a license to either distribute the code or publish the information. Now, Daniel Bernstein wanted to do both of those things. He wanted to talk about this cryptographic approach he had developed and to distribute electronic programs that could create these sorts of cryptographic messages themselves. He called it Snuffle. Sounds pretty harmless. We'll learn more about Snuffle and his legal battles in just a second, but first let's take a quick break to thank our sponsor.

So, Bernstein developed a pretty straightforward cryptographic system as a grad student at the University of California at Berkeley in 1990. It used an encryption key to generate some seemingly random text, which then could be run through an XOR process against the message you intended to encipher. This enciphered text could then be sent to your recipient, and they would use their copy of the encryption key to essentially reverse the process and read the original message. It gets a little more technical than that, but that's really the crux of it. Bernstein's method was called a zero-delay private-key encryption system, which means it could be used in real-time communication between two individuals who are sharing a common private encryption key.
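To make that concrete, here's a minimal sketch of the general keystream-XOR idea just described. To be clear, this is not Bernstein's actual Snuffle code; it's an illustration under the assumption of a hash-derived keystream, which mirrors the hash-function reliance described next:

import hashlib

def keystream(key, length):
    # Stretch a shared private key into seemingly random bytes by
    # hashing the key together with a running counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key, data):
    # XOR is its own inverse, so one function both enciphers and deciphers.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = xor_cipher(b"shared private key", b"meet me at noon")
print(xor_cipher(b"shared private key", secret))  # b'meet me at noon'

A real design would also mix a nonce or message counter into the keystream so that a single shared key can safely protect more than one message; this sketch omits that for brevity.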
Bernstein's approach relied on a cryptographic hash function, which in itself was a perfectly legal approach and should have been upheld as not being a munition, not a weapon. But Bernstein's implementation created a strong encryption system, and that was what the United States government had determined was illegal to distribute. So it wasn't so much the encryption approach, but rather the description of how to use it, that was deemed illegal for export. Bernstein sought permission to publish his algorithm. He also wanted to publish a paper describing the algorithm, as well as a computer program that could run the algorithm, and he wanted to present all of them at an electronic conference. So he petitioned the State Department, even though his own work had no connection to government funding or state funding of any kind; there was nothing connecting his work to any sort of official government process. So what would have happened if he had skipped this permission step and just gone ahead and done it? Well, according to US law, exporting a weapon to a foreign agent could result in a ten-year jail sentence and a million-dollar criminal fine, and that's just for starters. And that would apply because the government classified cryptographic systems as a type of weapon. In his petition, Bernstein wrote, quote, "In effect, what I want to export is a description of a way to use existing technology in a more effective manner. I do not foresee military or commercial use of Snuffle by anyone who does not already have access to the cryptographic technology contained in it. I do foresee practical use of Snuffle by those who do have such access, in particular for the purpose of interactively exchanging encrypted text," end quote. The State Department rejected his request and said that he would have to obtain a license.
The response said, "This commodity is a stand-alone cryptographic algorithm which is not incorporated into a finished software product." That meant that, unlike various copy protection schemes and other implementations that included encryption, Bernstein's approach was just the method of encrypting data, and that's what made it subject to these rules, which is kind of crazy when you think about it. It's essentially saying, hey, if your approach were actually part of a piece of software, you might not even be running into these issues, but you're specifically talking about how to encrypt data. So Bernstein submitted five separate requests to publish: the source code for encryption, the source code for decryption, an English explanation of how to encrypt data, an English description of how to decrypt data, and the original paper. The State Department lumped all of those requests together into a single one and denied them. Bernstein then contacted the Electronic Frontier Foundation, as he felt his First Amendment rights were being violated. Now, this was in 1995, years after his initial attempts, as the wheels of government do not turn quickly. The EFF agreed to support his case, and they brought a lawsuit against the government stating that the law was unconstitutional and that the State Department had overstepped its authority. The legal proceedings lasted four more years, so this is nearly a decade of Bernstein's life dedicated just to getting this cryptographic work published. The court ultimately decided that software code is a type of speech and therefore is covered by the protections of the First Amendment, which meant the court agreed with the EFF that the State Department's actions were unconstitutional. As such, United States coders can now create encryption software and publish it or export it without first securing permission from the US government, which is a pretty huge deal.
Several of the cases the EFF has participated in have set important precedents in law. It's not just that the EFF has helped the tech sector; it has played an instrumental role in extending the protections that exist for one type of expression to the technological realm. One of the big arguments the EFF makes over and over again is that if this were any other form of communication, there would be no question that First Amendment protection extends to it, as there's plenty of precedent to show. So what makes technological expression so different from these other formats? And the response, over and over again, has been: well, there really is no important distinction between them, and therefore those First Amendment rights should absolutely extend to electronic communication. So the EFF has played a big role in just making sure that this kind of technology and this kind of expression receive that same protection. Now, to go through all of the cases the EFF has been part of would be ridiculous; it would require many, many more episodes, and I'm not going to do that. We'll do an overview of a few cases and kind of explain how the outcomes were important. So one of those cases helped differentiate protection against piracy from establishing an anti-competitive practice. And this is kind of interesting, because it doesn't involve communication over the Internet, but it does involve technology. The case was called Lexmark versus Static Control, and it all had to do with toner cartridges, like printer cartridges. Lexmark makes laser printers, and Static Control would sell remanufactured, refilled toner cartridges. Now, in case you're not aware, toner cartridges are where the real money is at with printers. There are a lot of companies that will sell their printers at a loss and make up for it in toner sales.
After all, if you don't have toner, a printer's not very much good to you. So Static Control would compete with Lexmark by offering compatible toner cartridges at a lower cost. Lexmark in turn sued Static Control, claiming that this amounted to piracy, but the EFF and Static Control were able to argue that this was not a valid claim, and that what Lexmark was really trying to do was force customers into Lexmark's ecosystem, where they would only purchase Lexmark's products. So their actions were meant to suppress legitimate competition. Now, not all victories are so straightforward. In the mid-2000s, the Supreme Court heard a case in which twenty-eight companies, led by MGM Studios, sued Grokster as well as StreamCast for copyright infringement. Grokster was built on top of the Morpheus peer-to-peer file-sharing software. File-sharing software allows users to connect in sort of an ad hoc network and share files with one another. So think of it this way: you download some software that allows you to log into a service that connects you with other users who are using that same software, and you can share files that you've stored in a specific folder with anyone else on that network. They can share files they've stored in folders on their machines that have been designated as valid folders, and you can download those files. And if there are multiple people who have a file on their machines, your downloads go much more quickly, because the software can pull information from any or all of those machines simultaneously. So even if the connection between you and computer A is bad, it can still pull data from computers B through Z, as long as they all have instances of that specific file. There's nothing illegal about that. Peer-to-peer sharing is perfectly legitimate as a means of distributing data.
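Here's a tiny sketch of that multi-source idea. The peers here are just in-memory stand-ins (a real client would speak a wire protocol to other machines), but it shows how a download can fall through from one source to the next:

# Each simulated "peer" holds some chunks of the same shared file.
peers = {
    "peer_a": {0: b"pe", 2: b"pe"},
    "peer_b": {1: b"er", 3: b"er!"},
    "peer_c": {0: b"pe", 2: b"pe", 3: b"er!"},
}

def download(chunk_count):
    assembled = {}
    for i in range(chunk_count):
        # Pull each chunk from any peer that has it; a dead or slow
        # peer just means we fall through to the next source.
        for held in peers.values():
            if i in held:
                assembled[i] = held[i]
                break
        else:
            raise RuntimeError(f"no peer has chunk {i}")
    return b"".join(assembled[i] for i in range(chunk_count))

print(download(4))  # b'peerpeer!'

If peer_a vanished mid-download, chunks 0 and 2 would still come from peer_c, which is exactly why pulling from many peers is both faster and more resilient.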
However, the Supreme Court found that Grokster had effectively promoted the use of its service to engage in copyright infringement, and as such was responsible for the acts that happened on its service. However, there's a previous decision that was made by the Supreme Court involving Sony Corporation versus Universal City Studios, also known as the Betamax case, Betamax being a magnetic tape format, as in the old VHS-versus-Betamax wars. That particular case established that a technology's manufacturer is not responsible for the way its customers use that technology if it's outside the stated purpose of said technology. So if I make a type of tech that allows you to record stuff on it, and I say the reason for this is so that you can make a backup recording of media that you already own (backup recordings are perfectly legal; they're a type of fair use in the United States), then, assuming that's what you're using the thing I made for, that's why you're using it and how you're using it, everything's fine. You're using it legally. But then let's say that you use that same technology to record something that you do not own, and then maybe even record copies of that and distribute them. That would be outside of what I stated the technology was for. And by this argument of safe harbor, I could perhaps get out of any liability for the way that you are using my technology, because you're using it in a way that I did not intend, and it's not my fault if you use the technology poorly. It's just like if I'm selling pies and you buy a pie: I'm not responsible if you throw that pie in someone's face instead of just sitting down to have a delicious slice of pie. It's not my fault. I just made the pie. I still want pie right now. You guys who are longtime listeners of TechStuff, you know that's my secret. I always want pie, like the Incredible Hulk.
Anyway, in Grokster's case, 557 00:34:55,239 --> 00:34:58,040 Speaker 1: the court decided that the service had encouraged users to 558 00:34:58,120 --> 00:35:01,480 Speaker 1: engage in unlawful activity, but the Court was also very careful 559 00:35:01,520 --> 00:35:04,560 Speaker 1: not to tread too far and negate those safe harbor 560 00:35:04,640 --> 00:35:09,120 Speaker 1: aspects that were established in Sony versus Universal City Studios. 561 00:35:09,400 --> 00:35:13,000 Speaker 1: Those safe harbor protections now extend to software, though again 562 00:35:13,040 --> 00:35:15,600 Speaker 1: with the caveat that there must be legitimate use for 563 00:35:15,640 --> 00:35:19,799 Speaker 1: the technology that does not require unlawful behavior. So again, 564 00:35:19,880 --> 00:35:22,600 Speaker 1: peer to peer networking is by itself a perfectly reasonable 565 00:35:22,640 --> 00:35:25,600 Speaker 1: means to distribute information across a network of computers. There's 566 00:35:25,640 --> 00:35:30,960 Speaker 1: nothing inherently illegal about it. That people use that technology 567 00:35:31,000 --> 00:35:34,799 Speaker 1: for piracy is not necessarily surprising, but at the same time, 568 00:35:34,800 --> 00:35:38,440 Speaker 1: it's not the only or even the intended purpose of that tech. 569 00:35:39,000 --> 00:35:42,360 Speaker 1: Another case I have referenced on this show involved Sony's 570 00:35:42,440 --> 00:35:46,160 Speaker 1: use of DRM measures on their music CDs. It used 571 00:35:46,160 --> 00:35:48,359 Speaker 1: to be a thing, music CDs. I don't know how 572 00:35:48,360 --> 00:35:52,040 Speaker 1: many of you out there still listen to CDs. But 573 00:35:52,160 --> 00:35:55,480 Speaker 1: Sony had this bright idea: why not include some software 574 00:35:55,640 --> 00:35:59,120 Speaker 1: on music CDs that could prevent people from ripping those 575 00:35:59,160 --> 00:36:02,960 Speaker 1: tracks onto a computer and then distributing them willy-nilly, 576 00:36:03,040 --> 00:36:07,640 Speaker 1: hither, thither, and yon across the Internet. And boy did 577 00:36:07,880 --> 00:36:12,399 Speaker 1: things go wrong from there, because Sony's approach introduced an 578 00:36:12,440 --> 00:36:17,799 Speaker 1: incredible vulnerability onto customers' computers. Back in two thousand five, 579 00:36:18,280 --> 00:36:22,080 Speaker 1: Sony Music Entertainment, which in those days was actually known 580 00:36:22,120 --> 00:36:26,279 Speaker 1: as Sony BMG, included what they called an Extended 581 00:36:26,400 --> 00:36:30,520 Speaker 1: Copy Protection feature on music CDs, and that 582 00:36:30,600 --> 00:36:33,960 Speaker 1: also included a copy of some software called MediaMax 583 00:36:34,080 --> 00:36:37,240 Speaker 1: CD-3. And if a customer were to put 584 00:36:37,360 --> 00:36:41,440 Speaker 1: their brand new Sony BMG manufactured music CD into 585 00:36:41,480 --> 00:36:46,160 Speaker 1: a computer, the computer would auto-install the software on 586 00:36:46,239 --> 00:36:48,359 Speaker 1: the CD, and that would give Sony a heads 587 00:36:48,440 --> 00:36:50,839 Speaker 1: up as to what was going on. So, in other words, 588 00:36:50,840 --> 00:36:52,680 Speaker 1: Sony could keep track of what you were doing with 589 00:36:52,719 --> 00:36:55,920 Speaker 1: your computer.
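The auto-install step relied on the Windows AutoRun mechanism of that era: when a disc was inserted, Windows would look for an autorun.inf file on it and launch whatever program that file named. A generic example of such a file follows; the file names in it are placeholders, not Sony's actual installer.

```ini
[autorun]
; When the disc is inserted, Windows (with AutoRun enabled, the default
; at the time) reads this file and launches the named program without
; asking the user. "installer.exe" and "disc.ico" are hypothetical names.
open=installer.exe
icon=disc.ico
```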
And the whole justification here was that Sony 590 00:36:55,960 --> 00:36:58,760 Speaker 1: wanted to be on alert in case people were ripping 591 00:36:58,760 --> 00:37:01,520 Speaker 1: those CDs in order to burn pirated copies for 592 00:37:01,560 --> 00:37:06,239 Speaker 1: other people, or otherwise distribute the music illegally. The software 593 00:37:06,360 --> 00:37:10,920 Speaker 1: slipped under the noses of antivirus and anti-spyware programs. 594 00:37:11,200 --> 00:37:14,120 Speaker 1: They just didn't detect it, and it created a backdoor 595 00:37:14,239 --> 00:37:18,440 Speaker 1: vulnerability on infected systems, which allowed the potential for hackers 596 00:37:18,440 --> 00:37:23,680 Speaker 1: to leverage those vulnerabilities and install malware on affected computers. So, 597 00:37:23,760 --> 00:37:26,319 Speaker 1: if you'd like an analogy, it would sort of be 598 00:37:26,440 --> 00:37:29,880 Speaker 1: like if Sony showed up at your house, got a 599 00:37:29,920 --> 00:37:32,960 Speaker 1: hold of your keys, made a copy of your house keys, 600 00:37:33,680 --> 00:37:38,160 Speaker 1: and then they just left those key copies lying around 601 00:37:38,160 --> 00:37:40,440 Speaker 1: where anyone could grab them and waltz into your home 602 00:37:40,480 --> 00:37:44,800 Speaker 1: and take whatever they wanted. It was not good. Now 603 00:37:45,680 --> 00:37:48,400 Speaker 1: we call this the Sony rootkit because it gave 604 00:37:48,560 --> 00:37:53,239 Speaker 1: root access to affected machines and it was distributed by Sony. 605 00:37:53,480 --> 00:37:56,400 Speaker 1: When news got out of what had happened, Sony 606 00:37:56,560 --> 00:37:59,920 Speaker 1: BMG's president of Global Digital Business, a guy named Thomas 607 00:38:00,040 --> 00:38:02,840 Speaker 1: Hesse, said the wrong thing at the wrong time. He 608 00:38:02,920 --> 00:38:06,000 Speaker 1: specifically said, most people don't even know what a root 609 00:38:06,080 --> 00:38:10,520 Speaker 1: kit is, so why should they care about it? Yikes. 610 00:38:11,160 --> 00:38:13,480 Speaker 1: That's like saying most people don't realize that their back 611 00:38:13,520 --> 00:38:17,160 Speaker 1: door is unlocked, so why should they care? It's 612 00:38:17,160 --> 00:38:20,800 Speaker 1: patently ridiculous. The EFF filed a class action 613 00:38:20,880 --> 00:38:24,120 Speaker 1: lawsuit against Sony to force the company to make amends 614 00:38:24,360 --> 00:38:27,640 Speaker 1: for the choices they made, and ultimately Sony was able 615 00:38:27,680 --> 00:38:29,880 Speaker 1: to settle out of court with the EFF. 616 00:38:30,520 --> 00:38:34,440 Speaker 1: The organization also pressured SunnComm, which was responsible for 617 00:38:34,480 --> 00:38:36,600 Speaker 1: one of the elements that created the rootkit in 618 00:38:36,640 --> 00:38:39,400 Speaker 1: the first place, and SunnComm was forced to patch its 619 00:38:39,440 --> 00:38:42,759 Speaker 1: software and remove security flaws. There's a quick sketch, after this segment, of how scanners catch that kind of file hiding. Now the EFF's 620 00:38:42,800 --> 00:38:46,640 Speaker 1: work also reminds me what a strange and amazing thing 621 00:38:46,680 --> 00:38:49,319 Speaker 1: the Internet is. The Internet allows people to find others 622 00:38:49,320 --> 00:38:51,839 Speaker 1: who share their interests and hobbies, and to find 623 00:38:52,000 --> 00:38:56,360 Speaker 1: enrichment through sharing with those people.
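About that hiding trick: the Sony rootkit concealed its files by hooking the operating system so that anything whose name began with the prefix "$sys$" simply didn't show up in listings. One way researchers catch that class of problem is a cross-view comparison: list the same directory through the normal API and through a lower-level view, then flag whatever the normal view omits. Here is a conceptual sketch with both views simulated so it stays self-contained; real scanners diff the Windows API against raw filesystem structures.

```python
# Conceptual "cross-view" rootkit detection sketch: anything present in a
# low-level view of the disk but missing from the normal API view is
# suspicious. Both views are simulated sets here, for illustration only.

def cross_view_diff(low_level_view: set[str], api_view: set[str]) -> set[str]:
    """Files visible on disk but hidden from the normal API."""
    return low_level_view - api_view

# The Sony rootkit hid names beginning with "$sys$" from the Windows API,
# so we simulate an API view with exactly that blind spot.
on_disk = {"song1.wma", "player.exe", "$sys$filesystem.dll"}
reported_by_api = {name for name in on_disk if not name.startswith("$sys$")}

hidden = cross_view_diff(on_disk, reported_by_api)
print("Hidden files:", hidden)  # -> {'$sys$filesystem.dll'}
```

Now, back to the Internet as a place where people who share interests and hobbies find each other.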
One such group would 624 00:38:56,360 --> 00:39:00,680 Speaker 1: be embroidery fans, as in people who enjoy embroidering stuff, 625 00:39:00,719 --> 00:39:02,960 Speaker 1: which seems like a pretty harmless group of people, and 626 00:39:03,000 --> 00:39:05,520 Speaker 1: it's easy to imagine them using the Internet because they 627 00:39:05,560 --> 00:39:07,920 Speaker 1: might want to form some clubs to chat about projects 628 00:39:08,000 --> 00:39:11,560 Speaker 1: or share patterns, and that's actually where the problems began. 629 00:39:12,600 --> 00:39:16,080 Speaker 1: There was a group of embroidery pattern publishers and vendors 630 00:39:16,239 --> 00:39:18,759 Speaker 1: who banded together to form a new group called the 631 00:39:18,800 --> 00:39:24,680 Speaker 1: Embroidery Software Protection Coalition, or ESPC. They were concerned 632 00:39:24,680 --> 00:39:29,080 Speaker 1: with preventing piracy of various embroidery patterns and programs, and 633 00:39:29,120 --> 00:39:34,000 Speaker 1: their target was an online community of embroiderers. Like many 634 00:39:34,080 --> 00:39:37,600 Speaker 1: such communities, the members of this online discussion group used 635 00:39:37,719 --> 00:39:41,200 Speaker 1: nicknames or handles on their profiles. So the 636 00:39:41,239 --> 00:39:46,080 Speaker 1: ESPC wanted to file defamation claims against some of the 637 00:39:46,120 --> 00:39:49,000 Speaker 1: members of that discussion group. So what they did is 638 00:39:49,040 --> 00:39:53,480 Speaker 1: they issued a subpoena demanding detailed 639 00:39:53,480 --> 00:39:56,919 Speaker 1: personal information about everyone who was part of that discussion group, 640 00:39:57,280 --> 00:40:00,480 Speaker 1: regardless of whether or not they had ever even posted 641 00:40:00,520 --> 00:40:03,359 Speaker 1: a message to the group. So, in other words, if 642 00:40:03,400 --> 00:40:05,239 Speaker 1: you belonged to this group, even if you had never 643 00:40:05,280 --> 00:40:07,960 Speaker 1: said anything one way or the other and had never engaged 644 00:40:08,000 --> 00:40:13,719 Speaker 1: in any kind of copyright infringement, they wanted all 645 00:40:13,800 --> 00:40:17,000 Speaker 1: your personal information so they could make a decision about 646 00:40:17,000 --> 00:40:18,960 Speaker 1: whether or not to sue you. Now, the EFF 647 00:40:19,080 --> 00:40:22,279 Speaker 1: got involved and argued that this was an unreasonable and 648 00:40:22,360 --> 00:40:25,200 Speaker 1: unlawful tactic. And after the EFF made it clear 649 00:40:25,239 --> 00:40:28,360 Speaker 1: that it would provide legal support for the embroiderers, the 650 00:40:28,440 --> 00:40:32,920 Speaker 1: ESPC withdrew its subpoena. EFF Staff Attorney 651 00:40:32,960 --> 00:40:36,239 Speaker 1: Corynne McSherry issued a statement that read, quote, 652 00:40:36,520 --> 00:40:39,600 Speaker 1: ESPC should have never filed this frivolous case in 653 00:40:39,600 --> 00:40:42,239 Speaker 1: the first place, but we're pleased that ESPC 654 00:40:42,440 --> 00:40:45,799 Speaker 1: now understands that it can't use the courts to intimidate 655 00:40:45,840 --> 00:40:48,160 Speaker 1: those who want to talk about 656 00:40:48,320 --> 00:40:52,439 Speaker 1: ESPC's ham-fisted tactics. The First Amendment forbids such 657 00:40:52,480 --> 00:40:56,000 Speaker 1: abusive use of the discovery process.
So what they're saying 658 00:40:56,080 --> 00:40:58,560 Speaker 1: is, all these embroiderers who are on this online discussion 659 00:40:58,640 --> 00:41:02,040 Speaker 1: group were saying, well, this group over here is 660 00:41:02,080 --> 00:41:07,240 Speaker 1: being really unreasonable, and the action that group took was way 661 00:41:07,280 --> 00:41:10,919 Speaker 1: more unreasonable, you know, saying, hey, they're saying nasty things 662 00:41:10,920 --> 00:41:14,320 Speaker 1: about us, so we want to know who they are. Yikes. 663 00:41:15,480 --> 00:41:17,560 Speaker 1: There are dozens of other cases we could look at, 664 00:41:18,000 --> 00:41:20,120 Speaker 1: but I think after we come back from the next break, 665 00:41:20,160 --> 00:41:22,480 Speaker 1: we'll focus instead on some of the other work the 666 00:41:22,560 --> 00:41:25,680 Speaker 1: EFF has been involved with outside of legal cases. 667 00:41:26,120 --> 00:41:28,480 Speaker 1: So we'll talk about that after this quick break to 668 00:41:28,560 --> 00:41:39,879 Speaker 1: thank our sponsor. Now, the Electronic Frontier Foundation does more 669 00:41:39,920 --> 00:41:42,720 Speaker 1: than just jump into legal battles. It also helps develop 670 00:41:42,760 --> 00:41:46,480 Speaker 1: and distribute various tools to make the use of the 671 00:41:46,520 --> 00:41:49,839 Speaker 1: Internet more secure and protect the privacy of folks when 672 00:41:49,840 --> 00:41:54,040 Speaker 1: they're online. One of those tools is a browser extension 673 00:41:54,080 --> 00:41:59,960 Speaker 1: called Privacy Badger. This extension blocks websites from using browser 674 00:42:00,200 --> 00:42:04,600 Speaker 1: trackers on you without your permission. Now, those trackers could 675 00:42:04,640 --> 00:42:07,600 Speaker 1: otherwise keep tabs on which sites you are visiting and 676 00:42:07,920 --> 00:42:10,879 Speaker 1: your browsing behaviors in general, and as the EFF says 677 00:42:10,880 --> 00:42:14,680 Speaker 1: on their page about the tool, quote, if an advertiser 678 00:42:14,719 --> 00:42:18,400 Speaker 1: seems to be tracking you across multiple websites without your permission, 679 00:42:18,719 --> 00:42:22,560 Speaker 1: Privacy Badger automatically blocks that advertiser from loading any more 680 00:42:22,600 --> 00:42:25,640 Speaker 1: content in your browser. To the advertiser, it's like you 681 00:42:25,719 --> 00:42:29,799 Speaker 1: suddenly disappeared. End quote. Another extension that they offer is 682 00:42:30,000 --> 00:42:34,400 Speaker 1: HTTPS Everywhere, which is an encryption tool. This extension can 683 00:42:34,480 --> 00:42:38,160 Speaker 1: encrypt communications with many different websites to help improve the 684 00:42:38,200 --> 00:42:41,880 Speaker 1: security of your web browsing. So some websites have pages 685 00:42:41,920 --> 00:42:46,000 Speaker 1: that support HTTPS encryption (you see that little lock that 686 00:42:46,120 --> 00:42:49,360 Speaker 1: shows up in your address bar), but some websites have 687 00:42:50,080 --> 00:42:53,920 Speaker 1: certain pages that are still served over unencrypted HTTP. 688 00:42:54,920 --> 00:42:59,960 Speaker 1: The extension that the EFF offers rewrites all requests to those sites 689 00:43:00,320 --> 00:43:04,280 Speaker 1: to go through HTTPS and keeps everything nice and jumbled 690 00:43:04,360 --> 00:43:06,160 Speaker 1: up for any snoops that happen to be out there.
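The Privacy Badger quote describes a simple heuristic: notice when the same third-party domain keeps showing up while you visit different sites, and cut it off once that happens enough times. Here is a toy sketch of that idea. The threshold of three sites matches what the EFF has documented for Privacy Badger, but the data structures and function names are invented for illustration and are not the extension's actual code.

```python
# Toy tracker-blocking heuristic: block a third-party domain once it has
# been observed across several different first-party sites. Illustrative
# only; not Privacy Badger's actual implementation.
from collections import defaultdict

BLOCK_THRESHOLD = 3  # distinct sites before a third party counts as a tracker

sightings: dict[str, set[str]] = defaultdict(set)  # tracker -> sites seen on
blocked: set[str] = set()

def observe(first_party_site: str, third_party_domain: str) -> None:
    """Record that third_party_domain loaded while visiting first_party_site."""
    sightings[third_party_domain].add(first_party_site)
    if len(sightings[third_party_domain]) >= BLOCK_THRESHOLD:
        blocked.add(third_party_domain)

def should_block(third_party_domain: str) -> bool:
    return third_party_domain in blocked

for site in ["news.example", "shop.example", "blog.example"]:
    observe(site, "tracker.example")

print(should_block("tracker.example"))  # True: seen on three sites, now blocked
```

HTTPS Everywhere works on a different principle: rather than watching behavior, it carries per-site rulesets that rewrite http:// requests for participating sites into https:// before the request ever leaves your browser.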
691 00:43:06,920 --> 00:43:09,080 Speaker 1: And there are other tools that you can learn about 692 00:43:09,120 --> 00:43:11,279 Speaker 1: over at the EFF website to help keep 693 00:43:11,320 --> 00:43:14,160 Speaker 1: you and your browsing safe. And part of what the 694 00:43:14,239 --> 00:43:17,960 Speaker 1: organization does is raise awareness of potential threats to civil 695 00:43:17,960 --> 00:43:22,160 Speaker 1: liberties in proposed legislation or corporate practices, which is not 696 00:43:22,280 --> 00:43:26,280 Speaker 1: always about filing a court case, though that does sometimes 697 00:43:26,320 --> 00:43:29,480 Speaker 1: end up being part of it if things go far enough. 698 00:43:29,880 --> 00:43:33,080 Speaker 1: On the EFF site, you can find numerous articles, 699 00:43:33,360 --> 00:43:37,920 Speaker 1: blog posts, and white papers about important civil liberties issues, 700 00:43:37,960 --> 00:43:40,920 Speaker 1: and some of them deal with pretty thorny subjects, and 701 00:43:41,000 --> 00:43:44,359 Speaker 1: some require you to really keep control of your emotions. 702 00:43:44,960 --> 00:43:50,320 Speaker 1: For example, on January 3, the EFF submitted 703 00:43:50,360 --> 00:43:54,320 Speaker 1: a letter opposing a proposed act in South Carolina called 704 00:43:54,520 --> 00:43:58,440 Speaker 1: H 3003, also known as the Human 705 00:43:58,600 --> 00:44:04,080 Speaker 1: Trafficking Prevention Act. Now, immediately, that's an incredibly charged topic. Clearly, 706 00:44:04,840 --> 00:44:08,600 Speaker 1: human trafficking is an evil act and any decent person 707 00:44:08,640 --> 00:44:11,799 Speaker 1: would oppose it. So why would the EFF 708 00:44:12,040 --> 00:44:18,000 Speaker 1: come out against legislation meant to prevent occurrences of human trafficking? Well, 709 00:44:18,040 --> 00:44:21,480 Speaker 1: the key here is the language inside the act itself. 710 00:44:21,920 --> 00:44:25,800 Speaker 1: As their letter states, the Act requires any entity 711 00:44:25,880 --> 00:44:29,960 Speaker 1: wishing to conduct business in South Carolina, quote, that manufactures, 712 00:44:30,120 --> 00:44:33,759 Speaker 1: distributes, or sells a product that makes content accessible on 713 00:44:33,840 --> 00:44:37,640 Speaker 1: the Internet, end quote, to include in the product a 714 00:44:37,680 --> 00:44:43,200 Speaker 1: digital blocking capability that blocks online access to, quote, obscenity, 715 00:44:43,360 --> 00:44:48,480 Speaker 1: child pornography, revenge pornography, or any hub that facilitates prostitution, 716 00:44:48,640 --> 00:44:51,719 Speaker 1: and websites that are known to facilitate any trafficking of 717 00:44:51,880 --> 00:44:55,879 Speaker 1: persons, end quote. Well, the EFF maintains that 718 00:44:56,280 --> 00:44:59,640 Speaker 1: the proposed legislation is, and this is from their letter, quote, 719 00:45:00,120 --> 00:45:03,920 Speaker 1: a gross violation of the free speech rights of Internet users, 720 00:45:03,960 --> 00:45:09,120 Speaker 1: particularly given that content filtering technology is inaccurate. The bill 721 00:45:09,200 --> 00:45:12,440 Speaker 1: also violates the privacy of consumers by requiring them to 722 00:45:12,520 --> 00:45:16,680 Speaker 1: engage in invasive interactions with companies in order to deactivate 723 00:45:16,719 --> 00:45:20,440 Speaker 1: the blocking feature.
The bill creates an exorbitant new tax 724 00:45:20,560 --> 00:45:23,600 Speaker 1: for both Internet users and companies wishing to opt out 725 00:45:23,640 --> 00:45:26,920 Speaker 1: of the regulatory regime. The bill violates the rights of 726 00:45:27,000 --> 00:45:30,640 Speaker 1: online service providers by penalizing them for content created by 727 00:45:30,680 --> 00:45:35,120 Speaker 1: their users. The bill further hurts companies by requiring costly 728 00:45:35,160 --> 00:45:38,960 Speaker 1: technical features that may ultimately discourage business in the state 729 00:45:39,080 --> 00:45:42,120 Speaker 1: of South Carolina. End quote. Now, the EFF 730 00:45:42,239 --> 00:45:46,320 Speaker 1: points out that in the effort to filter out illegal material, 731 00:45:46,920 --> 00:45:50,680 Speaker 1: the enforcement of this act would undoubtedly sweep up legal 732 00:45:50,719 --> 00:45:54,120 Speaker 1: stuff as well, so you would get both illegal and 733 00:45:54,239 --> 00:45:57,520 Speaker 1: legal things filtered out of your view; there's a toy example of how easily that happens after this segment. And while the 734 00:45:57,560 --> 00:46:01,960 Speaker 1: Act does contain some language requiring entities 735 00:46:01,960 --> 00:46:05,400 Speaker 1: to restore access to legal material if you file a claim 736 00:46:05,440 --> 00:46:09,160 Speaker 1: against the block, this still represents a First Amendment harm, and 737 00:46:09,200 --> 00:46:11,440 Speaker 1: the EFF also states that the language is so 738 00:46:11,560 --> 00:46:14,680 Speaker 1: broad that it could encompass any company that makes any 739 00:46:14,719 --> 00:46:17,600 Speaker 1: device that allows access to the Internet, so that could 740 00:46:17,600 --> 00:46:23,320 Speaker 1: include computers, smartphones, tablets, set top boxes, WiFi routers, modems, 741 00:46:23,320 --> 00:46:27,280 Speaker 1: and more. Every single type of technology would be required 742 00:46:27,280 --> 00:46:30,360 Speaker 1: to have some sort of filter on it. The EFF's 743 00:46:30,480 --> 00:46:33,960 Speaker 1: argument is not that human trafficking isn't a problem or 744 00:46:34,000 --> 00:46:37,319 Speaker 1: that it shouldn't be taken seriously. Their argument is that 745 00:46:37,400 --> 00:46:42,640 Speaker 1: the proposed act fails to address that problem while simultaneously 746 00:46:42,760 --> 00:46:47,920 Speaker 1: courting unconstitutional implementations. So, as of the recording of this podcast, 747 00:46:47,960 --> 00:46:51,600 Speaker 1: that act is currently still in committee in the South 748 00:46:51,600 --> 00:46:54,879 Speaker 1: Carolina House of Representatives, and it has been there since 749 00:46:54,960 --> 00:46:59,040 Speaker 1: January two thousand seventeen. The EFF also offers up 750 00:46:59,080 --> 00:47:02,400 Speaker 1: helpful papers on all sorts of topics, from the Internet 751 00:47:02,440 --> 00:47:05,200 Speaker 1: registries that are least likely to give up details on 752 00:47:05,280 --> 00:47:08,440 Speaker 1: domain owners, to a list of government data requests and 753 00:47:08,440 --> 00:47:12,400 Speaker 1: how various Internet service providers and other companies responded to 754 00:47:12,440 --> 00:47:15,080 Speaker 1: those requests. So, in other words, it tells you these 755 00:47:15,120 --> 00:47:17,520 Speaker 1: are the partners that aren't going to give you up 756 00:47:18,280 --> 00:47:22,120 Speaker 1: under just a casual request from the government.
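On that filtering point, here is the toy example. A naive keyword filter aimed at unlawful material cannot tell a page that discusses a crime from a page that commits one, so legal news coverage and support resources get blocked right alongside the actual targets. The filter and pages below are invented for illustration, not any real product's filtering logic.

```python
# Toy illustration of why mandated content filters overblock: a naive
# keyword filter sweeps up legal pages that merely discuss the topic.
BLOCKED_KEYWORDS = {"trafficking"}

def naive_filter(page_text: str) -> bool:
    """Return True if this page would be blocked."""
    text = page_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

pages = {
    "news report": "Lawmakers debate a human trafficking prevention bill.",
    "support group": "Resources and hotlines for survivors of trafficking.",
}

for name, text in pages.items():
    print(name, "blocked:", naive_filter(text))
# Both pages are legal and informative, and both get blocked: the filter
# has no way to distinguish discussion of a crime from the crime itself.
```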
The EFF 757 00:47:22,280 --> 00:47:25,279 Speaker 1: also provides information on civil liberties, the best 758 00:47:25,280 --> 00:47:28,279 Speaker 1: practices to follow to ensure your rights are protected, and 759 00:47:28,360 --> 00:47:30,800 Speaker 1: also deep dives on the areas of focus I mentioned 760 00:47:30,800 --> 00:47:33,360 Speaker 1: at the top of this episode. Now, you may not 761 00:47:33,440 --> 00:47:36,080 Speaker 1: agree with every stance the EFF takes. I 762 00:47:36,120 --> 00:47:39,880 Speaker 1: know there are some that I personally find a little problematic, 763 00:47:40,480 --> 00:47:43,520 Speaker 1: though I also admit that I side more often with 764 00:47:43,680 --> 00:47:48,359 Speaker 1: the EFF than otherwise. As a nonprofit organization, they 765 00:47:48,400 --> 00:47:52,000 Speaker 1: depend upon donations in order to fund their work, and 766 00:47:52,040 --> 00:47:54,160 Speaker 1: there's also a shop where you can buy products and 767 00:47:54,200 --> 00:47:57,040 Speaker 1: the money goes to support the EFF's efforts. Some 768 00:47:57,120 --> 00:47:59,719 Speaker 1: of the examples of merchandise you could purchase include things 769 00:47:59,719 --> 00:48:02,640 Speaker 1: like stickers or RFID-blocking wallets or 770 00:48:02,680 --> 00:48:06,360 Speaker 1: temporary tattoos if you really want to, you know, sport 771 00:48:06,440 --> 00:48:10,480 Speaker 1: the EFF logo on your bicep. But that's it 772 00:48:11,040 --> 00:48:13,520 Speaker 1: on the EFF. The organization continues to play 773 00:48:13,560 --> 00:48:16,760 Speaker 1: an important role in making sure that corporations and governments 774 00:48:16,800 --> 00:48:20,040 Speaker 1: don't compromise our civil liberties in the online space. I 775 00:48:20,120 --> 00:48:22,799 Speaker 1: suspect that we're gonna have a need for the 776 00:48:23,000 --> 00:48:26,840 Speaker 1: EFF for many years to come, as online technology 777 00:48:26,880 --> 00:48:30,640 Speaker 1: continues to insinuate itself into all aspects of our lives. 778 00:48:31,080 --> 00:48:35,279 Speaker 1: As that happens, those protections are going to become more 779 00:48:35,280 --> 00:48:39,000 Speaker 1: and more important. So that wraps up this episode. If 780 00:48:39,040 --> 00:48:41,879 Speaker 1: you guys have suggestions for future topics I could cover 781 00:48:42,000 --> 00:48:44,640 Speaker 1: on episodes of tech Stuff, why don't you send me 782 00:48:44,640 --> 00:48:46,960 Speaker 1: a message and tell me about your big old ideas, 783 00:48:47,040 --> 00:48:52,720 Speaker 1: smart guy, smart lady, smart people. I'm being serious, 784 00:48:52,760 --> 00:48:54,560 Speaker 1: I'm not being sarcastic. I really would love to hear 785 00:48:54,600 --> 00:48:56,759 Speaker 1: your ideas. You can get in touch with me using 786 00:48:56,760 --> 00:49:00,880 Speaker 1: the email address tech stuff at how stuff works dot com, 787 00:49:01,040 --> 00:49:05,400 Speaker 1: or if you prefer to communicate via Twitter or Facebook, 788 00:49:05,760 --> 00:49:09,000 Speaker 1: you can find the TechStuff handle at either of 789 00:49:09,000 --> 00:49:13,080 Speaker 1: those locations. It is TechStuffHSW. And remember, 790 00:49:13,120 --> 00:49:15,800 Speaker 1: you can go to twitch dot tv slash tech stuff 791 00:49:15,800 --> 00:49:18,839 Speaker 1: to watch me record these shows live.
You can watch 792 00:49:18,840 --> 00:49:22,640 Speaker 1: as I make mistakes or occasionally curse into the microphone. 793 00:49:23,320 --> 00:49:27,680 Speaker 1: Sometimes that happens, like today, but the only way you 794 00:49:27,719 --> 00:49:29,640 Speaker 1: ever get to see it is if you go to 795 00:49:29,640 --> 00:49:33,880 Speaker 1: twitch dot tv slash tech stuff, because the wonderful producer 796 00:49:34,000 --> 00:49:36,840 Speaker 1: Ramsey is very good at taking those things out before 797 00:49:36,920 --> 00:49:41,319 Speaker 1: the show goes live. So come and see me record live. 798 00:49:41,320 --> 00:49:43,360 Speaker 1: I do it on Wednesdays and Fridays. The schedule is 799 00:49:43,400 --> 00:49:45,839 Speaker 1: up at twitch dot tv slash tech stuff, and I'll 800 00:49:45,840 --> 00:49:55,759 Speaker 1: talk to you again really soon. For more on this 801 00:49:55,920 --> 00:49:58,399 Speaker 1: and thousands of other topics, visit how stuff works 802 00:49:58,440 --> 00:50:03,160 Speaker 1: dot com.