Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And what if websites were held responsible for the content that other people post to those websites? What if, after a customer left a bad review for a product online, the company that makes that product sued Amazon for hosting the review? What if that company that makes the thing, what if they won that lawsuit? What if Facebook were liable for the posts made by its two billion users worldwide? If sites were held accountable for the content that users and third parties posted to them, we would not have the Internet we have today. In fact, companies and organizations like Wikipedia, Amazon, Facebook, Twitter, even Google wouldn't exist, or at least they wouldn't exist in the forms they do today, if that were the case. Now, if you live in the United States, you might have heard a bit about Section two thirty. Even if you're outside the United States, you might have heard some references to it. Now, if you're only casually following the news, or you just hear Section two thirty in passing, it's probably pretty confusing. It clearly has something to do with technology and liability and communication. President Donald Trump has called upon Congress to revoke it several times now, even threatening to veto the funding of the National Defense Authorization Act unless Congress repealed Section two thirty. But Trump is not the only politician to call out this legislation. Representatives from both the Republican and Democratic parties have proposed changes or even the outright elimination of Section two thirty over the years. Heck, President-elect Joe Biden has also called for revoking two thirty.
Speaker 1: And if you live in America, you might be surprised to hear that Trump and Biden have agreed on something, though I guess it's fair to point out they agree on the end result, but for very different motivations. So in today's episode, I want to talk about what Section two thirty is, where it came from, what its purpose was and is, and why there's so much discussion about the need to change or get rid of it from various viewpoints across the political spectrum. And I'll do my best to avoid any political commentary, but I do want to say that the motivations behind these various calls for change, they vary a great deal. I think a lot of folks in politics agree that Section two thirty needs some attention, but they don't all agree as to the reasons why or how it should be done. So we're gonna get into all of that. And before we jump in, I want to recommend an amazing resource. It's a book by Jeff Kosseff titled The Twenty-Six Words That Created the Internet, and it's all about Section two thirty, from the genesis of the idea to the implementation of Section two thirty in court cases. And it's also a really good read, which is a weird thing to say about a book centering on a subsection of a huge telecommunications bill. I should also add, as a trigger warning, that the book discusses some cases that deal with some really heavy, dark stuff. Two thirty has been tested in some really emotionally charged cases that include truly awful things that have happened to people. So fair warning if you do want to check that book out and give it a read. Now, at the heart of all of this are concepts like free speech, which has very wide protection in America, and liability, that is, being held accountable, legally accountable, for something. Because while there are broad protections for freedom of speech, it is not absolute. There are some forms of speech and expression that are not protected under the First Amendment.
Speaker 1: And that's because free speech sometimes bumps up against other important things, like security or privacy, that kind of thing. So it's one of those things where it's not pure black and white. There are some shades of gray. Now, let's find out what Section two thirty is. And it's called Section two thirty, so that suggests it's part of something bigger, right? It's a section of something. Well, it's a section of a larger piece of legislation, and that piece was the Communications Decency Act.

Speaker 1: So let's turn back the hands of the clock a bit. Heck, maybe I'll, maybe I'll dust off the old TechStuff time machine for this one. I don't think we've actually used it in years. Fortunately, I did bring it home with me when our office went into lockdown, so it's really just taking up space in the corner. Hang on, hang on a second. I'm just gonna get it out. I gotta move a couple of things. Be right back. Alright, alright, so I've got it. Now, let me set the dial back to, uh, let's see, nineteen ninety. All right, okay, here we go. Everybody in, come on, let's all get into the time machine. All right, ready? Push the button, Frank.

Speaker 1: And here we are. It's nineteen ninety. The number one hit single of the year is Hold On by Wilson Phillips, a song I'm not ashamed to say I absolutely loved at the time. Shows like Cheers, A Different World, and Murphy Brown are on television. At the box office, the film Ghost comes out on top, with Home Alone not far behind. But we're not here to see the awkward transition of the nineteen eighties into the nineteen nineties. No, we're here to learn about how US law would view the role of online platforms. Now, back in nineteen ninety, where we are now, the Internet isn't really a thing as far as the mainstream public is concerned. It exists, but hardly anyone knows very much about it outside of research facilities and government offices. There's no such thing as the World Wide Web yet. However, there are a few big online service providers, or OSPs.
Speaker 1: Now, these are sort of the predecessors to Internet service providers, or ISPs, and an OSP is kind of like its own micro-internet, though really we would just kind of call it a network. Think of it as a self-contained collection of servers that hosts stuff like forums and newsletters and articles and files, and you're on the right track. And they don't necessarily talk to each other, so they're kind of self-contained. Well, one of those big OSPs is CompuServe, and it's going to get taken to court. At the heart of the matter is an accusation of libel, that is, uh, misinformation with the intent to cause harm that's in print. The plaintiffs, Robert Blanchard and a company called Cubby Incorporated, have developed a news and rumors service called Skuttlebut, which focuses on the radio and TV industries. Now, according to the plaintiffs, a newsletter called Rumorville USA, which also covers rumors in the TV and radio spaces, has published untrue and harmful things about Skuttlebut, and Rumorville is available on CompuServe. So the plaintiffs targeted not just Rumorville, but CompuServe in their lawsuits. They say CompuServe is responsible because it allows the distribution of Rumorville, which in turn has published libelous content about Skuttlebut. So the lawyers representing CompuServe argue that the service has no connection to Rumorville other than serving as a way for people to get the newsletter. In other words, CompuServe is saying, hey, we don't write that. We just have it on our service, but we don't write it. There's no employment here to generate that newsletter. CompuServe isn't involved editorially in the newsletter at all. It just comes from another company. That company is Don Fitzpatrick Associates of San Francisco, and it's referred to in the court documents as DFA. CompuServe did not employ this company or pay for this newsletter.
Speaker 1: And moreover, according to the agreement between CompuServe and DFA, DFA accepts full responsibility for the contents of its newsletter. So CompuServe's lawyers go and make a motion for summary judgment, which in this case was to dismiss the charges; it's just for the court to make a decision on behalf of one party against another party without the need to go to a full trial. And the judge grants this to CompuServe. The judge agrees that CompuServe did not bear responsibility for the contents of this newsletter. The judge says that CompuServe is kind of like a bookstore, and you wouldn't hold a bookstore responsible for the contents of a book that was published by a third party just because it happens to be in that bookstore. The bookstore is just where customers can buy books. The store did not put the actual content into the books. And this becomes a precedent that would serve as a foundational building block for Section two thirty later.

Speaker 1: All right, everyone, um, we're done here. Let's all jump back in the time machine. Come on, no stragglers. I don't want to have to come back. We lived through it once; we're done. We've got to hop forward a couple of years. Okay, ready? Push the button.

Speaker 1: All right, now we're in nineteen ninety-four. So now the number one song in America would be Ace of Base's The Sign, which, I don't know about you, but it opened up my eyes. I'm happy now. Seinfeld is dominating TV ratings, and the big movies at the box office are The Lion King and Forrest Gump. So what are we doing here and now? Well, this time we've got another lawsuit, but this one's against a different OSP called Prodigy. Now, like CompuServe, Prodigy hosts stuff like forums and articles on their service. On one forum, an anonymous user alleged that a securities firm named Stratton Oakmont was committing fraud in a stock offering, and Stratton Oakmont would sue Prodigy for libel.
Speaker 1: Now, Prodigy's lawyers argued that the service shouldn't be held responsible for content that's posted by a third party, by a Prodigy user, and they cited the CompuServe ruling that came back a few years earlier. But the judge in this case disagrees with Prodigy's lawyers. They rule against the service, and the judge says that Prodigy exercised editorial control over the forums. The service could and did remove material that was objectionable. Unlike CompuServe, which had taken a largely hands-off approach to the stuff that was published on CompuServe, Prodigy got more involved and would remove things that were in violation of, you know, community standards. And therefore that made Prodigy not like a bookstore and made them more like a publication, like a newspaper, and that editorial control means that Prodigy would have to assume responsibility for stuff that appeared on the service. After all, if Prodigy intervenes in some cases, it means it could and should have intervened in other cases, like with Stratton Oakmont. So we have that nineteen ninety-one decision that says a platform is not responsible for third-party content published on that platform. But then we have a nineteen ninety-five decision that seems to contradict that: if the platform exercised any sort of content management, that means it can be held liable. And I know we traveled to nineteen ninety and nineteen ninety-four, respectively, but court cases can take a really long time, so the decisions were actually handed down about a year after the initial lawsuits started. I'm sorry that we all had to wait around so long for that.
Speaker 1: Well, this created a precarious situation for online companies, because the message seemed to be that if you provided a space for users and third parties to post stuff, it would best suit you if you just never, ever interfered with that, regardless of what gets posted to the platform. Because intervening would be a slippery slope. If you start removing content, even stuff that clearly should not be there, like videos of violence, or pornography, or death threats, or personal information of other users, whatever it might be, if you remove it, no matter how obviously awful it is, you create a precedent in which you are acting in an editorial capacity. And if you can do that, then are you really free of liability when someone posts something that's libelous or otherwise illegal? And remember, this is the mid-nineties. The online world hadn't even really started to take off yet, so there was a real concern that we would see a big negative impact, a chilling effect, on the founding and evolution of online businesses. Considering that people had generally come to the belief that the Internet was going to be the future of business, or at least an important component of business, this was a bad thing. Now, if you guys pay attention out there, you probably know that politicians typically lag behind big issues in technology. Technology changes much faster than policies do. And a lot of politicians in this country, er, in the United States, um, how do I put this delicately? They're old. Like, a lot of them are really old. The average age of a congressperson is twenty years older than the average American. The average age of a representative is fifty-seven; for senators, it's sixty-one. And generally speaking, older generations are a little slower to pick up on technological advances than younger ones.
Speaker 1: Now, there are exceptions to this, don't get me wrong. I'm not trying to be ageist here, but as a general rule, older people are less up to speed on emerging technologies. I think that rule applies double when it comes to politicians, from my own anecdotal experience, which I get isn't really evidence. So while this was a growing concern within the tech world, only a couple of politicians really picked up on how these court decisions could create an issue and impede the growth of online services in those early days. Now, that pair included a Democrat named Ron Wyden and a Republican named Chris Cox, who wanted to create legislation, full stop. They both wanted to make their mark. They wanted to pass some laws, but at this particular time in American history, it was really hard to do that, because partisan politics were pretty vicious at the time. I think they were gentle as a kitten compared to today's politics, but at the time it was considered pretty brutal, and that meant there was very little chance to get agreement across the aisle. Republicans at the time controlled both the House and the Senate. In the United States, our Congress is divided up into two branches, the House of Representatives and the Senate. And a Democrat was president. So a Democratic president and a Republican Congress. And they figured, these two people figured, that their best chance at making an impact was to find a topic that was so new, so cutting edge, that neither party had actually formed an opinion about it yet. The Internet was a perfect target. And this, my friends, drives me bonkers, because it points out that the writing of Section two thirty didn't begin with politicians identifying an issue and then finding a way to solve it. Instead, it was a case of a couple of politicians trying to figure out what sort of problem, any problem, could they find where they could potentially tackle it and get their names on some legislation.
Speaker 1: That's probably being a little unfair, but it is sort of the reality of politics, and from a practical standpoint, I get it. But it's also kind of disillusioning to me. The pair determined that they would have a decent chance at proposing legislation that would protect online services from being sued for the content that other people were posting to those services, plus give the services the freedom to moderate content without fear of being sued for that, either. The Internet was so new and the potential was so huge that they felt this was a pretty good bet, so they drafted a proposed piece of legislation that they called the Internet Freedom and Family Empowerment Act. It would ultimately have as its core principle the following, quote: No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. End quote. Now, in addition, the piece has what has been called the good faith section, which states that platforms will not be held liable for, quote, any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected. End quote. Now, when we come back, we'll learn more about what this actually means in a practical sense. But first, let's take a quick break.

Speaker 1: So on the face of it, Section two thirty is fairly simple. You cannot hold someone, or some thing in the case of a platform, responsible for what someone else or something else says on that platform. So that applies to the services and the users. If I'm on a forum and I'm making an argument for a certain point of view, and someone else joins in with libelous accusations about a third person, it would be unreasonable to hold me accountable for that other person's words.
Speaker 1: Right? I didn't say the libelous thing. Why would I be responsible for it? I might have started the discussion thread, I might have initiated the conversation, but if I didn't actually say anything libelous, then I shouldn't be held responsible. Right? Well, that same protection, according to this legislation, would apply to online platforms. In addition, the platforms would be able to make their own moderation policies and not be held liable if the platform removed something that would otherwise be constitutionally protected. So, in other words, if Facebook removed a post because it violated a Facebook standard, even if that post would otherwise be protected constitutionally, Facebook would not be held liable as long as that removal was done in good faith. Now, at the time that Cox and Wyden were putting together their proposal in the House of Representatives, there was another piece of legislation under consideration that aimed at the issue of pornography online, but that was taking place over in the US Senate. And this piece tried to treat online content in a way similar to how the US government regulates content on TV and on radio broadcasts. There was a growing concern about the availability, the accessibility, of pornographic and obscene material. Obscenity in this case would fall under a pretty conservative definition. Like, you know, kind of like beauty, obscenity is in the eye of the beholder. It's one of those things where you know it when you see it; that's kind of the famous quote in US history. But the proposed legislation would criminalize the act of exposing those under eighteen to obscene or pornographic materials online. Now, I should say it covered instances in which the age of the recipient, or at least their under-eighteen status, would have to be known to the person or entity sending the material for this to be relevant. The proposed legislation went a bit further than that, too, with sections related to speech that is indecent but not obscene.
Speaker 1: Again, really weird gray area territory here, but the language raised some concerns among civil liberties organizations, so people made some pretty strong arguments against this Senate version of the idea. For example, in the medical industry, there's terminology that some people might define as indecent that's absolutely critical for legitimate medical communication in clear terms. So sites that provide useful information for sensitive topics, like educating people about sexually transmitted diseases, or resources for people who are in the LGBTQ+ communities, all that could be at risk if you allowed this kind of legislation to go forward. So Ron Wyden and Chris Cox were positioning their proposal in the House of Representatives as kind of an alternative to the approach that was being talked about in the Senate. They suggested that online platforms have the ability to moderate content on their sites without the risk of being held liable for the stuff that people and other parties were posting to those sites. And according to Kosseff, their idea met with no resistance. In fact, barely anyone even noticed. In the United States, the way Congress creates new laws requires both the House of Representatives and the Senate to vote on the legislation and approve it before sending that piece on to the President to be signed into law or potentially vetoed. In a case where both the House and the Senate are working on something similar but distinct as far as legislation goes, those two versions have to be hashed out in committee to create a more unified approach. Both the Communications Decency Act out of the Senate and what would become Section two thirty in the House of Representatives would ultimately be lumped in with the overall discussions of what would become the Telecommunications Act of nineteen ninety-six. Most of Congress was really focused on the other elements of the Telecommunications Act.
Speaker 1: The stuff that related to telephone companies and telephone infrastructure and cable companies. The Internet portions were more of an afterthought. It was so new that a lot of people weren't really thinking about it. They were thinking, no, the future's in long distance phone calls, gosh darn it. And so both the CDA, which had support due to being positioned as a piece of legislation advocating for family values, which, boy, was that a big, big point of discussion in the nineties, and Section two thirty, which was positioned as protecting a new and vulnerable industry, both of them made it through. So in the end, the language that would become Section two thirty and the alternative proposal, which would become the online portions of the Communications Decency Act, would be bound together to form kind of a Voltron-like construction of online policy. Now, the Telecommunications Act is enormous. It's a beast of a law, covering stuff like telephone lines, cable television, and more. Section two thirty is just a tiny part of that beast. Interestingly, the anti-obscenity measures in the Communications Decency Act would not stand the test of time. Judges would strike down large portions of it, citing issues such as vague terminology, like indecent and offensive without firm definitions. That legislation was open to far too much subjective interpretation to be useful. But while a lot of the CDA would go bye-bye, Section two thirty remained intact. So all that obscenity and decency stuff was gone, but that Section two thirty idea from Wyden and Cox was still there. It wasn't brought into question in the courts the way the parts of the CDA that specifically dealt with indecency online were. Section two thirty went untouched, and it was a doozy. So with Section two thirty, online services received immunity from liability regarding what their users and third parties published.
Speaker 1: Section two thirty means that these entities, things like Facebook, Twitter, Amazon, Google, cannot be held legally responsible for user-generated content, with only the most minute exceptions, and most of those would come much, much later. It's sort of a get-out-of-jail-free card, especially in the early days. When drafting the language, Wyden and Cox were careful to avoid being too broad. The idea was that outright criminal activities and stuff like copyright infringement wouldn't receive full protection under two thirty, but pretty much anything else would, or at least there was the potential for it. Now, here's the thing about laws: they get tested in the courts. Courts are left to interpret and enforce laws. So Congress writes the laws, the President approves the laws, but the laws are interpreted and enforced in the court system, in the judicial system. So the fact that the courts could interpret this meant that there was still a question as to whether the courts would say that Section two thirty meant online services would have a conditional shield, a conditional immunity to liability, as long as they did operate in good faith. You know, kind of like, if they were told to remove something because it was illegal or harmful and they then went and did so, then they'd be fine. Or the courts could interpret it a different way. They could say that the protections are more broad, and they could give online services complete freedom from liability unless an exception were to be applied. And as it turned out, the court system leaned toward option number two, that broad application approach, and it wasn't done lightly or easily. I would love to go into this more, but it would take up way too much time, and trust me, this episode is going to be a long one. So if you want to learn about the legal decisions that would codify the extent to which Section two thirty protects online services, read Jeff Kosseff's book that I mentioned earlier.
Speaker 1: Now, when Wyden and Cox put this plan together, their concept was that these service providers, protected by immunity from legal repercussions as far as user-generated content is concerned, would be freed up to moderate that user-generated content as much as they needed to. The goal was to create a safety structure so that Prodigy, or America Online, or, much later, Amazon or Facebook or Google, would have the freedom to excise objectionable material off the site without risking being seen as a publisher that has liability, like Prodigy was a few years earlier. And I've seen this described as the sword and shield approach. The shield is that immunity, and the sword is that editorial capacity to intervene without fear of retribution. But there was a problem. Platforms began to adopt a tendency to just rely on the shield part. Frequently they did very little to editorialize or to moderate. Now, this might sound familiar to you. Over the past few years, Twitter, for example, has come under fire for being reticent when it comes to applying the company's own terms of service as far as content goes. Time and again, people have cited tweets from various users and asked Twitter's management why such examples are allowed to stay on Twitter when, by at least what is arguably a reasonable interpretation, those tweets violate the code that Twitter says it has in place. We've seen this a ton with certain politicians in particular, and it was only recently that Twitter would even flag posts from prominent politicians, like the President, as containing misleading information or untruths. And Twitter in no way rushed to address this problem. I'll get to why they ultimately did later, but spoiler alert: it doesn't really have anything to do with Section two thirty. Now, I don't mean to just single out Twitter and ignore everyone else, because, as I said, it was far more common across the board for platforms to take a very hands-off approach when it came to moderating content.
Speaker 1: Typically, they only stepped in during particularly extreme or blatant abuses of the platform's policies, and not always even then. All of this despite that immunity granted by Section two thirty. There were a couple of reasons for this. One is that, as I mentioned in the Facebook algorithm episode that published last week, companies like Facebook benefit financially from people being active on their platforms, and controversial posts generate a lot of action. So you could argue, hey, that bad stuff that we would rather not have on these platforms, that's making the platforms a lot of money, so there's not a lot of financial incentive for them to act against it. Second, over the past few years, several platforms have emerged that seek to encourage trolling and malicious behaviors. Like, they're not just hosting it; their whole purpose is to be a hot spot for that. And they rely on Section two thirty to shield them from repercussions, because that immunity, at least at first, extended so far that even if a platform were to distribute, or perhaps even encourage the distribution of, malicious content, it was still protected as long as it was not generating the content. If someone else was using their platform to spread terrible stuff, they were still in the clear, because Section two thirty gave them that protection. There were numerous court cases that challenged this in different ways, including ones that would see plaintiffs sue platforms for negligence for failing to take down a harmful post after being told about it. But early on, courts decided that if they found in favor of plaintiffs in these cases, it would just represent a workaround for Section two thirty, and it would essentially invalidate the protections entirely, because all it would do is send the message of, don't sue Twitter for libel, for example; sue them for negligence instead. So that would mean you would just have a different pathway to go after these platforms, and Section two thirty would be meaningless.
Speaker 1: So the courts decided that was unacceptable, and they ruled early on that immunity from liability extended beyond stuff like libel. In fact, some of the rulings appeared to give online companies even more protection than Chris Cox, one of the two original drafters of the law, had in mind. He said it may be that this was applied more broadly than we had intended. For malicious websites and forums, this could mean that if you have a particular agenda and a means to launch an online platform, you can push your agenda, even if it causes harm to other people, by giving sympathizers, you know, people who share your philosophy, a place to promote their ideology, your ideology, as long as you're careful not to do it yourself, because then you're really just providing a place for third parties to do it. If you start to publish your own words, you no longer have protection, because you're not acting as a distributor for a third party's content. But stick to distributing, and you'd be fine. So we saw it happen a lot. We still do. And then, third, over the years the government has steadily chipped away, bit by bit, at that protective shield, with various court cases leading judges to interpret exemptions to that immunity from liability. And that makes platforms a little less keen on editorializing, for fear of being pulled into one of those exceptions. Now, when we come back, I'll talk a bit more about the ways that courts have eroded Section two thirty from its original and arguably OP status. But first, let's take another quick break.

Speaker 1: All right, guys, you know what? I guess it's about time we kind of jumped back into the present. So the nineties were fun and all, but take off the flannel, stop listening to grunge, forget about My So-Called Life for a minute, and let's all get back in the time machine. Everyone back in the time machine. You too. All right, I'll push the button.
Speaker 1: For the first decade of its existence, Section two thirty was kind of like a bulletproof vest for online platforms when it came to liability linked to user and third-party content. There are numerous incredible court cases featuring some really sympathetic plaintiffs, you know, people who inarguably suffered hardships because of someone sharing harmful or abusive materials online. And the message seemed to be that, at least in some cases, there was no real legal recourse for these people to seek out justice. If the perpetrator who's publishing this harmful information is anonymous, it can be really hard to track that person down and hold them accountable for what they said online. There are lots of ways that people can hide their identity, including ways to hide their IP address, which means that in some cases it would just be impossible to figure out who was ultimately responsible. And Section two thirty would shield the platforms from legal action, which meant that the victim would have no real options, and that rubbed a lot of people the wrong way. Now, before I get into details, I do want to say that I acknowledge this is a very tricky situation. On the one hand, it seems unreasonable to hold a platform accountable for something that it didn't generate. It wasn't responsible for creating it. So if I pop onto Twitter and I post harmful lies about you, it's not Twitter's fault that I did that, right? It's my fault, and Twitter should be protected from being unfairly lumped in with me on that matter. Online platforms serve millions or even billions of people, and there's just no way to filter every single post from every single person to make sure that there's nothing harmful there. But on the other hand, let's say I post something that is demonstrably false and harmful directed towards you, and you alert the platform I've used to this problem.
If the platform fails to act to 562 00:35:58,600 --> 00:36:01,920 Speaker 1: take down that post, or perhaps even go further, maybe 563 00:36:01,960 --> 00:36:05,400 Speaker 1: ban me from using the service, then doesn't that suggest 564 00:36:05,400 --> 00:36:08,399 Speaker 1: the platform itself should be held accountable? I mean, it's 565 00:36:08,440 --> 00:36:12,680 Speaker 1: allowing a wrong to continue. Is there no responsibility to 566 00:36:12,719 --> 00:36:16,560 Speaker 1: prevent harm? And if there isn't, what makes the Internet 567 00:36:16,600 --> 00:36:20,880 Speaker 1: different from other things that the law covers? Because other areas 568 00:36:20,920 --> 00:36:23,400 Speaker 1: of the legal world 569 00:36:23,400 --> 00:36:26,560 Speaker 1: in the United States, you know, they don't enjoy this protection. But 570 00:36:26,640 --> 00:36:29,399 Speaker 1: it's a really complex issue, and it reminds us that 571 00:36:29,520 --> 00:36:32,440 Speaker 1: you can have a lot of things that are really important, 572 00:36:32,480 --> 00:36:35,040 Speaker 1: like freedom of speech or a right to privacy or 573 00:36:35,080 --> 00:36:38,040 Speaker 1: an expectation of security, and you can have those come 574 00:36:38,040 --> 00:36:40,960 Speaker 1: into conflict with one another, and it ultimately means that 575 00:36:41,239 --> 00:36:44,239 Speaker 1: whatever decision you make, it's not going to be satisfying 576 00:36:44,280 --> 00:36:48,440 Speaker 1: to everybody. Starting around two thousand eight, courts began 577 00:36:48,520 --> 00:36:51,360 Speaker 1: to rule that Section two thirty wasn't a perfect force 578 00:36:51,440 --> 00:36:56,520 Speaker 1: field protecting online services from all liability. In California, a 579 00:36:56,560 --> 00:37:01,080 Speaker 1: pair of fair housing nonprofit organizations brought a lawsuit against 580 00:37:01,120 --> 00:37:04,680 Speaker 1: a website called roommates dot Com. They said that the 581 00:37:04,719 --> 00:37:08,560 Speaker 1: site was encouraging users to post and sort housing opportunities 582 00:37:08,880 --> 00:37:13,440 Speaker 1: in a discriminatory way, and that violated federal and state law. 583 00:37:13,920 --> 00:37:17,040 Speaker 1: It is illegal to advertise housing with language that 584 00:37:17,080 --> 00:37:23,080 Speaker 1: indicates preference, limitation, or discrimination based on race, sex, familial status, 585 00:37:23,080 --> 00:37:25,880 Speaker 1: and that kind of thing. But roommates dot Com allowed 586 00:37:25,960 --> 00:37:29,160 Speaker 1: users to fill out fields on all that kind of stuff. 587 00:37:28,880 --> 00:37:32,960 Speaker 1: You could create a profile where you included things like 588 00:37:33,360 --> 00:37:38,000 Speaker 1: your gender, your familial status, you know, your sexual orientation, 589 00:37:38,000 --> 00:37:40,200 Speaker 1: all this kind of stuff, and some people on the 590 00:37:40,200 --> 00:37:45,120 Speaker 1: site started posting discriminatory notices saying things like, you know, 591 00:37:45,239 --> 00:37:48,480 Speaker 1: essentially, only white people need apply, that kind of stuff. 592 00:37:48,800 --> 00:37:52,120 Speaker 1: Terrible stuff.
Now, the lawyer for roommates dot Com argued 593 00:37:52,239 --> 00:37:55,319 Speaker 1: that the site was protected under Section two thirty, but 594 00:37:55,440 --> 00:37:59,040 Speaker 1: the plaintiffs' lawyer said, hang on, Roommates is totally setting 595 00:37:59,080 --> 00:38:02,040 Speaker 1: all this up, because it has people fill in that 596 00:38:02,120 --> 00:38:05,439 Speaker 1: information in the first place. It asks people to give 597 00:38:05,480 --> 00:38:09,560 Speaker 1: those details. Now, the case was initially dismissed 598 00:38:09,560 --> 00:38:13,000 Speaker 1: in a lower court, but an appeals court would take 599 00:38:13,040 --> 00:38:17,000 Speaker 1: it into further consideration, and that three judge court ultimately 600 00:38:17,040 --> 00:38:21,040 Speaker 1: decided, with a very narrow focus, that roommates dot Com 601 00:38:21,160 --> 00:38:26,160 Speaker 1: was liable for asking questions that were allegedly discriminatory, but 602 00:38:26,320 --> 00:38:29,720 Speaker 1: that it was not liable for the content that users 603 00:38:29,719 --> 00:38:33,759 Speaker 1: were writing on the site, like under additional comments. And 604 00:38:33,840 --> 00:38:38,000 Speaker 1: it's a fine distinction, but it marked a small weakness 605 00:38:38,040 --> 00:38:42,200 Speaker 1: in two thirty's armor. Moreover, a subsequent hearing found that 606 00:38:42,320 --> 00:38:46,759 Speaker 1: a website develops content if it quote contributes materially to 607 00:38:46,800 --> 00:38:51,040 Speaker 1: the alleged illegality of the conduct end quote, which would 608 00:38:51,080 --> 00:38:53,840 Speaker 1: mean in those cases the service would no longer be 609 00:38:53,960 --> 00:38:58,040 Speaker 1: a simple publisher or distributor. It would be a developer 610 00:38:58,160 --> 00:39:01,640 Speaker 1: of content, and thus two thirty protection would not apply. 611 00:39:02,560 --> 00:39:06,680 Speaker 1: Subsequent court cases reinforced the idea that if a website 612 00:39:06,800 --> 00:39:11,680 Speaker 1: quote unquote materially contributes to the illegality of material posted 613 00:39:11,719 --> 00:39:14,960 Speaker 1: to that site by third parties, it would not, or 614 00:39:15,040 --> 00:39:18,520 Speaker 1: may not, qualify for Section two thirty immunity. So the 615 00:39:18,560 --> 00:39:21,880 Speaker 1: parameters of protection began to change a little bit. Should 616 00:39:21,880 --> 00:39:24,360 Speaker 1: a court find that a site had not just allowed 617 00:39:24,480 --> 00:39:27,400 Speaker 1: users to post illegal material, but had somehow been 618 00:39:27,840 --> 00:39:31,759 Speaker 1: active in that process beyond just being a publication platform, 619 00:39:32,200 --> 00:39:34,839 Speaker 1: then it could be held accountable. But while there were 620 00:39:34,840 --> 00:39:39,279 Speaker 1: new parameters, they weren't strictly defined, and courts would have 621 00:39:39,360 --> 00:39:43,200 Speaker 1: to interpret specific cases within the context of this kind 622 00:39:43,200 --> 00:39:47,640 Speaker 1: of vague notion of restrictions to immunity. A subsequent case 623 00:39:47,680 --> 00:39:52,120 Speaker 1: brought against Yahoo by a woman named Cecilia Barnes would 624 00:39:52,160 --> 00:39:57,000 Speaker 1: further complicate matters.
Barnes's ex boyfriend created a fake profile 625 00:39:57,160 --> 00:40:01,759 Speaker 1: on Yahoo, claiming that the profile belonged to Cecilia, 626 00:40:01,840 --> 00:40:05,040 Speaker 1: and he included nude pictures of Barnes that he had 627 00:40:05,080 --> 00:40:10,279 Speaker 1: taken without her consent, which is truly horrifying. And then 628 00:40:10,320 --> 00:40:14,760 Speaker 1: he also included her work contact information, and before long, 629 00:40:15,480 --> 00:40:18,319 Speaker 1: men were showing up and trying to contact Cecilia, and 630 00:40:18,360 --> 00:40:19,960 Speaker 1: that must have come as a real shock to her. 631 00:40:20,000 --> 00:40:23,640 Speaker 1: I can't imagine how disruptive that had to have been 632 00:40:23,680 --> 00:40:27,360 Speaker 1: to her life. Cecilia found a link on Yahoo that 633 00:40:27,480 --> 00:40:31,160 Speaker 1: explained what people should do in the event that they 634 00:40:31,160 --> 00:40:35,400 Speaker 1: wanted to claim that a profile that purported to represent them 635 00:40:35,640 --> 00:40:38,360 Speaker 1: was in fact a fake, and it involved sending a 636 00:40:38,400 --> 00:40:41,880 Speaker 1: signed statement and a copy of their ID to Yahoo 637 00:40:42,239 --> 00:40:47,319 Speaker 1: via snail mail. So not exactly the fastest or most 638 00:40:47,360 --> 00:40:50,600 Speaker 1: streamlined of processes, but Barnes went ahead and did it, 639 00:40:51,560 --> 00:40:54,440 Speaker 1: but the Yahoo profile stayed up. Barnes had heard nothing 640 00:40:54,480 --> 00:40:57,880 Speaker 1: back from Yahoo, so she tried it again a couple 641 00:40:57,920 --> 00:41:01,719 Speaker 1: of times and still didn't receive any reply. Then she 642 00:41:01,840 --> 00:41:05,320 Speaker 1: was scheduled to give an interview on local television and 643 00:41:05,840 --> 00:41:09,800 Speaker 1: talk about her experience, when miraculously, a Yahoo representative actually 644 00:41:09,800 --> 00:41:12,880 Speaker 1: reached out to her. Now, that representative was the director 645 00:41:12,920 --> 00:41:17,200 Speaker 1: of communications at Yahoo, and the director of communications promised 646 00:41:17,239 --> 00:41:22,960 Speaker 1: Cecilia that she would take the request from Cecilia 647 00:41:23,160 --> 00:41:26,920 Speaker 1: over to the proper division by hand and make certain 648 00:41:26,960 --> 00:41:30,520 Speaker 1: that the profile was removed. And the profile was not 649 00:41:30,680 --> 00:41:33,640 Speaker 1: removed. It stayed up for another couple of months. So 650 00:41:34,040 --> 00:41:37,000 Speaker 1: Barnes goes and sues Yahoo. Now, I'm going to skip 651 00:41:37,040 --> 00:41:40,440 Speaker 1: most of the court process, but ultimately the case hinged 652 00:41:40,480 --> 00:41:44,040 Speaker 1: on the fact that a Yahoo representative had made a 653 00:41:44,120 --> 00:41:48,879 Speaker 1: promise to do something but then didn't do it, and 654 00:41:49,000 --> 00:41:52,840 Speaker 1: that, the court found, was outside the protections of Section 655 00:41:52,920 --> 00:41:57,359 Speaker 1: two thirty. Ironically, if Yahoo had not reached out at all, 656 00:41:57,440 --> 00:41:59,640 Speaker 1: if the company had just allowed things to keep on 657 00:41:59,680 --> 00:42:01,839 Speaker 1: going as they were going, with the fake profile up, 658 00:42:01,880 --> 00:42:05,880 Speaker 1: and they just never replied to Barnes, Section two thirty 659 00:42:05,920 --> 00:42:09,000 Speaker 1: would still have applied to Yahoo.
It was only because 660 00:42:09,239 --> 00:42:12,520 Speaker 1: the representative had promised to do something and did not 661 00:42:12,680 --> 00:42:16,480 Speaker 1: follow through that the company was found liable. Now, if 662 00:42:16,560 --> 00:42:19,839 Speaker 1: Yahoo had actually pulled down that profile, there also 663 00:42:19,880 --> 00:42:21,480 Speaker 1: would have been nothing to talk about here. So if 664 00:42:21,520 --> 00:42:23,520 Speaker 1: they had done what they said they were going to do, 665 00:42:23,880 --> 00:42:26,560 Speaker 1: there also wouldn't have been a problem. So literally, Yahoo 666 00:42:26,560 --> 00:42:29,080 Speaker 1: went down the one pathway where there still would 667 00:42:29,120 --> 00:42:33,480 Speaker 1: be liability. Barnes actually ultimately withdrew her lawsuit before it 668 00:42:33,480 --> 00:42:36,440 Speaker 1: would go through the entire court process, but that earlier 669 00:42:36,560 --> 00:42:41,320 Speaker 1: finding in court would hold, and it would oddly discourage 670 00:42:41,360 --> 00:42:45,240 Speaker 1: platforms from taking a more active role in moderation, because 671 00:42:45,520 --> 00:42:49,120 Speaker 1: if a site did promise to remove something and then 672 00:42:49,160 --> 00:42:51,960 Speaker 1: they didn't do it in a timely enough manner, they 673 00:42:52,000 --> 00:42:55,360 Speaker 1: could be held liable for that, because they didn't carry 674 00:42:55,360 --> 00:42:58,200 Speaker 1: through on a promise. If they did nothing at all, 675 00:42:58,400 --> 00:43:01,560 Speaker 1: they wouldn't be liable. They'd be protected under Section two thirty. 676 00:43:01,640 --> 00:43:04,680 Speaker 1: And I don't know about you, but doing nothing tends 677 00:43:04,719 --> 00:43:07,520 Speaker 1: to be easier than doing something. I mean, it's even 678 00:43:07,560 --> 00:43:10,600 Speaker 1: easier than promising to do something but not doing it. 679 00:43:10,719 --> 00:43:13,160 Speaker 1: Just not doing anything at all is still the easiest 680 00:43:13,160 --> 00:43:15,960 Speaker 1: thing to do. So in a way, these rulings that 681 00:43:16,080 --> 00:43:20,160 Speaker 1: found limitations to two thirty reinforced the behaviors of companies 682 00:43:20,200 --> 00:43:23,799 Speaker 1: that were reluctant to moderate the content on their platforms. 683 00:43:24,280 --> 00:43:28,280 Speaker 1: More recently, cases have brought to light that Section 684 00:43:28,320 --> 00:43:31,680 Speaker 1: two thirty can play a role in suppressing voices of marginalized 685 00:43:31,680 --> 00:43:34,840 Speaker 1: and vulnerable populations. So, in other words, a piece of 686 00:43:34,920 --> 00:43:39,040 Speaker 1: legislation tied to the spirit of free speech could in 687 00:43:39,120 --> 00:43:43,400 Speaker 1: itself be suppressing the free speech of others. For example, 688 00:43:43,640 --> 00:43:48,000 Speaker 1: while there are plenty of cases of online harassment campaigns 689 00:43:48,040 --> 00:43:52,280 Speaker 1: targeting all sorts of people, women represent a disproportionate number 690 00:43:52,640 --> 00:43:57,800 Speaker 1: of victims of online harassment.
Women, particularly young women, encounter 691 00:43:57,920 --> 00:44:02,520 Speaker 1: sexualized online abuse far more frequently than men do, and 692 00:44:02,600 --> 00:44:05,560 Speaker 1: so there is a real issue of Section two thirty 693 00:44:05,600 --> 00:44:10,919 Speaker 1: providing immunity to platforms that house communities who are perpetuating 694 00:44:10,960 --> 00:44:13,880 Speaker 1: an abusive set of behaviors, which is not great. And 695 00:44:13,960 --> 00:44:17,000 Speaker 1: other vulnerable populations face this too. We see it in 696 00:44:17,080 --> 00:44:22,239 Speaker 1: terms of race and sexual orientation, religious affiliations, political affiliations, 697 00:44:22,280 --> 00:44:25,600 Speaker 1: and more. And that harassment can have the effect of 698 00:44:25,719 --> 00:44:29,239 Speaker 1: silencing the people who are being harassed, so it is 699 00:44:29,280 --> 00:44:34,080 Speaker 1: a form of suppression of the freedom of speech. So 700 00:44:34,440 --> 00:44:37,719 Speaker 1: courts have whittled back a bit of the Section two 701 00:44:37,920 --> 00:44:40,880 Speaker 1: thirty protection, and we have seen a bit more of 702 00:44:40,920 --> 00:44:45,799 Speaker 1: a move towards moderating content on platforms. However, this is 703 00:44:45,840 --> 00:44:49,279 Speaker 1: not out of a fear of legal liability. Instead, it's 704 00:44:49,320 --> 00:44:53,680 Speaker 1: because of consumer demand. We've seen platforms like Facebook and 705 00:44:53,719 --> 00:44:58,279 Speaker 1: YouTube and Twitter get more involved in content moderation, not 706 00:44:58,400 --> 00:45:02,680 Speaker 1: because the government was, you know, not going to provide 707 00:45:02,719 --> 00:45:05,879 Speaker 1: them immunity. The immunity, the legal immunity, was still there. 708 00:45:06,480 --> 00:45:09,719 Speaker 1: It's because users were demanding it, and a failure to 709 00:45:10,040 --> 00:45:14,400 Speaker 1: act could have resulted in users dumping the services. And 710 00:45:14,440 --> 00:45:17,640 Speaker 1: these companies could still operate with an incredible amount of 711 00:45:17,760 --> 00:45:20,759 Speaker 1: legal protection, but that doesn't save them from the consequences 712 00:45:20,760 --> 00:45:25,200 Speaker 1: of people abandoning their business. They need those customers. 713 00:45:25,560 --> 00:45:30,280 Speaker 1: And then, in twenty eighteen, Congress passed a bill that amended Section 714 00:45:30,560 --> 00:45:34,640 Speaker 1: two thirty. It removed protections for any site that knowingly 715 00:45:34,719 --> 00:45:38,960 Speaker 1: contributes to or supports sex trafficking. While the goal of 716 00:45:39,000 --> 00:45:42,319 Speaker 1: eliminating the support of sex trafficking is a really good one, 717 00:45:42,320 --> 00:45:46,560 Speaker 1: one we absolutely need to focus on, the actual 718 00:45:46,640 --> 00:45:50,920 Speaker 1: bill itself would receive some criticism, not for its purpose 719 00:45:51,000 --> 00:45:53,960 Speaker 1: but for its creation, like its wording.
So law 720 00:45:54,000 --> 00:45:58,439 Speaker 1: professor Eric Goldman wrote that quote, as a result, liability 721 00:45:58,520 --> 00:46:02,600 Speaker 1: based on knowledge pushes internet companies to adopt one of 722 00:46:02,760 --> 00:46:08,200 Speaker 1: two extreme positions: moderate all content perfectly and accept the 723 00:46:08,280 --> 00:46:12,720 Speaker 1: legal risk for any errors, or don't moderate content at all, 724 00:46:12,760 --> 00:46:16,719 Speaker 1: as a way of negating knowledge, end quote. So when you think 725 00:46:16,760 --> 00:46:20,399 Speaker 1: of it that way, if the law says, if 726 00:46:20,440 --> 00:46:23,839 Speaker 1: you know that this is happening, you're obligated to stop it, 727 00:46:23,960 --> 00:46:26,880 Speaker 1: or if not, you're going to be held liable, then 728 00:46:28,040 --> 00:46:31,040 Speaker 1: that also opens up the opportunity to quote unquote not 729 00:46:31,280 --> 00:46:35,120 Speaker 1: know it is happening. It creates an incentive to not 730 00:46:35,320 --> 00:46:39,560 Speaker 1: get involved, which is like the earlier problems that I 731 00:46:39,600 --> 00:46:42,759 Speaker 1: mentioned in this episode, but, you know, more. But any 732 00:46:42,800 --> 00:46:45,440 Speaker 1: system designed by humans is going to be imperfect, right? 733 00:46:45,840 --> 00:46:48,600 Speaker 1: And that brings us up to this year, where we're 734 00:46:48,600 --> 00:46:51,840 Speaker 1: seeing various politicians and others calling for an end, or 735 00:46:51,880 --> 00:46:56,160 Speaker 1: at least an amendment, to Section two thirty. President Trump 736 00:46:56,360 --> 00:47:00,080 Speaker 1: appears angry that Twitter, for example, has flagged many of 737 00:47:00,120 --> 00:47:03,799 Speaker 1: his tweets as containing misleading information. He has gone so 738 00:47:03,840 --> 00:47:06,040 Speaker 1: far as to call Section two thirty a threat to 739 00:47:06,120 --> 00:47:09,919 Speaker 1: national security, which echoes something that was actually argued back 740 00:47:09,920 --> 00:47:13,279 Speaker 1: in the nineteen seventies that related to the publication of 741 00:47:13,320 --> 00:47:16,919 Speaker 1: the Pentagon Papers. Uh, that tactic didn't work then, and 742 00:47:16,960 --> 00:47:19,480 Speaker 1: I don't think it's really gonna work now. And it 743 00:47:19,520 --> 00:47:23,880 Speaker 1: doesn't help that the root of the problem, which is misinformation, 744 00:47:24,280 --> 00:47:28,600 Speaker 1: is really what's to blame here. There's also a misinterpretation 745 00:47:28,640 --> 00:47:31,120 Speaker 1: of Section two thirty going on here, because Trump has 746 00:47:31,200 --> 00:47:35,279 Speaker 1: argued that platforms must be neutral in their approach, which 747 00:47:35,400 --> 00:47:38,520 Speaker 1: just isn't true. It's not part of the original law 748 00:47:38,600 --> 00:47:41,600 Speaker 1: at all, nor is that an interpretation that's been supported 749 00:47:41,640 --> 00:47:45,160 Speaker 1: in the numerous court cases that have shaped the practice 750 00:47:45,280 --> 00:47:48,360 Speaker 1: of Section two thirty. Nowhere does it state that a 751 00:47:48,400 --> 00:47:50,960 Speaker 1: platform has to be neutral.
In fact, some of 752 00:47:51,000 --> 00:47:54,520 Speaker 1: the most famous cases in which Section two thirty protections 753 00:47:54,520 --> 00:47:59,000 Speaker 1: were upheld involved content platforms that were most assuredly 754 00:47:59,440 --> 00:48:02,160 Speaker 1: not neutral, and in at least a few cases, 755 00:48:02,160 --> 00:48:06,400 Speaker 1: they were arguably downright maliciously biased. Now, last year, in 756 00:48:06,440 --> 00:48:11,200 Speaker 1: twenty nineteen, Senator Josh Hawley, a Republican from Missouri, introduced 757 00:48:11,280 --> 00:48:14,680 Speaker 1: legislation under which any online service with more than 758 00:48:14,760 --> 00:48:19,359 Speaker 1: thirty million US users or three hundred million users globally, or 759 00:48:19,360 --> 00:48:21,800 Speaker 1: with a revenue of at least five hundred million dollars, 760 00:48:22,080 --> 00:48:25,759 Speaker 1: would be required to take a politically neutral stance when 761 00:48:25,760 --> 00:48:29,400 Speaker 1: it came to moderating content in order to qualify for 762 00:48:29,520 --> 00:48:33,279 Speaker 1: Section two thirty protection. So the implication here is that 763 00:48:33,360 --> 00:48:37,560 Speaker 1: the platforms have a bias against a particular political philosophy. 764 00:48:37,600 --> 00:48:42,760 Speaker 1: In this case, he's arguing that they are biased against conservatives, 765 00:48:43,000 --> 00:48:47,719 Speaker 1: and therefore, when these platforms moderate content, they tend to 766 00:48:47,760 --> 00:48:52,319 Speaker 1: do so disproportionately to the detriment of conservative voices. Not 767 00:48:52,480 --> 00:48:55,719 Speaker 1: many people are taking this particular proposal seriously, because it 768 00:48:55,719 --> 00:48:59,000 Speaker 1: would probably get torn to shreds under First Amendment arguments 769 00:48:59,000 --> 00:49:01,600 Speaker 1: in court. It wouldn't hold up to scrutiny at all. 770 00:49:02,280 --> 00:49:05,439 Speaker 1: Other proposals aimed to follow the path of the twenty eighteen 771 00:49:05,480 --> 00:49:09,000 Speaker 1: amendment that covered sex trafficking, with the idea that you 772 00:49:09,040 --> 00:49:13,000 Speaker 1: could do the same thing with other exceptions to the 773 00:49:13,120 --> 00:49:16,160 Speaker 1: Section two thirty protection. But one potential problem with that 774 00:49:16,200 --> 00:49:18,759 Speaker 1: approach is that it creates a real mess as far 775 00:49:18,800 --> 00:49:22,680 Speaker 1: as what two thirty does and doesn't apply to, and 776 00:49:22,719 --> 00:49:25,359 Speaker 1: it could potentially reach a point where it's harder to 777 00:49:25,400 --> 00:49:30,480 Speaker 1: tell under what conditions platforms have protection versus the conditions 778 00:49:30,520 --> 00:49:32,880 Speaker 1: where they don't, and it would place a very heavy 779 00:49:32,920 --> 00:49:36,840 Speaker 1: burden on the court system to determine, through various lawsuits, 780 00:49:37,160 --> 00:49:40,480 Speaker 1: if the defendants, that being the platforms, had met the 781 00:49:40,600 --> 00:49:43,440 Speaker 1: legal burden to qualify for two thirty protection. So, in 782 00:49:43,440 --> 00:49:47,520 Speaker 1: other words, that's not ideal either.
Then there's the possibility 783 00:49:47,800 --> 00:49:51,520 Speaker 1: that Congress will just repeal two thirty totally, which would 784 00:49:51,640 --> 00:49:54,080 Speaker 1: have been terrifying to companies back in the nineties when 785 00:49:54,080 --> 00:49:57,800 Speaker 1: they were still trying to establish themselves. But frankly, two 786 00:49:58,000 --> 00:50:02,000 Speaker 1: thirty protection is an American thing. In other places, 787 00:50:02,040 --> 00:50:06,000 Speaker 1: like Europe, these broad protections don't exist in that form, 788 00:50:06,040 --> 00:50:10,320 Speaker 1: and yet social networking platforms, forums, that kind of stuff, 789 00:50:10,360 --> 00:50:13,000 Speaker 1: they still operate in those places. Now, granted, they have 790 00:50:13,040 --> 00:50:15,600 Speaker 1: to do so while following a more strict set of rules, 791 00:50:16,040 --> 00:50:18,479 Speaker 1: and it's a pain in the butt. But we should 792 00:50:18,520 --> 00:50:21,719 Speaker 1: also remember that the big tech companies that this would 793 00:50:21,719 --> 00:50:26,120 Speaker 1: affect are also heavily involved in lobbying efforts in politics. 794 00:50:26,160 --> 00:50:28,719 Speaker 1: So how likely is it that we're going to see 795 00:50:28,800 --> 00:50:32,799 Speaker 1: Section two thirty completely repealed? I honestly don't know, but 796 00:50:32,840 --> 00:50:35,400 Speaker 1: I think it would be a steep, uphill battle, because 797 00:50:35,400 --> 00:50:37,640 Speaker 1: you've got a lot of money from these tech companies 798 00:50:38,040 --> 00:50:41,680 Speaker 1: influencing a lot of politicians. According to a great piece 799 00:50:41,680 --> 00:50:45,120 Speaker 1: in Ars Technica titled Section two thirty, the Internet Law 800 00:50:45,200 --> 00:50:48,800 Speaker 1: Politicians Love to Hate, Explained, a law professor at the 801 00:50:48,880 --> 00:50:53,440 Speaker 1: University of Maryland named Danielle Citron and researcher Ben Wittes 802 00:50:53,600 --> 00:50:57,239 Speaker 1: from the Brookings Institution suggest that two thirty should be 803 00:50:57,239 --> 00:51:00,560 Speaker 1: amended so that the platforms receive immunity only if they 804 00:51:00,640 --> 00:51:05,080 Speaker 1: take quote reasonable steps to prevent or address unlawful uses 805 00:51:05,160 --> 00:51:07,960 Speaker 1: of its services end quote, leaving a lot of that 806 00:51:08,040 --> 00:51:11,280 Speaker 1: language up for interpretation in the courts. So the idea 807 00:51:11,360 --> 00:51:14,520 Speaker 1: being that you should be fine, you should be immune, 808 00:51:14,640 --> 00:51:19,040 Speaker 1: as long as you can prove that whenever bad stuff 809 00:51:19,080 --> 00:51:21,200 Speaker 1: is happening on your platform, you're doing your best to 810 00:51:21,560 --> 00:51:27,160 Speaker 1: stop it. Um, so, in cases where a platform was notified, hey, 811 00:51:27,760 --> 00:51:31,440 Speaker 1: some other user has published my private information on your 812 00:51:31,440 --> 00:51:34,480 Speaker 1: platform without my permission, take it down, they would actually 813 00:51:34,520 --> 00:51:38,040 Speaker 1: go and take it down. Right now,
Section two thirty applies 814 00:51:38,080 --> 00:51:40,560 Speaker 1: to those companies whether they take anything down or not, 815 00:51:41,120 --> 00:51:44,920 Speaker 1: and that has led to some pretty tragic circumstances in 816 00:51:44,960 --> 00:51:49,080 Speaker 1: the lives of people who have been affected by malicious 817 00:51:49,200 --> 00:51:53,760 Speaker 1: users of various services out there. Now, as I said, 818 00:51:54,360 --> 00:51:58,720 Speaker 1: this is a complicated subject. There is a real need 819 00:51:58,800 --> 00:52:03,640 Speaker 1: to protect freedom of speech, because without it, if companies 820 00:52:03,680 --> 00:52:06,520 Speaker 1: can be held liable for everything that users write, we're 821 00:52:06,520 --> 00:52:09,799 Speaker 1: gonna see a disappearance of all of those things that 822 00:52:09,840 --> 00:52:12,839 Speaker 1: we take for granted now. I mean, social networks would 823 00:52:12,840 --> 00:52:15,720 Speaker 1: be totally different. We wouldn't be able to leave reviews, 824 00:52:15,760 --> 00:52:19,120 Speaker 1: because any company that didn't like a review could end 825 00:52:19,200 --> 00:52:24,479 Speaker 1: up suing the marketplace for hosting that review. It would 826 00:52:24,480 --> 00:52:27,880 Speaker 1: be a huge mess. So we do need something there. 827 00:52:27,920 --> 00:52:32,120 Speaker 1: At the same time, we have to address that the 828 00:52:32,239 --> 00:52:37,360 Speaker 1: rules as they stand right now do disproportionately affect vulnerable 829 00:52:37,360 --> 00:52:40,799 Speaker 1: populations in a negative way. And we need to fix that, 830 00:52:41,920 --> 00:52:44,640 Speaker 1: and we've got to figure out how to give more 831 00:52:44,680 --> 00:52:48,880 Speaker 1: incentives for platforms to take an active role in moderating 832 00:52:48,920 --> 00:52:53,160 Speaker 1: the content that appears on those platforms. And to me, 833 00:52:53,239 --> 00:52:57,000 Speaker 1: that's a tough, tough nut to crack, because there's 834 00:52:57,040 --> 00:52:59,680 Speaker 1: not a whole lot of financial incentive to do it unless, 835 00:53:00,080 --> 00:53:03,400 Speaker 1: as we've seen, there's a threat of people leaving 836 00:53:03,640 --> 00:53:08,120 Speaker 1: those platforms. Otherwise, there's more of a financial incentive to 837 00:53:08,200 --> 00:53:11,600 Speaker 1: keep it up there. So it's a complicated situation. But 838 00:53:11,719 --> 00:53:14,840 Speaker 1: I hope that this helps you have an understanding of 839 00:53:14,880 --> 00:53:17,440 Speaker 1: what Section two thirty is, what it was intended to do, 840 00:53:17,560 --> 00:53:20,719 Speaker 1: and what it actually has done, because, as we know, 841 00:53:21,600 --> 00:53:26,120 Speaker 1: often we will create a construct planning for it to 842 00:53:26,160 --> 00:53:29,040 Speaker 1: do one thing, only to see it go off and 843 00:53:29,160 --> 00:53:32,160 Speaker 1: rampage through the village and, you know, throw a little 844 00:53:32,160 --> 00:53:35,120 Speaker 1: girl in the river. That's a Frankenstein reference, although I 845 00:53:35,120 --> 00:53:36,560 Speaker 1: think it might have been a little boy in the book. 846 00:53:36,600 --> 00:53:38,880 Speaker 1: I don't remember. I haven't read it in a long time. Anyway, 847 00:53:38,920 --> 00:53:42,600 Speaker 1: that wraps up this discussion of Section two thirty on 848 00:53:42,680 --> 00:53:45,719 Speaker 1: tech Stuff. Hope you guys learned something.
I hope you 849 00:53:45,800 --> 00:53:48,040 Speaker 1: heard my dog shaking his head in the background. 850 00:53:48,400 --> 00:53:51,560 Speaker 1: If you have any suggestions for future topics for tech Stuff, 851 00:53:51,600 --> 00:53:54,160 Speaker 1: reach out to me. You can do so on Twitter, 852 00:53:54,560 --> 00:53:58,640 Speaker 1: where, unless they start moderating your comments, I'll see it, 853 00:53:59,120 --> 00:54:02,320 Speaker 1: and the handle for that is tech Stuff H S W. 854 00:54:02,920 --> 00:54:11,000 Speaker 1: I'll talk to you again really soon. Tech Stuff 855 00:54:11,080 --> 00:54:14,200 Speaker 1: is an I Heart Radio production. For more podcasts from 856 00:54:14,239 --> 00:54:18,040 Speaker 1: I Heart Radio, visit the I Heart Radio app, Apple Podcasts, 857 00:54:18,160 --> 00:54:20,160 Speaker 1: or wherever you listen to your favorite shows.