Speaker 1: Welcome to Tech Stuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to Tech Stuff. I am your host, executive producer Jonathan Strickland. I'm with How Stuff Works and iHeartRadio, and a lover of all things tech, and boy, do we have a story to talk about today. So in the United States, a Court of Appeals upheld a ruling from two thousand eighteen that stated that it was a violation of the First Amendment for Donald Trump, the President of the United States, to block users on Twitter. So in this episode, I'm going to tackle the very complicated issue of social media, free speech, government communication, and more. And this is not going to be a political episode in the sense that I am going to espouse a particular political philosophy. In other words, I'm not going to use this episode to argue that one political outlook is better than any other political outlook. I'm not interested in doing that on this show, ever. Rather, this is about the concept of free speech and why the digital age has made it an even more complicated subject than it was before. And it was already pretty thorny, so this is sort of an unbiased approach to it.

Speaker 1: Now, I'm going to start off with the actual news story that prompted this episode, and then we'll dive into a deeper look at the First Amendment before we transition to the complications of applying that concept to social media networks. It all stems from a lawsuit that was first filed on July eleventh, two thousand seventeen, in the United States District Court for the Southern District of New York. The plaintiffs were Twitter users who had been blocked by Donald Trump on Twitter after he had taken the position of President of the United States, and it meant that they could no longer see what he posted, nor would their messages be visible to him.
Speaker 1: The Knight First Amendment Institute at Columbia University would represent this group and argue that this amounted to a violation of the First Amendment guarantee of free speech. The judge in that case found in favor of the plaintiffs, saying that yes, this is a violation of their First Amendment rights. Then the government filed an appeal, and the case went to the United States Court of Appeals for the Second Circuit, and that court upheld the earlier ruling. Now, at this stage, the government may either drop the matter and presumably obey the court, or it may appeal to the Supreme Court, which would be the final arbiter in matters of justice in the United States. But the Supreme Court in the United States hears only a small percentage of all the cases that are submitted to it. Thousands of cases are submitted to it each year, and it will hear something like less than three percent of them. So there's no guarantee that the Supreme Court would choose to hear this case, and if it did, there's no guarantee that it would find any differently than the earlier courts had.

Speaker 1: So the ruling says the president, and presumably any elected official, cannot use social media to communicate matters relating to government activities in any sort of official way and still block citizens, because that infringes on the First Amendment rights of those citizens. So now let's start breaking down what all this means before we get into how it's really complicated in the digital age. First, for those not versed in the Constitution of the United States of America, let's have a quick civics lesson. The Constitution is the basic foundation of law in the United States. It is the bedrock upon which all other law is built. Should a law be challenged in court as violating the Constitution, the court is to strike down that law.
Speaker 1: The body of the Constitution includes seven articles that lay out the basic structure of the American government, and they cover such things as the three branches of government. That includes the legislative branch that makes the laws; that would be the Senate and the House of Representatives. Then you have the executive branch that executes the laws and acts as commander in chief of the armed forces; that would be the president, vice president, that group. Then you have the judicial branch that, well, judges. They're the ones who actually decide if laws are constitutional or not, as well as decide the individual federal cases that come up in courts. And the other articles lay out the basic rules that bind states together and establish where the federal system fits in with the state system, as well as the procedures the government has to follow to amend the Constitution and to ratify it. So that's the basic document.

Speaker 1: Then you have the first ten amendments to the Constitution, also known as the Bill of Rights, and we're only concerned with the First Amendment, which reads as follows. This is the actual wording in the Constitution: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."

Speaker 1: Now we need to get a few other things out of the way. The First Amendment only applies to the public sector and the government's role in those matters. So, according to the U.S. Constitution, no government in the United States may restrict or deny a citizen's speech or right to redress grievances. That's not to say a private sector company couldn't do the same. So if I have a business and own some land, and I create a whole corporation, I can to some extent tell my employees or even customers that certain types of expression are not tolerated.
Speaker 1: This is what allows social media platforms to take action and ban users. The First Amendment protects against the government violating freedom of speech, but it doesn't protect against Facebook or Twitter taking away your ability to post on those platforms. Now, this too has limitations, since there are other laws, such as anti-discrimination laws, that make it illegal to act in a discriminatory manner even in the private sector. So if I said people of a certain religion were not allowed in my space, I'd be violating anti-discrimination laws, even though the First Amendment itself wouldn't be a factor, because I'm not a government agent. I'm a representative of the private sector. But the anti-discrimination laws still cover that particular arena.

Speaker 1: Further, in court cases, judges have determined that the freedom of speech in itself has limitations. It is not without its own restrictions, which sounds like a contradiction, right? Freedom of speech, with restrictions? Well, you have to think about the specific cases. For one thing, you could use your own freedom of expression to restrict someone else's same freedoms, such as by intimidating them into silence or otherwise harassing them or being a problem. So things like threats are not protected by the First Amendment. You cannot make threats to people and claim First Amendment protection; that is not protected free speech. In general, you're not allowed to use freedom of speech to harm others in order to get whatever it is that you want. You're also not allowed to make claims that you know to be false, because that's fraud. Similarly, you can't make claims about people, as in libel or slander, where you defame someone using untrue or unsubstantiated information. That's not protected by free speech either. Now, that's not the same thing as saying that it will always be punished. It could be punished, but it isn't necessarily going to be punished.
Speaker 1: Really, it's just saying that if you were found guilty of acting in that way, you could not claim the First Amendment for protection. You couldn't say it's my freedom of speech to say these things if in fact you were found guilty of committing libel or slander. Or, as a well known and frequently erroneously attributed phrase goes, the right to swing your arm ends at the other man's nose. The rights of one person cannot be interpreted in such a way that they infringe upon the rights of another person. So you do have rights, up to the point where you act in such a way that you cause harm to other people. If you start causing harm to other people, that's not within your rights, because that would be a violation of their rights. Okay, so the First Amendment states that the government shall not restrict expression, and it further states that Congress cannot restrict the rights of citizens to, quote, "petition the government for a redress of grievances," end quote. These are important concepts that come into play with the legal rulings around Twitter and other social media platforms.

Speaker 1: The next complication we need to look at is the nature of social media accounts. Donald Trump has had his Twitter account since March two thousand nine, which was many years before he became president of the United States. So when he established his Twitter account, he was a private citizen. He wasn't a member of the public sector, and as such, as a private citizen, he could block anyone he chose, just as any other user can, for any reason he or she likes. This is not a violation of the First Amendment. As a private user, just because people have the right to say stuff doesn't mean you're obligated to listen to it. At least if you're a private citizen, you can ignore it. And we all know how communication on the Internet can quickly become little more than a flame war or worse.
Speaker 1: I'm sure many of you out there have, at some point or another, blocked someone in an effort to preserve at least some sense of sanity. I know that I have. Now, if Trump had refrained from using his Twitter account to communicate matters of official government business, perhaps this ruling would have gone another way. It's impossible to say for certain it would have gone another way; it requires a what-if scenario, and that what-if scenario did not happen. If Trump as president had operated his Twitter account as a private citizen and not included anything that remotely related to his official capacity as president, it would be hard to argue that he violated anyone's First Amendment rights by blocking them. This would assume a total separation of Trump the person and Trump the president, which I think would be challenging to argue, but I am no expert on that matter.

Speaker 1: However, the courts have stated that the way Trump has used his Twitter account since becoming president provides an overwhelming amount of evidence that it counts as presidential communication to the public. As such, he cannot block members of that public from being able to see or respond to his messaging. As the First Amendment states, he can't prevent citizens from petitioning the government for a redress of grievances. Since he uses Twitter to communicate government activities, including such stuff as hiring or firing members of his staff, intended changes to policies, or announcing executive orders, he is constitutionally obligated to keep those channels open. Now, this is an issue that goes beyond Trump. People from both the Republican and Democratic parties have had similar cases brought against them.
Speaker 1: The Secretary of State for Alabama, a guy named John Merrill, has had a lawsuit against him, and still does, from plaintiffs who argue that his blocking them on Twitter constitutes much the same matter as Trump's case, though Merrill disputes this, and that lawsuit has seen no resolution as of the recording of this podcast. In two thousand seventeen, a Virginia District Court judge ruled against the chairwoman of the Loudoun County Board of Supervisors, Phyllis Randall. She had banned a man named Brian Davison from her Facebook page after Davison had left critical comments in the messages when she was talking to voters. Randall had argued that this was her personal Facebook page, that it wasn't her official page in her role as chairwoman, but the judge stated that she clearly was using it in an official capacity because she would solicit comment from the voting public in her posts, and therefore she could not exclude members of the public from that activity. Very recently, Congresswoman Alexandria Ocasio-Cortez has been sued for blocking Joseph Saladino, a YouTube personality who is currently seeking office in the United States, on Twitter. Saladino's argument is that the same rules that apply to Trump have to apply to all elected officials. Now, when we come back, we'll look at some of these issues a little more closely and talk about the difficult landscape we now find ourselves in. But first I'm going to take a quick break.

Speaker 1: I'm going to guess that most of you listening to this podcast have spent a decent amount of time on the Internet. You've probably all heard the common wisdom: never read the comments. You know that the Internet, while it can give us access to entertainment, information, and channels of communication, also facilitates trolling, hate speech, harassment, all sorts of unpleasantness.
Speaker 1: The old adage of "ignore them and they'll go away" hasn't proven to be very effective, particularly in an age where some people will go to such lengths as doxing, digging up and then sharing private information about people, or swatting, in which a malicious person gives false information to law enforcement officials to initiate what amounts to an attack on a person's home or office. It can be an ugly world out there, and there are lots of different reasons why these things happen. I'll have to have a more in-depth discussion about that at some point, about the psychology of trolling and the culture of trolling and what enables it and what feeds it, but that's a matter for a different episode. The reason I bring it up here is that the ability to block or ban people is, for some of us, what makes it bearable to even use the platforms available on the Internet in the first place. And the courts recognized this as well. And here's where we encounter one of the many sticky points that are hard to resolve.

Speaker 1: Before the break, I mentioned the case of Phyllis Randall, the politician in Virginia who was reprimanded for blocking a voter on her Facebook page. Now, I should point out she only instituted a twelve-hour ban, and the judge in the case took that into account. The judge also stated that, quote, "government officials have at least a reasonably strong interest in moderating discussion on their Facebook pages in an expeditious manner. By permitting a commenter to repeatedly post inappropriate content pending a review process, a government official could easily fail to preserve their online forum for its intended purpose," end quote. So, in other words, a politician has the right, and arguably the responsibility, to at least block or delete certain comments that pop up, because they may not be germane to the matter at hand, or they may be meant to derail the discussion or even attack other people involved in that discussion.
Speaker 1: These actions would, quote, "impinge on the First Amendment rights," end quote, of the other folks who are using that forum. So now we're talking about a case-by-case issue. An elected official can't ban a voter from participating in a public forum, even if that public forum is the official's own social media account or page, whether it's on Twitter or Facebook or whatever. So you can think of it as a private platform but a public forum, which is another issue that's a problem here. But the official is allowed to moderate comments, and that means the official has to be able to determine whether or not a comment counts as being relevant or being disruptive.

Speaker 1: Now, we need to remember that elected officials, despite occasional evidence to the contrary, are human beings. Some of them are awful people, sure. Some of them are wonderful people. Some of them may not show their awfulness or wonderfulness in their capacity as an elected official. But people, as a general rule, are not super happy when other people disagree with them. Some of us, and I'm including myself in this particular category, can get a bit huffy about it. Later on, when the initial and often irrational reaction has settled, I will find myself acknowledging that maybe, just maybe, I was wrong. That's not a fun realization, but I wish it were one that would occur to me earlier in the process, because it could have saved me from some nasty exchanges that I otherwise could have avoided. I would have ruffled fewer feathers, and I would have strained fewer friendships. But what I'm really getting at is I could easily imagine myself being in one of these situations in which I post something as an elected official, and then I see a dissenting opinion worded in such a way that I consider it an attack or an insult, and so, boop, I delete the comment.
Speaker 1: But I can also imagine that later on I might cool down and realize, you know, maybe the comment was worded in a confrontational way, but ultimately that was a voter expressing a point of view that's different from my own, but no less valid than my own. So I imagine that for at least some people, being able to be objective and separate the critical comments that have validity from the disruptive or insulting or irrelevant comments can sometimes be a challenge. And to those people, I suggest you get a social media manager who isn't the person directly making the statements, but rather can act as a more objective judge on the matter. Goodness knows I wouldn't be able to do this for myself, and this is one of many reasons why everyone should be glad I am not a politician. I also want to acknowledge there are plenty of remarkably cool-headed people in politics who can handle this sort of stuff a billion times better than I can.

Speaker 1: So if an elected official in the United States uses a social media platform in any capacity related to their office, they should, in theory, obey these rules. On Twitter that can get a little tricky. There are verified accounts, which are great because you know for a fact who you're dealing with, since Twitter has verified that the account belongs to a particular person, and you could say, oh, this is actually a citizen, this is someone I cannot block. This person is a citizen, I'm an elected official, and this person represents one of my constituents. They are either voting for me or against me, so they need to have this channel. But there are a lot of accounts that could be dummy accounts or bots, or even foreign agents attempting to sow discord and chaos and more. There are troublemakers who just want to stir stuff up, either because they have a specific agenda that involves derailing a politician, or because they just get a kick out of being a nuisance. And then you've got the truly toxic stuff out there.
Speaker 1: People who threaten, intimidate, insult, and harass mercilessly. These are behaviors that wouldn't be tolerated in public spaces in the real world, so how do you handle it in the online world? Well, one thing that could happen is the platforms themselves could step up and enforce some of the rules they've said are in place. Twitter representatives occasionally talk about forming new rules, while several critics have in the past pointed out that new rules weren't necessarily needed; rather, what was needed was enforcement of the rules that were already in place. But I do want to be fair to Twitter. One of the issues the service has had is that its hate speech policy was previously only applied at the individual level. So if someone were to direct hate speech at me personally, Twitter could step in and intervene, because that would be a violation of Twitter's policy. However, if someone were to attack a general group of people, like a specific race or a religious group, but was not targeting a specific individual, that was not covered by Twitter's policies; it wasn't explicitly against the rules. So the company has since revised those rules so that the hate speech rules have a broader application.

Speaker 1: The platforms, being in the private sector and offering a service, aren't bound by the same restrictions that the elected officials have to comply with. I've used this analogy in conversations with friends. Imagine for a moment that an official government facility, let's say a state legislature building, is actually built on private land, and you are the landowner, and you've drawn up an agreement that gives you the right to close that land off to anyone you like at any time. Somehow no one really noticed or objected to that clause, and it was all signed in the agreement. Then one day you just arbitrarily decide to shut off access to the legislature building, and you prevent lawmakers from going to work.
It's a 363 00:22:30,480 --> 00:22:33,080 Speaker 1: rather implausible example, but it's kind of similar to what 364 00:22:33,119 --> 00:22:36,400 Speaker 1: we have with these social platforms, and that raises some 365 00:22:36,560 --> 00:22:39,440 Speaker 1: alarm flags or red warning flags. I guess I should 366 00:22:39,440 --> 00:22:43,800 Speaker 1: say not alarms flags can't alarm very much, but red 367 00:22:43,840 --> 00:22:47,000 Speaker 1: warning flags to the minds of people who see potential 368 00:22:47,119 --> 00:22:49,960 Speaker 1: for problems in the future. With this, now, I can't 369 00:22:49,960 --> 00:22:53,439 Speaker 1: imagine any reality in which Facebook or Twitter makes a 370 00:22:53,480 --> 00:22:56,920 Speaker 1: move to completely remove a government account or the account 371 00:22:56,920 --> 00:22:59,480 Speaker 1: of an elected official. You could make the argument that 372 00:22:59,560 --> 00:23:03,080 Speaker 1: the elected officials themselves wouldn't be allowed to do that either, 373 00:23:03,440 --> 00:23:06,000 Speaker 1: because it would be equivalent to destroying the records of 374 00:23:06,040 --> 00:23:10,480 Speaker 1: public discourse on official matters. So if you happen to 375 00:23:10,480 --> 00:23:13,080 Speaker 1: be an elected official and you've created a Twitter account, 376 00:23:13,160 --> 00:23:16,320 Speaker 1: deleting that Twitter account could be a big problem because 377 00:23:16,840 --> 00:23:20,080 Speaker 1: there's going to be records of communication with actual constituents 378 00:23:20,119 --> 00:23:23,399 Speaker 1: in that account. But the social media networks are in 379 00:23:23,440 --> 00:23:26,399 Speaker 1: the private sector, so they're not bound by the rules 380 00:23:26,640 --> 00:23:30,879 Speaker 1: that the government is. They're not likely to act in 381 00:23:30,920 --> 00:23:34,600 Speaker 1: this way because, for one thing, these are companies operating 382 00:23:34,600 --> 00:23:37,840 Speaker 1: and headquartered in the United States. Both Facebook and Twitter 383 00:23:37,960 --> 00:23:40,959 Speaker 1: and others have been active with lobbyists to try and 384 00:23:41,040 --> 00:23:44,399 Speaker 1: shape policy that favors these companies. It would be a 385 00:23:44,440 --> 00:23:48,560 Speaker 1: bit counterproductive to take such an antagonistic stance against the 386 00:23:48,600 --> 00:23:53,359 Speaker 1: government of their home country. Banning individual users who aren't 387 00:23:53,560 --> 00:23:57,639 Speaker 1: government officials is a little less controversial, though most social 388 00:23:57,640 --> 00:24:01,159 Speaker 1: platforms aren't keen to do that very frequently either. If 389 00:24:01,200 --> 00:24:04,119 Speaker 1: Twitter figures out an account is a duplicate and is 390 00:24:04,160 --> 00:24:08,120 Speaker 1: spreading disinformation or being used to attack people. It can 391 00:24:08,200 --> 00:24:11,359 Speaker 1: ban those accounts, and it certainly has banned plenty of 392 00:24:11,480 --> 00:24:14,479 Speaker 1: empty shell accounts that were created simply for the purpose 393 00:24:14,560 --> 00:24:17,800 Speaker 1: of boosting follower accounts. But in the case of those 394 00:24:17,880 --> 00:24:20,680 Speaker 1: empty accounts, this was an easy call because they didn't 395 00:24:20,720 --> 00:24:24,479 Speaker 1: represent real people and they were creating a fake sense 396 00:24:24,520 --> 00:24:28,280 Speaker 1: of influence. 
Speaker 1: Celebrities who had millions of followers saw those numbers reduced dramatically in the wake of these bans, and that suggested that a large number of those followers weren't real people. But making that call was easy. Going after troublemakers appears to be more difficult, or at the very least, the platforms seem more reluctant to take action in those cases. Part of that might be due to a fear that taking action would result in a backlash against the platforms. Now, in an ideal world from the point of view of a company like Twitter or Facebook, the platform would remain unbiased, objective, and hands-off. It wouldn't interfere at all. It would just be a place for discussion, free of responsibility for the content of those discussions. Hey, it's not my fault if hate speech is going across this platform; I just made the platform, I didn't make the hate speech. But in the wake of various controversies involving misinformation and manipulation, that position is largely untenable.

Speaker 1: Now, there has been a fear among organizations like the Electronic Frontier Foundation, also known as the EFF, that the courts could find that a private sector platform like Facebook or Twitter could be viewed as a public forum itself, and thus the First Amendment restrictions would apply to those platforms as acting in that capacity. But in June two thousand nineteen, the Supreme Court of the United States made a decision that appears to alleviate that concern somewhat. The case was one in which a nonprofit organization was essentially hired by the City of New York to operate public access channels in the city. A couple of producers created a film for those public access channels, and that led to some complaints against the producers, and the nonprofit organization disciplined the producers as a result. The producers then sued this nonprofit organization, claiming that their First Amendment rights were being violated, and this prompted an argument over whether the nonprofit should be held to those standards.
Speaker 1: Is the First Amendment at all relevant here? On its own, the nonprofit isn't a government office or agency. It's a nonprofit organization; it's not related to the government. But, argued the plaintiffs, since the nonprofit was working on behalf of the city, and the city is a government agency, it should be held to the same rules that the government is. Now, that could also apply to social networks. They are not created to be channels of communication between governments and their citizens; they're just communication channels, period. Had the Supreme Court ruled that the nonprofit was effectively a surrogate for the government, the same could be argued for Twitter or Facebook. However, the Supreme Court didn't rule that way. The five conservative justices ruled that the nonprofit did not constitute a governmental agent, and therefore the First Amendment rules weren't applicable. The liberal justices, by the way, ruled the opposite way, arguing that the nonprofit was in fact effectively a government agent in its role of overseeing public access channels. And I can see the merits of either side of this argument. It's not that easy; it's not something that I could quickly make a judgment on. I would really have to think on it, and it's complicated. Moreover, I can definitely imagine how big a mess it would be if the ruling had gone the other way and all online platforms suddenly could be bound by those restrictions whenever any elected official or any government agency was making use of those platforms in an official way. It's a pretty tough, complicated issue. Now, I have more to say on the matter, but first let's take another quick break.

Speaker 1: Let's get back to the concept of blocking folks on Twitter and how that relates to politics.
Speaker 1: I want to be fair to the judge in the original case against President Trump, because the judge recognized that there is another option besides whether or not you can ban someone, or whether or not you have to endure an enormous amount of abuse because you aren't allowed to ban or block anybody. The judge stated that officials could mute people on Twitter, not block them, and muting is different from blocking. When User A blocks User B on Twitter, not only will User A no longer see any posts from User B, whether they were directed at User A or not, User B will no longer be able to read any of User A's posts. It will be as if each user had stopped existing in the eyes of the other one, at least on Twitter. And that was the crux of one of the arguments: if Trump blocks voters, they can't see what Trump is posting, and if Trump is posting in an official capacity as president, and those people are voters in America, that violates the First Amendment. But let's say User A doesn't block User B. Instead, User A mutes User B. In that instance, User A won't see any of User B's posts, because they've been muted, but User B will still be able to see all of User A's posts. The judge stated that this would be like a real-world setting in which a politician addresses a crowd and chooses to ignore someone who is speaking out against that politician. Yes, the outspoken person has the right to be part of that public forum, and yes, that person also has the freedom of expression, but the politician isn't obligated to actually pay attention to it. Now, I appreciate that argument. It is a little more complicated than that, though, because of course, in the public setting, the politician would still be able to hear the person who was protesting. They might not give it any acknowledgement, they might not pay much attention to it, but they'll still be able to hear it.
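To make that block-versus-mute asymmetry concrete, here is a minimal sketch in Python of the visibility rules just described. It's a toy model with invented names, not Twitter's actual API: blocking hides each user from the other in both directions, while muting only hides the muted user from the muter.

```python
# A toy model (invented names, not Twitter's actual API) of the visibility
# rules described above. Blocking is mutual: neither party sees the other.
# Muting is one-directional: only the muter stops seeing posts.

class User:
    def __init__(self, name):
        self.name = name
        self.blocked = set()  # users this account has blocked
        self.muted = set()    # users this account has muted

    def block(self, other):
        self.blocked.add(other)

    def mute(self, other):
        self.muted.add(other)

    def can_see_posts_of(self, author):
        """Return True if this user can read the author's posts."""
        # A block in either direction hides both timelines from each other.
        if author in self.blocked or self in author.blocked:
            return False
        # A mute only hides the muted user's posts from the muter.
        if author in self.muted:
            return False
        return True

official = User("official")
voter = User("voter")

official.mute(voter)
print(voter.can_see_posts_of(official))   # True: the voter can still read and reply
print(official.can_see_posts_of(voter))   # False: the official simply never sees it

official.block(voter)
print(voter.can_see_posts_of(official))   # False: now the voter is shut out entirely
```

Under a mute, the channel from official to voter stays open, which is exactly the distinction the judge leaned on; under a block, both directions go dark.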
Speaker 1: On Twitter, if you mute someone, you never hear anything from them again. So it's not the same thing at all, right? It would be as if you were suddenly able to effectively tune out the frequencies of the protester in that public setting. So I'm not entirely convinced that this is a legit argument, but it is what the judge found. According to that original ruling, officials at the moment still have an out on Twitter: they can mute voters, they just can't block them. That might work for Twitter, but it is a different story on Facebook, where moderation is all about making sure someone isn't trying to undermine an entire discussion. Twitter exchanges are more like overhearing a few people talking at a party, rather than someone standing on a stage and talking to a crowd.

Speaker 1: As is frequently the case with legal matters that revolve around the heart of a country's system of laws, there have been battles involving some pretty dark stuff that have really tested this subject. In two thousand seventeen, the Supreme Court weighed in on another case involving social media and First Amendment rights. The case was Packingham versus North Carolina, and the crux of the story, again, is pretty darn dark. In North Carolina, the state government passed a law that prohibited convicted sex offenders from being allowed to access social media websites. The state argued that social media enabled and facilitated the very offenses these people had been convicted of in the past, but the Supreme Court ruled that the law violated the First Amendment and that it would effectively criminalize a lot of otherwise legal activities. In that decision, the Court stated, "While in the past there may have been difficulty in identifying the most important places, in a spatial sense, for the exchange of views, today the answer is clear. It is cyberspace, the vast democratic forums of the Internet in general, and social media in particular." Now, that argument has prompted some people, like David L. Hudson Jr.
of the American Bar Association, to call for a re-examination and expansion of the First Amendment with regard to the digital space. Hudson argues that platforms like Facebook and Twitter have the ability to manipulate and censor public discourse on a level equal to, or perhaps beyond, that of government agencies. If the First Amendment is not expanded to include such platforms, then the protections guaranteed by the amendment will be moot, because everyone is going to be communicating in those places and the amendment doesn't apply there. So, he says, we've got to change that. Now, obviously that's not a point of view that everyone shares. Not everybody thinks that these entities in the private sector should be held to those restrictions, but it does show how this concept is really complicated and thorny.

Speaker 1: Recently, after I recorded a short radio segment about this subject, the producer for that segment suggested something I thought was really interesting. He suggested that perhaps we'll see a future in which social media networks create special verified accounts for government use only, or for government communication to the public, and these accounts will by default have certain features disabled on them, such as the ability to block a Twitter account. Presumably, these accounts would have some sort of indication, akin to a verified Twitter check mark, to let people know they represent actual official government accounts. I certainly see the value in that, though even with such a system in place, it would still be the responsibility of the public officials to make certain they weren't using their private accounts in a way that could be seen as part of their public role, or else they would be held to the same restrictions.
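As a thought experiment, the producer's suggestion might look something like the sketch below. Every name here is invented for illustration; no platform actually exposes an account type like this.

```python
# Hypothetical sketch of a designated "official government" account type
# that carries a distinct badge and refuses to block anyone, while muting
# stays available. All names here are invented; this is not a real API.

from dataclasses import dataclass, field

@dataclass
class Account:
    handle: str
    is_official_government: bool = False  # would be set by the platform's verification process
    blocked: set = field(default_factory=set)
    muted: set = field(default_factory=set)

    @property
    def badge(self) -> str:
        # Something akin to a verified check mark, but for official accounts.
        return "OFFICIAL GOVERNMENT ACCOUNT" if self.is_official_government else ""

    def block(self, other: "Account") -> None:
        if self.is_official_government:
            # Blocking is disabled by default for official accounts, in line
            # with the court rulings discussed above. Muting remains allowed.
            raise PermissionError(
                f"@{self.handle} is an official government account and cannot block users."
            )
        self.blocked.add(other.handle)

    def mute(self, other: "Account") -> None:
        self.muted.add(other.handle)

governor = Account("governor", is_official_government=True)
constituent = Account("constituent")

governor.mute(constituent)       # allowed: the official just stops seeing the posts
try:
    governor.block(constituent)  # refused: the channel has to stay open
except PermissionError as err:
    print(err)
```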
Now, from a personal perspective, I 559 00:34:08,600 --> 00:34:10,640 Speaker 1: do think that if you are in a public office 560 00:34:10,719 --> 00:34:13,000 Speaker 1: in the United States, and if you operate one or 561 00:34:13,040 --> 00:34:15,160 Speaker 1: more social accounts that are meant to 562 00:34:15,239 --> 00:34:18,560 Speaker 1: engage with the public you represent in a way that 563 00:34:18,600 --> 00:34:21,799 Speaker 1: relates to your public role, you should not be able 564 00:34:21,840 --> 00:34:24,799 Speaker 1: to block people from seeing what you post. I can't 565 00:34:24,800 --> 00:34:28,080 Speaker 1: imagine a scenario where you could hold a public forum 566 00:34:28,120 --> 00:34:30,840 Speaker 1: but then say, oh, but these members of the public 567 00:34:31,040 --> 00:34:34,240 Speaker 1: aren't allowed to hear or see anything that happens in there. 568 00:34:34,480 --> 00:34:37,440 Speaker 1: Only this other select group of the public will get 569 00:34:37,480 --> 00:34:40,200 Speaker 1: that privilege, because if it's a public forum, it has 570 00:34:40,200 --> 00:34:43,240 Speaker 1: to be public for everyone. This matter, by the way, 571 00:34:43,480 --> 00:34:45,960 Speaker 1: is far from settled. I'm sure we'll continue to see 572 00:34:46,000 --> 00:34:49,280 Speaker 1: complications in the digital space regarding the freedom of speech 573 00:34:49,520 --> 00:34:52,800 Speaker 1: and the government's responsibilities in that space, and we'll probably 574 00:34:52,840 --> 00:34:56,840 Speaker 1: see more actions like the time a departing Twitter contract worker, 575 00:34:56,920 --> 00:35:00,080 Speaker 1: on his last day at Twitter, set in motion 576 00:35:00,200 --> 00:35:03,480 Speaker 1: the deletion of President Trump's Twitter account. You might remember 577 00:35:03,520 --> 00:35:07,320 Speaker 1: when that happened. Now, in that case, according to the 578 00:35:07,440 --> 00:35:10,960 Speaker 1: departing employee, he did it as sort of a gesture. 579 00:35:11,040 --> 00:35:13,719 Speaker 1: He didn't think it was actually gonna happen. People had 580 00:35:13,760 --> 00:35:17,280 Speaker 1: flagged Trump's account as having violated the terms of service 581 00:35:17,320 --> 00:35:21,040 Speaker 1: of Twitter. So this former employee, just before he 582 00:35:21,080 --> 00:35:24,279 Speaker 1: was ready to clock out on his last day, initiated 583 00:35:24,320 --> 00:35:28,399 Speaker 1: the process needed to delete Trump's account. He had 584 00:35:28,400 --> 00:35:30,759 Speaker 1: stated he never thought it would go so far as 585 00:35:30,760 --> 00:35:34,239 Speaker 1: to actually happen, that somehow there had to be protections 586 00:35:34,239 --> 00:35:38,440 Speaker 1: in place to prevent it without further authorization. So he 587 00:35:38,600 --> 00:35:41,759 Speaker 1: did it as sort of a symbolic gesture, thinking nothing 588 00:35:41,800 --> 00:35:45,720 Speaker 1: would happen. But it actually did end up deleting President 589 00:35:45,760 --> 00:35:50,360 Speaker 1: Trump's account, though only temporarily. It's these sorts of actions, 590 00:35:50,400 --> 00:35:52,719 Speaker 1: by the way, that lead advocates to argue that 591 00:35:52,760 --> 00:35:55,960 Speaker 1: social media networks have a huge responsibility to those who 592 00:35:56,040 --> 00:36:01,160 Speaker 1: utilize their services.
Hopefully, this episode has highlighted how challenging 593 00:36:01,360 --> 00:36:04,280 Speaker 1: this issue is, and for people outside the United States, 594 00:36:04,440 --> 00:36:07,400 Speaker 1: illuminated some of the arguments behind the whole thing and 595 00:36:07,440 --> 00:36:12,280 Speaker 1: why it's such a big deal. I'm not entirely 596 00:36:12,400 --> 00:36:16,160 Speaker 1: convinced that there is a right way to move forward. 597 00:36:16,880 --> 00:36:19,279 Speaker 1: I think no matter what we choose, it's going to 598 00:36:19,360 --> 00:36:23,960 Speaker 1: be a solution that is not satisfactory 599 00:36:24,000 --> 00:36:28,120 Speaker 1: for all parties involved. I think the courts have ruled 600 00:36:28,320 --> 00:36:30,800 Speaker 1: in the right way, and I do think it should 601 00:36:30,800 --> 00:36:36,120 Speaker 1: apply to politicians across the board, not just ones that 602 00:36:36,400 --> 00:36:40,520 Speaker 1: are of one party or another. I also think 603 00:36:40,560 --> 00:36:45,200 Speaker 1: that we have to give politicians some means of protecting 604 00:36:45,760 --> 00:36:51,160 Speaker 1: themselves from being attacked. Politicians 605 00:36:51,200 --> 00:36:53,480 Speaker 1: are people too. I don't know how many of you 606 00:36:53,560 --> 00:36:59,120 Speaker 1: out there have had to withstand online attacks. It is 607 00:36:59,160 --> 00:37:04,000 Speaker 1: never a pleasant experience, and it can be really, really demoralizing. 608 00:37:04,719 --> 00:37:10,319 Speaker 1: It can certainly create a real sense of trauma. I have 609 00:37:10,440 --> 00:37:13,520 Speaker 1: friends in the space who have had really hard times 610 00:37:13,560 --> 00:37:16,040 Speaker 1: dealing with this stuff. I've been pretty fortunate. I've dealt 611 00:37:16,080 --> 00:37:19,040 Speaker 1: with a little bit of harassment, but nothing on the 612 00:37:19,080 --> 00:37:22,399 Speaker 1: scale of some of my colleagues, and that's without even 613 00:37:22,480 --> 00:37:26,759 Speaker 1: being a publicly elected official. So there needs to be 614 00:37:26,800 --> 00:37:30,040 Speaker 1: a balance. Obviously, social media is going to continue being 615 00:37:30,160 --> 00:37:34,239 Speaker 1: a major communications channel moving forward. It's not likely that 616 00:37:34,280 --> 00:37:36,880 Speaker 1: we're going to see people back off of it. But 617 00:37:37,000 --> 00:37:39,600 Speaker 1: I honestly don't have the answer to this. I 618 00:37:39,640 --> 00:37:42,799 Speaker 1: think that we're making steps in the right direction, but 619 00:37:42,920 --> 00:37:44,440 Speaker 1: I don't know that we're ever going to come up 620 00:37:44,440 --> 00:37:46,759 Speaker 1: with the perfect solution. But what do you guys think? 621 00:37:47,440 --> 00:37:49,800 Speaker 1: And if you have any suggestions for future episodes, you 622 00:37:49,800 --> 00:37:51,759 Speaker 1: should reach out to me and let me know about those. 623 00:37:52,040 --> 00:37:54,640 Speaker 1: You can email me; the address is tech stuff at how 624 00:37:54,680 --> 00:37:57,440 Speaker 1: stuff works dot com. Or pop on over to our 625 00:37:57,440 --> 00:38:01,839 Speaker 1: website; that's tech stuff podcast dot com. You're gonna find 626 00:38:01,920 --> 00:38:04,600 Speaker 1: links to where we are on social media over there, 627 00:38:05,239 --> 00:38:08,680 Speaker 1: and I promise I probably won't block you.
You 628 00:38:08,680 --> 00:38:11,160 Speaker 1: will also find an archive of all of our previous 629 00:38:11,200 --> 00:38:15,000 Speaker 1: episodes and a link to our online store, where everything 630 00:38:15,080 --> 00:38:17,720 Speaker 1: you purchase goes to help the show, and we greatly 631 00:38:17,760 --> 00:38:21,879 Speaker 1: appreciate it. And I will talk to you again really soon. 632 00:38:27,080 --> 00:38:29,279 Speaker 1: Tech Stuff is a production of iHeart Radio's How 633 00:38:29,360 --> 00:38:32,719 Speaker 1: Stuff Works. For more podcasts from iHeart Radio, visit 634 00:38:32,760 --> 00:38:35,840 Speaker 1: the iHeart Radio app, Apple Podcasts, or wherever you 635 00:38:35,920 --> 00:38:37,280 Speaker 1: listen to your favorite shows.