What's up, Earth? I'm taking my pills and the vitamins. You know, I gotta stay strong. You know what I'm saying.

Many of you all probably know Leslie Jones as one of the comedians on Saturday Night Live. She's also one of the stars of the female version of Ghostbusters, and earlier this summer she was eviscerated, really abused mercilessly, on social media, particularly Twitter, with the most hateful, bigoted, disgusting comments. She really got a lot of attention because she stood up and she shared this with the world, which really kind of, I think, unveiled what a cesspool social media can be. Have you experienced much hate speech yourself?

Oh, I've been swimming in that cesspool. Actually, you know, people don't really understand how it's like a sucker punch when people say such mean, nasty, negative, ugly things about you, either about who you are, you know, your intelligence, your appearance. And I think under the cloak of anonymity, people feel emboldened to say all sorts of horrific things.

Well, I've seen what people say about you online, and I know you, I'm your friend, and it's really upsetting because I know that it isn't true.

And thank you for your support.

You're welcome, anytime. And I think that anonymity, as you said, gives people an ability to say things that they never would dream of saying if they actually had to identify themselves or be face to face with somebody. We're going to talk with two people today who are on the front lines of this issue. First, Jonathan Weisman. He's an editor for the New York Times, and he quit Twitter over hate speech, and he did something very, very brave, which we'll tell you about in a moment. And we'll also be hearing from Richard Cohen, who's president of the Southern Poverty Law Center. Now, Richard's been mired in this stuff for decades. He'll tell us that for every big win against hate speech, there is inevitably a backlash. But there are still things that we can all do to fight these hateful words and deeds.
I guess the question is, are technology companies doing as much as they could?

Well, it's a tough issue, because you want people to be able to participate anonymously from places like Syria and Iran, where they have no choice, but in this country people are sort of abusing that privilege in order to be really nasty and inject a lot of poison into the public discourse. Now, of course, I'm the target of some of this because I'm a public figure or whatever. But we wondered if other people experience that even in their small circles, even among their followers on social media. So we decided to head out to Times Square, the center of the New York universe, of course, to find out what other people are experiencing online. And here's what we discovered.

Hi, can I ask you a quick question?

Sure.

Have you ever experienced hate speech or somebody being abusive to you on social media or online?

Yeah. Political. Yeah.

How does it make you feel when you get those kinds of comments?

Like, I don't want to be a part of it. I mean, I just ignore it, to be honest with you.

When you see it online directed at other people, what do you think, in general?

I don't like anybody that makes any adverse comments to people online, because it goes to everybody, and I don't agree with that. Let haters hate. Honestly, they're going to talk about whatever they want to talk about.

Are you shocked sometimes when you see the level of discourse online?

Well, I'm not shocked, because that was what the Internet was originally supposed to be, which was supposed to be an area where it's democratizing opinion and so forth. So unfortunately it matches up with the spectrum of the world.

I think there needs to be some boundary of, you know, what's appropriate and what's not, right? You sort of teach kids and people what's appropriate and what's not, and you have to kind of practice it in real life, I think.

Have you ever experienced online bullying or hate?
You know who you're talking to — I'm the fitness director for Men's Health. So basically everything I post, it's "he skipped leg day," "calves look terrible," "I hate his face," um, "I wish you would die."

Do you monitor your comments, and do you ever remove them?

We often remove them. That's half my job.

Going through the Facebook feed in particular?

Yeah, Facebook definitely.

So hate speech keeps you very busy.

It keeps me employed. Yeah, definitely. It is rough out there.

All right, stay safe out there.

A big thank you to everyone willing to talk to us on a warm summer day in the middle of Times Square. Very warm. Now, let's talk about Jonathan Weisman. Jonathan is a very fine writer for the New York Times, and he confronted something that is so ugly that many of us believed, probably incorrectly, had been put to bed a long time ago, which is just blatant anti-Semitism. So he's here to talk about that and what he did about it. Hey, Jonathan Weisman.

Hi, Katie, how are you?

I'm good. Brian and I are really delighted that you're here to talk about what happened to you, because I think it's emblematic of the times we're living in, and how strange and surreal social media can be, and how actually upsetting and dangerous it can be as well. Let's start from the beginning. This all happened when you posted an article about fascism. Tell us a little bit about why you posted it and what it was about.

You know, if something catches your fancy, um, you tend to tweet it out to your followers. It's something that I do several times a day. And Robert Kagan, kind of a neoconservative author, a supporter of a lot of military interventions past, now at the Brookings Institution — he had written a piece for the Washington Post on how fascism is coming to the United States, and I tweeted it out to my followers, and I got a response back that I had never seen before.
It was just simply my name in three parentheses, from somebody who identified himself as CyberTrump. And I tweeted back to him, I said, do you care to explain? And you know, that let loose this vituperative anti-Semitic rant, and he said, "I have belled the cat." Um, and with the belling of that cat, the trolls, the neo-Nazis, the white supremacists started just deluging me with Holocaust imagery, Nazi iconography, really, really ugly stuff that I really thought was, you know, a thing of the past.

Tell us, before I ask you a little bit more about some of the things that were tweeted at you, how this sort of three-parentheses thing works, and how it so-called "bells the cat," because I know that a lot of news organizations have since written about this.

Yeah, I of course knew nothing about this until I was sucked into it. Um, it turns out that this group of neo-Nazis and white supremacists had come up with an actual piece of software, a Google Chrome plug-in that you can download, and it puts these three parentheses around Jewish-sounding names, which allows them, if they have the right software, to search social networks for the three parentheses and then find targets to go after. Because, oddly enough, most search engines don't look for things like parentheses and that kind of thing. They only look for numbers and letters and certain things like the at sign. So they had come up with an actual, organized way to find and target certain people with Jewish-sounding names.

That's just unbelievable to me. How long has this been going on, by the way? Because I know it was recently discovered, but was this a practice that had been in effect for years, even?

I don't think so. I think it's been several months. And I think the first targets were Jewish conservative writers and pundits, people like Bill Kristol, who had been targets of what they call the alt-right, and now it's moved to more mainstream journalists like myself.
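(To make the mechanism Weisman describes a bit more concrete, here is a minimal Python sketch — our illustration, not anything taken from the actual plug-in — of why the triple-parentheses marker slips past ordinary keyword search while remaining easy to find for a tool that scans the raw text. The function names and the example tweet are made up for the demonstration.)

import re

def naive_index_tokens(text):
    # A typical full-text search index tokenizes on word characters only,
    # so punctuation such as parentheses never makes it into the index.
    return re.findall(r"\w+", text.lower())

def echo_tag(name):
    # The plug-in reportedly just wrapped a name in three parentheses
    # on each side, producing an "echo" marker around it.
    return "(((" + name + ")))"

tweet = "Worth reading: " + echo_tag("Jonathan Weisman") + " on the Kagan fascism piece."

print(naive_index_tokens(tweet))
# ['worth', 'reading', 'jonathan', 'weisman', 'on', 'the', 'kagan', 'fascism', 'piece']
# The parentheses are gone, so a normal keyword search for "(((" finds nothing.

print("(((" in tweet)
# True -- a literal substring scan over the raw text still sees the marker,
# which is how harassers located targets, and why adding the parentheses to
# everyone's display name dilutes the signal.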
I understand that Google Chrome was made aware of this, and they got rid of the software that enables this process. Is that right?

That's right. It was labeled something very generic that you would never, ever recognize as some kind of software targeted for anti-Semites, so they had no idea that it was there. And once they found out about it, they yanked it off.

What did it feel like when you first got those tweets that showed things like dead bodies in the Holocaust and other neo-Nazi symbols?

Yeah, I mean, the Holocaust, the concentration camp imagery, is such a shock in this context because it turns your stomach. And the first time you see an image of, like, Donald Trump in Nazi garb flipping the switch on a gas chamber with your face inside that gas chamber, it's a shock. Now, the tenth time you see it, you start thinking, this is a joke, these people are just sick, um, and you start imagining them sitting in their parents' basement in their pajamas. The shock value does wear off. But, you know, the goading, uh, the anti-Semitic goading — the constantly asking you, well, why are Jews always oppressed, why are Jews always thrown out of these countries through the centuries — it's very, very frustrating.

And Trump is kind of a funny hero for these neo-Nazis, because he hasn't made, to my knowledge, any anti-Semitic comments. He in fact has a Jewish daughter and son-in-law; his daughter converted when she got married. Why do you think he's become so celebrated and protected by these people online?

In his language on immigrants, they see a kindred spirit. They think that he's sending signals that there's much more to it than he would like to say, but he can't say it because he's running for president. So, Wolf Blitzer one time was talking to Donald Trump and brought up the anti-Semitic goading of journalists and asked him, what do you think about this?
Do you repudiate that? And he basically said, I don't speak for my followers, I don't speak for all my voters; no, I think anti-Semitism is terrible. But he did not overtly repudiate them, did not say, I do not want your vote, I do not want your support, the way Ronald Reagan once did when the Klan had endorsed him.

Jonathan, one interesting thing that you did was, you didn't mute these people, you didn't block them from your account. You actually retweeted them so all of your followers could see what they were saying to you. Why did you want to do that?

Because if you, you know, just ignore them, it's hard for somebody to go onto your Twitter feed and see what you're seeing. If you retweet, then everybody who follows you sees it. At first, I would make a little comment to make sure that nobody thought that I was, you know, endorsing them by retweeting, uh, and then I just started retweeting, retweeting, retweeting, to create a record, uh, and to show anybody who wanted to see how ugly things were — they could just search for @jonathanweisman and find all sorts of evidence right there.

What was the reaction when you were retweeting? Because I was following your Twitter feed, and I was looking at how some people were coming to your defense. Did you get much support from the journalistic community?

I've gotten a ton of support. And when I decided to write about it — it was actually then that another reporter, for Haaretz, one of the Israeli newspapers — he's the one that came up with the idea that we should all put those triple parentheses around our names, um, to show that we're not cowed, to show that we're not afraid, but also to thwart the software. And I thought that was an amazing moment of solidarity, because now, if you look on Twitter, you see a lot of people with these triple parentheses. So yes, I do feel supported, and I do feel like I have accomplished something.
But you quit Twitter because you were upset with the reaction you got from the company. Tell us what transpired. Who did you call, what did they say, what were you hoping to achieve, and what didn't you get?

So, as you said, Katie, I wanted to create this trail of anti-Semitism so people could see what they were doing. And I thought, I'll leave all this stuff out there for a little while, but now I'm kind of growing sick of this. So I started reporting the most vicious anti-Semitic tweets coming at me to Twitter. Because Twitter has something called the terms of service. When you sign on to Twitter — nobody ever reads them, um, but you click on something saying, okay, I understand the terms of service — and those terms of service say no harassment, no singling out people for ill treatment based on race, sexual orientation, religion, ethnic origin. You're vowing to follow Twitter's rules. So I started reporting them to Twitter, and at first I would just get a little acknowledgement saying, we got your report, we're looking into it; we got your report, we're looking into it. Then one of the New York Times social media people — we have people who, you know, were hired to look after our social media — she decided to take it upon herself to put together a compendium of all the worst things that had come my way and send them in a package to Twitter. And then she got one of these notes back from Twitter saying, we've looked at what you've sent us and we see no violations of our terms of service. Which shocked me, I mean, honestly. And then I remember — her name is Ari — Ari sent me this, and I sent her a little frowny face back, saying, this is ridiculous. The next morning I woke up and there was somebody —

I would have sent her something more than a frowny face, but you were very polite.

I was.
I was much more angry at Twitter than I was, by this point, at the neo-Nazis, because I had grown used to them, um. And I went on to Twitter — I said, this is a brief rant — and I said, I'm getting off Twitter because Twitter will not follow its own rules and will not police itself. And almost immediately, as soon as I did that, I started getting notes back saying, we've looked at this account and we've suspended it; we've looked at that account, we've suspended it. We've looked at some of the accounts, and they said, we still don't see, um, a violation of our terms of service. And honestly, I went back and I could not figure out what criteria Twitter was using to decide that this guy's account is going to be suspended and this guy's account was fine. I honestly didn't see any difference between them.

What about this argument, the conundrum that some of the social media outlets ostensibly find themselves in, Jonathan, with drawing the line between free speech and hate speech?

It's a question that I've gone over in my own head for a long time now — a lot over the last few weeks — and my attitude at this point is, if Twitter wants to make an announcement saying, we're doing away with our terms of service, we don't care, we can have a Wild West, we're going to have open free speech, say what you want — you know, if they did that, I'd probably sign back up. At least they'd be being honest. What I find offensive is that Twitter can wear the halo of the terms of service that say we don't put up with harassment, we don't put up with racism or anti-Semitism or homophobia, we don't put up with these things, but then do nothing to police it. So they have to make a decision. They are either one thing or another, but they can't have it both ways. And the suspensions you talked about are meaningless, because you're suspending the accounts but not the people behind them.
All of those people can just set up additional hate-spewing accounts. So what do you think Twitter actually ought to do to fix this problem, short of turning it into a Wild West?

It is a conundrum. It is very hard to puzzle out. And a lot of people have said, well, who cares, they're just a bunch of trolls. And my answer is, you're right — who cares? Why do I need Twitter? So my attitude was, if I'm offended by the way Twitter is policing itself, then I will just say goodbye to Twitter for a while.

Did you hear from anyone like Jack Dorsey or any executives at Twitter after this whole brouhaha? Because I think you put them in a terrible light.

I've been expecting to hear from somebody, and I haven't. They've really been remarkably quiescent about this whole thing. They don't seem to want to engage.

What do you think this says about the state of our country? You know, I wonder, did these people always exist? Are we just now seeing them because they have a platform where they can express their views? Is there something going on where the numbers are multiplying? Um, what's your take on what this says about us as a country?

Well, Katie, to jump in — I mean, the polling does show that deeply entrenched anti-Semitic views, at least according to the ADL's polling, have gone down from about twenty-nine percent of Americans to twelve percent. But it's still twelve percent of the American people anyway. And that surprises me.

It surprises you that it's gone down?

No, it surprises me that it's at twelve percent. I mean, remember, in Washington especially, where you see this high bipartisan commitment to Israel, and politicians of every conceivable stripe, from very liberal to very conservative, tumbling over each other to get to the right side of Jewish voters — I would never have thought it was twelve percent. I would have thought it was considerably lower.
But we've seen, if you look on a newspaper website, if the comments sections under articles allow anonymity, the tone of the discourse is horrific. It degenerates so quickly. But if you have to use a real name, it stays at a certain level. It's like road rage: if you don't have an identity — I don't know, I guess it's as old as Lord of the Flies, or older than that. Um. But I do think that the tone of this presidential campaign, and the tone set by Donald Trump, have empowered people to bring out views that they used to only speak, you know, in the privacy of their home, around the angry dinner table or something. I mean, people seem willing to speak in public the way they never used to.

I get your point about Trump not being strong enough in condemning David Duke or anti-Semitic pronouncements. But are there elements of his campaign, of his platform, that you think are attracting white nationalists, anti-Semites —

White supremacists, yeah.

— even though what he's saying most of the time is not anti-Semitic?

I think if you call for a blanket ban on Muslim immigrants, if you call for a giant wall on the southern border, if you talk about Mexicans as rapists and thieves, I think you're sending a signal that exclusionary politics are now mainstream.

Do you think that — I know that a number of newspapers have done away with the comments section, or they're monitoring them more carefully, because you look at them and it is so depressing to see how the discourse quickly devolves into basically a verbal fistfight between respondents, and the kind of vitriol and nastiness that is so pervasive. Then I read the letters to the editor in the New York Times and I think, wow, my faith in America has been restored, because people have different opinions, but they're all sort of thoughtful and well written. And it's just — it's like there are two Americas.
Well, you know, the Wall Street Journal did a great exercise where they created a blue Facebook and a red Facebook — what, if you are a conservative, you're tending to see on Facebook, and if you're a liberal, what you're tending to see on Facebook — and they made a very, very compelling illustration of how different and how divided the country is along political grounds. I mean, if you look at the New York Times comments now, they're curated. There are editors' picks, there are readers' picks, and so we empower certain readers to actually go through and select the best ones. You can click on, you know, "see all" as well, but those are policed. I mean, we actually employ people to go through them.

So what's the solution, Jonathan? I mean, what have you learned from this whole bizarre and really upsetting experience?

I think that part of the solution is to not care so much about social media. You know, since writing off Twitter, I've been spending a lot more time actually reading the whole article off of, you know, the New York Times and the Washington Post apps, and not becoming almost like an automaton consulting my social media feed. Um, I actually feel like it's been good to pull back from that world, and I think we could all do with a little less time on social media.

Right. And it's kind of narcissistic, don't you think? Because it makes you obsessed with what people are saying about you instead of looking outward and saying what you think is going on in the world. It's a weird inversion, isn't it?

I have teenage daughters, and they measure, you know, their self-worth by their Instagram following and their likes. It's crazy. It's weird that they can quantify their popularity on social media. But I really wish they would just leave it alone, you know.

Do you think, Jonathan, that there's a broader challenge for the mainstream press this election cycle,
when it seems like some reporters have crossed the line from just covering the campaign to describing Trump and his candidacy and his supporters in ways that might have seemed too much like advocacy, um, for the other side, or against one side? Do you think that the mainstream press reinforces this narrative of a liberal bias based on the way some are covering Donald Trump?

I think I, and a lot of journalists involved in watching this campaign, are wrestling with this, because on the one hand, we have an obligation to fact-check and to cover any candidate with a jaundiced eye, with a sense that they can't just get away with the big lie, saying something over and over until we all accept it. It's our responsibility to point out when something is just factually inaccurate, untrue. I think that, you know, famously, the Huffington Post declared that they weren't going to cover Donald Trump as a true candidate, that they were going to relegate him to the entertainment section. And I remember arguing at that point, we need to cover Donald Trump like we would cover any other candidate; we need to scrutinize his policy proposals. It's very difficult, and I do think that reporters in this campaign are faced with a candidate that is unprecedented. We have ways that we cover candidates — there's a rhythm to a campaign — but with Trump, uh, much of that rhythm has just been thrown off, because he's not running what most people would call a campaign. You know, there are no yard signs, um, there is no advertising. We actually do have to rethink the way we cover a presidential campaign when the candidate himself is not behaving like a candidate.

Not only not behaving like a candidate. But of course, Walter Cronkite had that famous moment when he spoke out against the Vietnam War.
When do you have a moral imperative to speak out against something that seems so contrary to the principles upon which this country was founded?

It took Walter Cronkite a long time to get to that point, right? So, um, I think that a lot of people might say that, you know, a blanket ban on Muslim immigrants already crossed that line. But I think that you can point out where Trump or Clinton have gone over the lines of what is constitutional and what is democratic without necessarily denouncing them. One of the Times reporters — he's our Supreme Court reporter — wrote a story about all of the ways that Trump's proposals are anti-democratic and anti-constitutional, and in so doing he spoke only to conservative constitutional scholars. He didn't speak to one liberal in the entire story. And that story was incredibly widely read. It was extremely successful in getting its message out. But I don't think anybody would read that as, you know, a Walter Cronkite cri de coeur. It was just Adam Liptak going through the proposals and measuring them up against what we know to be the tenets of the Bill of Rights.

But of course, Lyndon Johnson famously said, if we've lost Walter Cronkite, we've lost the war. And today, does any institution have sort of the moral heft to be able to say something and have it really affect public opinion? Because so far it doesn't seem to.

You know, I actually think it would have to be kind of a truth-to-power moment, where a conservative came out against Trump. But then again, so many conservatives —

Someone was going to say Bill Kristol. Hello?

No, I mean, well, there's the Robert Kagan story, uh, that came out, and Max Boot. So, in fact, some of the most eloquent writing against Trumpism has come from conservatives. All those have done is show just how far apart the Republican power structure, the Republican intelligentsia, is from so much of the Republican base. Um, what we have seen is people that we used to, you know, routinely call up to get a sense of where are conservatives these days — we found out that they have no influence whatsoever.

And I would argue it's not just the Republican base, it's this new base of populism that we've seen bubbling up throughout this campaign.

What we learned in the rise of Donald Trump is that there is a whole new quadrant. It's the socially conservative, economically liberal voter — the person who feels like the white working class is besieged, and they don't want special treatment going to gays and African Americans and Latinos, but they do want their Social Security, and they do want their Medicare, and they don't particularly, uh, you know, want government to shrivel up and die. So it is the populist quadrant, the socially conservative economic liberal, um, that Donald Trump's candidacy has unearthed.

And "Make America Great Again" is fundamentally a nostalgic, backward-looking sort of message: let's take America back to when it was last great, and presumably that means when the demographics were different. A friend of mine said, when he sees that on a baseball hat, he sees "Make America White Again."

Well, you know, when, um, Donald Trump sat down with one of the Times's, uh, diplomatic reporters, um, foreign policy reporter David Sanger, and David Sanger mentioned, so you would say that your policies are kind of America First, and he said yes, yes. And ever since that interview, he's been using the term America First. Never once has he Googled "America First" to understand the historic context. We still have no evidence that he understands, you know, Charles Lindbergh and the America First Committee that tried to keep the United States out of World War Two. He seems to be ahistoric in his sloganeering, at least in a sense.

That may be the most anti-Semitic thing he ever says.

Exactly, it is.
But I'm not sure if he has any idea of the anti-Semitic undertones of America First.

Well, fascinating conversation, Jonathan Weisman. I guess I won't see you on Twitter, but I'll see you on Facebook.

That's right, I am on — I'm on Facebook.

Thanks so much for talking with us. It's, uh, it's upsetting, but a really important conversation. So thanks, thanks so much, Jonathan.

Well, thank you for having me.

So, after we recorded this interview, we were surprised to see Jonathan Weisman had started tweeting again. We asked him about his change of heart, and he told us that a recent piece about the Iran nuclear deal in the Wall Street Journal had him itching to communicate online with other journalists, and Facebook just wasn't doing the trick. He wrote to us that Twitter in Washington is much more of a conversation between journalists, and that that was the audience he was seeking. His one tweet about the Iran deal got a hundred and eighty-six thousand impressions and reminded him that Twitter is still a powerful medium of communication. And also, to his surprise, he says that his Twitter bullies had nothing to say. He's tried to be more judicious in his use of Twitter, he told us, but he still hasn't heard anything from the higher-ups at Twitter about what he went through. Jonathan said, Twitter can root out hate speech, but it chooses not to. Also, don't forget to pick up a copy of Jonathan Weisman's excellent novel, No. 4 Imperial Lane, which comes out in paperback in January.

How big is this problem? Is it simply an isolated incident, a pack of anti-Semites getting ahold of one person on social media? Or is it widespread? And what other kinds of bigotry are being unleashed online? We're going to talk about that coming up.

By the way, if anybody out there is wondering who this disembodied voice is, this guy named Brian Goldsmith, let me explain briefly. Brian and I have known each other for a very long time — gosh, maybe ten or fifteen years.
I started working for you in two thousand four as a summer intern on the Today Show.

Wow. Brian, by the way, is only thirty-four years old. But the reason why he's here is he's got one of the sharpest political minds I know. Plus, I happen to find him really amusing, sometimes.

You and three or four other people.

And so that's why he is my Ed McMahon. Can you do an Ed McMahon laugh?

Ha ha ha.

No, not so good. I think you need to go home and work on that.

I hadn't practiced my Ed McMahon laugh.

Boy, that sounded almost as genuine as the real Ed McMahon laugh. If we were Ed McMahon and Johnny, we would go out and have, like, ten martinis after this. Each.

Sounds good to me. Yeah. Now let's get back to the show.

Obviously, the story about Jonathan Weisman is so upsetting, and we wanted to get a broader perspective on what's happening vis-à-vis anti-Semitism and social media, and also just around the country and around the world. So, Richard Cohen is president of the Southern Poverty Law Center in Montgomery, Alabama. Hey, Richard, thanks so much for talking with us.

Sure, Katie.

Hey, Richard, it's Brian. Um, I'm sure a lot of people have heard of the Southern Poverty Law Center but may not want to admit that they don't know what it actually does or why it exists. Can you share a little bit about the history of that organization and what you do today?

I appreciate the opportunity. You know, the Southern Poverty Law Center was founded in nineteen seventy-one, forty-five years ago, uh, really to make the promise of the newly enacted civil rights laws — the Voting Rights Act, the Civil Rights Act — a reality in the Deep South. You know, nowadays the kind of organizational motto is fighting hate, teaching tolerance, and seeking justice. The project that monitors white supremacist activity — that's kind of our arm that fights hate. Right now, we focus primarily on the issues of recent immigrants, the rights of children.
Uh, we have 557 00:35:19,280 --> 00:35:22,799 Speaker 1: a lot of work trying to push back against the 558 00:35:22,920 --> 00:35:26,840 Speaker 1: problems of mass incarceration. Really the full gamut of civil 559 00:35:26,960 --> 00:35:30,600 Speaker 1: rights issues. Richard, was there an aha moment when you said, 560 00:35:30,640 --> 00:35:33,280 Speaker 1: this is my calling, this is what what I want 561 00:35:33,360 --> 00:35:36,400 Speaker 1: to dedicate my life to. You know, when I was 562 00:35:36,440 --> 00:35:40,279 Speaker 1: a junior in high school, way back when my my 563 00:35:40,400 --> 00:35:45,560 Speaker 1: Civics teacher, you know, challenged me, he um, he made 564 00:35:45,600 --> 00:35:50,200 Speaker 1: me read in brief Supreme Court cases. This is in 565 00:35:50,280 --> 00:35:55,000 Speaker 1: the eleventh grade. What year was that? Killing me? Katie? 566 00:35:55,160 --> 00:35:59,520 Speaker 1: Stop it stop. I think I'm probably older than you are. No, No, 567 00:35:59,760 --> 00:36:02,120 Speaker 1: I'm I'm sixty one. So that would have been in 568 00:36:02,200 --> 00:36:07,200 Speaker 1: nineteen seventy one, seventy two. So really sort of a 569 00:36:07,320 --> 00:36:10,880 Speaker 1: very tumultuous time in our nation's history. I mentioned one 570 00:36:10,920 --> 00:36:13,160 Speaker 1: of the things too, you know, the first democratic the 571 00:36:13,160 --> 00:36:17,160 Speaker 1: first uh uh convention that I watched was the nineteen 572 00:36:17,200 --> 00:36:20,200 Speaker 1: sixty eight convention. I was thirteen years old, and of 573 00:36:20,239 --> 00:36:23,080 Speaker 1: course that was the convention where the police are rioting 574 00:36:23,160 --> 00:36:26,200 Speaker 1: in the expression whole world of watching, and you know 575 00:36:26,239 --> 00:36:28,960 Speaker 1: I was. I was mesmerized by it even as a 576 00:36:28,960 --> 00:36:31,799 Speaker 1: little kid. And you know, one of the things that 577 00:36:31,800 --> 00:36:33,719 Speaker 1: stood out of the person who stood out the most 578 00:36:33,840 --> 00:36:37,760 Speaker 1: for me was Julian Bond, who really came to national 579 00:36:37,760 --> 00:36:40,799 Speaker 1: attention then. And you might remember, although he was too young, 580 00:36:40,920 --> 00:36:43,319 Speaker 1: his name was placed into nomination, you know, in the 581 00:36:43,360 --> 00:36:47,480 Speaker 1: Supreme Court it excuse me, for the vice presidency. And 582 00:36:47,520 --> 00:36:51,520 Speaker 1: of course Julian was the first president of the Southern 583 00:36:51,520 --> 00:36:54,480 Speaker 1: Poverty Law Center. I think about those days when I 584 00:36:54,520 --> 00:36:56,360 Speaker 1: was thirteen years old and think about it today, and 585 00:36:56,400 --> 00:36:59,000 Speaker 1: of course I'll think about Julian's legacy. He was such 586 00:36:59,040 --> 00:37:01,799 Speaker 1: a close friend and we lost him last year. You know, 587 00:37:01,840 --> 00:37:05,120 Speaker 1: I suppose that was as much in a moment for 588 00:37:05,160 --> 00:37:08,839 Speaker 1: me as anything else. Do you have any trouble attracting Wait, wait, wait, wait, 589 00:37:08,840 --> 00:37:11,000 Speaker 1: I want to know if Katie's older than me. I'm 590 00:37:11,120 --> 00:37:17,160 Speaker 1: two years younger. I'm younger. Richard. You're killing me. Hey, listen, 591 00:37:17,320 --> 00:37:19,360 Speaker 1: you know I would have dated you in high school. 
592 00:37:19,360 --> 00:37:21,320 Speaker 1: You would have been a senior when I was a sophomore. 593 00:37:21,360 --> 00:37:25,200 Speaker 1: What can I say? Yeah? I think that's probably a lie, 594 00:37:25,280 --> 00:37:29,719 Speaker 1: but we'll move on. Why, were you a cheerleader 595 00:37:29,760 --> 00:37:31,600 Speaker 1: in high school? Why? Yeah. What do you, 596 00:37:31,680 --> 00:37:33,720 Speaker 1: what does that say? Is that an indictment 597 00:37:33,760 --> 00:37:36,759 Speaker 1: of me, Richard? Because Richard went to Columbia. First 598 00:37:36,760 --> 00:37:38,840 Speaker 1: of all, I like smart guys. He went to Columbia 599 00:37:38,920 --> 00:37:43,200 Speaker 1: and he went to UVA law school. Richard, okay, 600 00:37:43,320 --> 00:37:46,040 Speaker 1: I'm sure you would have dated in high school. Um, 601 00:37:46,160 --> 00:37:50,560 Speaker 1: do you have any trouble attracting really smart, ambitious lawyers 602 00:37:50,640 --> 00:37:54,440 Speaker 1: to Montgomery, Alabama, to work for the Southern Poverty Law Center? 603 00:37:55,360 --> 00:37:57,400 Speaker 1: You know, we're lucky. There are a lot of people 604 00:37:57,480 --> 00:38:00,160 Speaker 1: who have a lot of passion, you know, who go 605 00:38:00,239 --> 00:38:03,359 Speaker 1: to great schools and want to join our cause. So 606 00:38:03,520 --> 00:38:05,520 Speaker 1: you know, we don't have a hard time recruiting people. 607 00:38:05,920 --> 00:38:09,719 Speaker 1: Sometimes we have a hard time keeping them for very long, uh, 608 00:38:09,800 --> 00:38:11,720 Speaker 1: just because they want to move to a big city. 609 00:38:11,800 --> 00:38:14,440 Speaker 1: But you know, we've been incredibly lucky to be able 610 00:38:14,480 --> 00:38:17,000 Speaker 1: to draw new talent to our cause. And you know, 611 00:38:17,080 --> 00:38:19,600 Speaker 1: we don't have any secret formulas at the Southern Poverty Law Center, 612 00:38:19,600 --> 00:38:22,719 Speaker 1: and we don't have any patents or copyrights. All we have 613 00:38:23,040 --> 00:38:27,160 Speaker 1: is talented and passionate people. It's really what's sustained us 614 00:38:27,200 --> 00:38:30,360 Speaker 1: for forty five years. I guess you sometimes wish business 615 00:38:30,400 --> 00:38:32,920 Speaker 1: weren't so good, right, in terms of the kind of 616 00:38:32,920 --> 00:38:36,080 Speaker 1: things that you're dealing with. Yeah, we've seen a real 617 00:38:36,160 --> 00:38:40,200 Speaker 1: explosion of extremism in recent years, and you know, 618 00:38:40,280 --> 00:38:44,600 Speaker 1: unfortunately the rhetoric surrounding, you know, this year's presidential campaign 619 00:38:45,160 --> 00:38:48,640 Speaker 1: has amplified it. Brian Goldsmith is here. I know that 620 00:38:48,680 --> 00:38:51,719 Speaker 1: you all have talked on the phone together and we 621 00:38:51,880 --> 00:38:54,880 Speaker 1: both talked to Jonathan Weisman. What did you make of 622 00:38:54,920 --> 00:38:58,759 Speaker 1: the whole Jonathan Weisman Twitter story? Well, it's an example 623 00:38:58,880 --> 00:39:01,719 Speaker 1: of the fact that so much on Twitter is 624 00:39:01,760 --> 00:39:04,799 Speaker 1: like a drive-by shooting. You know, there are very, very 625 00:39:04,880 --> 00:39:08,480 Speaker 1: few barriers to entry to get a Twitter account.
You know, 626 00:39:08,520 --> 00:39:12,640 Speaker 1: there are no real systems of verification and whatnot, and 627 00:39:12,719 --> 00:39:19,880 Speaker 1: so that anonymity breeds, you know, ugly conduct. Why do 628 00:39:19,920 --> 00:39:23,880 Speaker 1: you think Twitter doesn't follow its own terms 629 00:39:23,920 --> 00:39:27,359 Speaker 1: of service on hate speech? Look, we have a lot 630 00:39:27,400 --> 00:39:31,719 Speaker 1: of experience with the digital industry, and you know, they 631 00:39:31,760 --> 00:39:35,440 Speaker 1: like to pay lip service to their terms of service: 632 00:39:35,440 --> 00:39:38,920 Speaker 1: we're opposed to hate, we don't like misogyny. But 633 00:39:39,040 --> 00:39:41,920 Speaker 1: the reality is that, you know, it's a pain in 634 00:39:41,960 --> 00:39:44,680 Speaker 1: the butt for them to enforce those terms. Well, when 635 00:39:44,680 --> 00:39:47,360 Speaker 1: you've talked to executives at Twitter, or you've reached 636 00:39:47,360 --> 00:39:51,680 Speaker 1: out to various social media platforms, I mean, are they embarrassed? 637 00:39:51,760 --> 00:39:55,960 Speaker 1: What's their explanation for not policing it? I 638 00:39:56,000 --> 00:39:59,400 Speaker 1: think they have no good explanation for it. Sometimes it 639 00:39:59,480 --> 00:40:04,000 Speaker 1: takes, you know, the media to expose them. Uh, you know, 640 00:40:04,080 --> 00:40:06,600 Speaker 1: we one time wrote a story about iTunes that 641 00:40:06,719 --> 00:40:09,799 Speaker 1: caught their attention and led them to pull all of 642 00:40:09,840 --> 00:40:12,600 Speaker 1: their hate music off their site, which is really important. 643 00:40:12,920 --> 00:40:17,319 Speaker 1: I commend, you know, Tim Cook for doing that. I 644 00:40:17,400 --> 00:40:20,400 Speaker 1: know that you use the term hate speech or hate music. 645 00:40:20,920 --> 00:40:25,160 Speaker 1: How do we define what that means? What's protected but 646 00:40:25,520 --> 00:40:29,799 Speaker 1: offensive speech under the First Amendment, and what crosses the 647 00:40:29,880 --> 00:40:35,800 Speaker 1: line to a point where the content should be removed? Well, look, 648 00:40:36,320 --> 00:40:39,120 Speaker 1: none of this, none of the speech that we're talking about, 649 00:40:39,440 --> 00:40:44,719 Speaker 1: is illegal. But of course these are private companies. They 650 00:40:44,760 --> 00:40:48,680 Speaker 1: can remove speech that they find offensive if they want to. 651 00:40:49,200 --> 00:40:51,520 Speaker 1: It's not like the government doing it; that 652 00:40:51,520 --> 00:40:55,840 Speaker 1: would amount to censorship. So we're not talking about illegal speech. 653 00:40:56,040 --> 00:40:59,759 Speaker 1: We're talking about ugly speech, offensive speech that violates the 654 00:40:59,840 --> 00:41:02,920 Speaker 1: terms of service of these organizations. And 655 00:41:02,960 --> 00:41:05,680 Speaker 1: typically the kind of speech that we're talking about is 656 00:41:05,719 --> 00:41:09,560 Speaker 1: that which, you know, vilifies someone for their race, their religion, 657 00:41:09,640 --> 00:41:14,840 Speaker 1: their sexual orientation, or vilifies entire groups. So again, 658 00:41:14,880 --> 00:41:18,040 Speaker 1: it's not a question of whether the speech is illegal, 659 00:41:18,200 --> 00:41:21,520 Speaker 1: because it's not.
The question is, is it ugly enough 660 00:41:21,560 --> 00:41:25,640 Speaker 1: to violate the terms of service of these private organizations? 661 00:41:26,000 --> 00:41:28,839 Speaker 1: But it sounds like you are making some progress, that 662 00:41:28,960 --> 00:41:34,240 Speaker 1: you are exposing some of this hateful behavior or speech 663 00:41:34,320 --> 00:41:37,319 Speaker 1: or the use of these platforms to the companies that 664 00:41:38,000 --> 00:41:42,279 Speaker 1: oversee them or own them. And they've been fairly responsive, 665 00:41:42,520 --> 00:41:47,239 Speaker 1: except for Twitter. Well, you know, Twitter hasn't moved as 666 00:41:47,320 --> 00:41:49,600 Speaker 1: quickly as some of the others. Most of these, you know, 667 00:41:49,680 --> 00:41:54,200 Speaker 1: high tech companies are run by liberal minded people, and 668 00:41:54,280 --> 00:41:56,400 Speaker 1: you know, if you can get their attention, you know, 669 00:41:56,480 --> 00:41:58,920 Speaker 1: you can make progress. You know, we've gone to 670 00:41:58,920 --> 00:42:02,760 Speaker 1: Silicon Valley and met with people from Google, from Amazon, 671 00:42:02,800 --> 00:42:06,719 Speaker 1: from PayPal, and you know, we're making some progress there, 672 00:42:06,719 --> 00:42:09,759 Speaker 1: but you've got to get on their radar. These are big, 673 00:42:09,800 --> 00:42:14,279 Speaker 1: giant companies, and uh, you know, this is oftentimes not 674 00:42:14,400 --> 00:42:17,960 Speaker 1: a high priority for them. Tell us about a personal 675 00:42:18,000 --> 00:42:22,920 Speaker 1: experience you've had where you have been so discouraged or 676 00:42:23,000 --> 00:42:29,000 Speaker 1: hurt or depressed about language, or a situation that you've 677 00:42:29,040 --> 00:42:36,080 Speaker 1: witnessed and wanted to fix. You know, Katie, 678 00:42:36,160 --> 00:42:39,759 Speaker 1: to be honest with you, sometimes it's a little bit 679 00:42:39,880 --> 00:42:45,279 Speaker 1: like blood on the operating room floor. Uh, sometimes I 680 00:42:45,400 --> 00:42:49,360 Speaker 1: hear so much hate speech that, I don't want to 681 00:42:49,360 --> 00:42:52,320 Speaker 1: say I'm used to it, but I guess what I 682 00:42:52,320 --> 00:42:56,560 Speaker 1: would say is nothing particularly shocks me at this point, unfortunately. 683 00:42:57,239 --> 00:43:00,000 Speaker 1: Let me ask you about the role, Richard, of social 684 00:43:00,200 --> 00:43:05,440 Speaker 1: media in kind of empowering or fueling this kind of 685 00:43:05,480 --> 00:43:09,239 Speaker 1: hate speech or these kinds of organizations or movements. I 686 00:43:09,320 --> 00:43:12,520 Speaker 1: know that Stormfront, which is a leading neo-Nazi portal, 687 00:43:13,000 --> 00:43:15,680 Speaker 1: had a hundred and forty thousand registered members in two 688 00:43:15,680 --> 00:43:18,960 Speaker 1: thousand eight and now it has over three hundred thousand. 689 00:43:19,280 --> 00:43:22,520 Speaker 1: I mean, what role does social media have in expanding 690 00:43:22,640 --> 00:43:26,680 Speaker 1: the numbers and the intensity of some of these individuals? 691 00:43:27,760 --> 00:43:30,640 Speaker 1: I think it has multiple roles. First, you know, it 692 00:43:30,760 --> 00:43:36,839 Speaker 1: makes it easier to find kindred souls.
Second, you know, 693 00:43:37,000 --> 00:43:40,320 Speaker 1: the people you're talking to, most of them share your views, 694 00:43:40,600 --> 00:43:44,800 Speaker 1: and you know, you get encouragement, you get validation, 695 00:43:45,560 --> 00:43:48,120 Speaker 1: and I think that, you know, kind of ramps up 696 00:43:48,160 --> 00:43:53,360 Speaker 1: the intensity. Some of these people eventually get frustrated, uh, 697 00:43:53,400 --> 00:43:57,000 Speaker 1: enough with talk, and act out. Dylann Roof was an 698 00:43:57,000 --> 00:43:59,919 Speaker 1: example of that. You know, in his manifesto he said, 699 00:44:00,000 --> 00:44:03,440 Speaker 1: someone needs to take action, I guess it's me. 700 00:44:04,200 --> 00:44:08,160 Speaker 1: And you know, so that was the final step. Who 701 00:44:08,239 --> 00:44:13,759 Speaker 1: are these people? Do they feel disenfranchised? Alienated? Are they 702 00:44:14,120 --> 00:44:17,600 Speaker 1: workers who have been sort of laid off? I mean, 703 00:44:18,320 --> 00:44:21,160 Speaker 1: is there a common thread among some of 704 00:44:21,200 --> 00:44:25,719 Speaker 1: these people in terms of what's binding them together other 705 00:44:25,800 --> 00:44:30,239 Speaker 1: than just hate and intolerance? Well, you know, I think 706 00:44:30,280 --> 00:44:32,760 Speaker 1: that most of the people in the white supremacist movement, 707 00:44:32,880 --> 00:44:35,520 Speaker 1: most of the people who are foot soldiers at least, 708 00:44:36,239 --> 00:44:41,440 Speaker 1: tend to come from somewhat marginalized backgrounds. They look for reasons, 709 00:44:41,480 --> 00:44:46,399 Speaker 1: perhaps outside of themselves, to explain their failures, and they 710 00:44:47,000 --> 00:44:50,440 Speaker 1: glom onto a powerful narrative. You know, here you 711 00:44:50,480 --> 00:44:55,560 Speaker 1: are, you're Dylann Roof, you're an unemployed kid with no 712 00:44:55,760 --> 00:44:59,640 Speaker 1: future. You can think about that, or you 713 00:44:59,680 --> 00:45:03,200 Speaker 1: can suddenly decide that you're a warrior for the white 714 00:45:03,320 --> 00:45:06,680 Speaker 1: race and you're going to strike a blow for freedom. 715 00:45:06,880 --> 00:45:09,000 Speaker 1: I think that some of the people who were attracted 716 00:45:09,040 --> 00:45:13,279 Speaker 1: in this country to the siren song of ISIS are 717 00:45:13,320 --> 00:45:16,560 Speaker 1: in the same position. There's kind of a lack, in 718 00:45:16,600 --> 00:45:20,520 Speaker 1: that they're searching for an identity, uh, and 719 00:45:20,760 --> 00:45:29,080 Speaker 1: suddenly they're offered one by this seemingly invincible, powerful group 720 00:45:29,360 --> 00:45:32,080 Speaker 1: that is going to, you know, change their lives and 721 00:45:32,120 --> 00:45:36,120 Speaker 1: give them meaning. You said earlier that you thought extremism 722 00:45:36,120 --> 00:45:40,359 Speaker 1: has exploded and hate speech is on the rise. Is that 723 00:45:40,440 --> 00:45:45,160 Speaker 1: just because technology and access to technology is more widely 724 00:45:45,200 --> 00:45:49,040 Speaker 1: available, or do you think that says something deeper about 725 00:45:49,239 --> 00:45:52,799 Speaker 1: our society and the direction that we're moving in? I 726 00:45:52,840 --> 00:45:57,000 Speaker 1: think it says something about global trends.
Really. Um, what 727 00:45:57,040 --> 00:46:00,600 Speaker 1: we've seen over the past decade has been, you know, 728 00:46:01,000 --> 00:46:05,880 Speaker 1: the increasing diversity of the United States. We've seen the 729 00:46:06,000 --> 00:46:10,400 Speaker 1: same thing in Europe, increased immigration, and our country 730 00:46:10,480 --> 00:46:13,759 Speaker 1: has had a backlash to that. You know, President 731 00:46:13,800 --> 00:46:17,040 Speaker 1: Obama in many ways is the symbol of the country's 732 00:46:17,160 --> 00:46:20,239 Speaker 1: changing diversity, but he's also been, you know, kind of 733 00:46:20,239 --> 00:46:24,080 Speaker 1: a ferocious target of its backlash. Also, I would say 734 00:46:24,160 --> 00:46:26,640 Speaker 1: in the past, you know, half dozen years, at least the 735 00:46:27,120 --> 00:46:30,880 Speaker 1: past seven or eight years, you know, the dislocations in 736 00:46:30,920 --> 00:46:37,719 Speaker 1: the economy caused by globalization, the worldwide recession, have 737 00:46:37,760 --> 00:46:41,480 Speaker 1: pushed people to look for answers and have increased stereotyping. 738 00:46:41,960 --> 00:46:45,160 Speaker 1: So I think we're looking at some real global trends, 739 00:46:45,160 --> 00:46:49,360 Speaker 1: some macro trends, and you know, those things are exacerbated, 740 00:46:49,600 --> 00:46:53,800 Speaker 1: you know, by the ease of communications that exists 741 00:46:53,800 --> 00:46:56,680 Speaker 1: in the modern world. So you see it happening all 742 00:46:56,719 --> 00:47:00,960 Speaker 1: over Europe, obviously, and here in the United States. I 743 00:47:01,000 --> 00:47:07,000 Speaker 1: think possibly exacerbated, or probably exacerbated, by Donald Trump's candidacy. 744 00:47:07,120 --> 00:47:10,840 Speaker 1: Have you seen a correlation between his popularity and the 745 00:47:10,880 --> 00:47:14,480 Speaker 1: amount of hate speech that you're seeing either online or 746 00:47:14,520 --> 00:47:20,480 Speaker 1: in other forms? Absolutely. Uh, you know, first, you know, 747 00:47:20,640 --> 00:47:23,799 Speaker 1: usually in the white supremacist world, you know, they don't 748 00:47:23,840 --> 00:47:26,560 Speaker 1: get involved in politics. It's like a pox on both 749 00:47:26,600 --> 00:47:30,759 Speaker 1: their houses. Both parties are irredeemably corrupt. Trump, on the 750 00:47:30,800 --> 00:47:34,200 Speaker 1: other hand, has been embraced by the white supremacists. You know, 751 00:47:34,280 --> 00:47:36,480 Speaker 1: people like David Duke have called him, you know, the 752 00:47:36,480 --> 00:47:39,880 Speaker 1: glorious leader. So, you know, we're having that. You know, 753 00:47:39,920 --> 00:47:42,799 Speaker 1: suddenly he's their champion. Another thing, you know, we did 754 00:47:42,800 --> 00:47:47,160 Speaker 1: a survey, Katie, uh, you know, of schools around the country, 755 00:47:47,160 --> 00:47:49,280 Speaker 1: teachers around the country. You know, we have a program 756 00:47:49,320 --> 00:47:53,040 Speaker 1: that provides free educational resources to every school in the nation. 757 00:47:53,160 --> 00:47:56,440 Speaker 1: We asked the teachers, you know, what impact has the 758 00:47:56,480 --> 00:48:00,280 Speaker 1: election had in your school?
More than half the teachers, 759 00:48:00,800 --> 00:48:04,440 Speaker 1: two thousand of them, responded and talked about, you know, 760 00:48:04,600 --> 00:48:07,880 Speaker 1: Mr. Trump, talked about how there had been an increase in 761 00:48:08,000 --> 00:48:10,920 Speaker 1: bullying in their classroom and an increase in kind of 762 00:48:11,080 --> 00:48:15,279 Speaker 1: ugly rhetoric. Uh, kids who were scared, kids who thought 763 00:48:15,280 --> 00:48:18,440 Speaker 1: they would be deported. One of the things most depressing 764 00:48:18,480 --> 00:48:21,200 Speaker 1: to me, and this was depressing, I would say: 765 00:48:21,440 --> 00:48:23,839 Speaker 1: a teacher wrote us and said, look, you know, when 766 00:48:23,880 --> 00:48:26,120 Speaker 1: my kids in the 767 00:48:26,160 --> 00:48:28,239 Speaker 1: past said things that were out of line, that were 768 00:48:28,239 --> 00:48:31,960 Speaker 1: off-color, I'd rebuke them and they'd shut up. Now, 769 00:48:32,239 --> 00:48:34,960 Speaker 1: when they say things, they ask me, hey, why can't 770 00:48:34,960 --> 00:48:37,640 Speaker 1: I say that? The future president of the United States 771 00:48:37,680 --> 00:48:41,759 Speaker 1: is saying that. Wow. When you told me sort of 772 00:48:41,760 --> 00:48:45,040 Speaker 1: the raison d'être of the Southern Poverty Law Center, to 773 00:48:45,080 --> 00:48:47,719 Speaker 1: make sure the Civil Rights Act and the Voting Rights Act 774 00:48:47,760 --> 00:48:52,400 Speaker 1: were implemented and followed, I couldn't help but wonder, Richard, 775 00:48:52,640 --> 00:48:57,359 Speaker 1: where we are now compared to back then. Because, as 776 00:48:57,400 --> 00:49:00,520 Speaker 1: you know, the Voting Rights Act has been rolled back 777 00:49:00,560 --> 00:49:05,040 Speaker 1: in certain ways, and it seems like race relations 778 00:49:05,239 --> 00:49:08,640 Speaker 1: in this country are as fraught as ever. So when 779 00:49:08,640 --> 00:49:12,560 Speaker 1: you look back from then to two thousand sixteen at 780 00:49:12,600 --> 00:49:16,480 Speaker 1: the trajectory of these things, what do 781 00:49:16,520 --> 00:49:20,799 Speaker 1: you come up with? Well, it's remarkable, frankly, how much 782 00:49:21,080 --> 00:49:25,719 Speaker 1: progress the country has made since those days. And it's 783 00:49:25,800 --> 00:49:31,680 Speaker 1: depressing also that we seem so often, especially in recent years, 784 00:49:32,480 --> 00:49:36,160 Speaker 1: to be taking a giant step backwards. Uh, you know, 785 00:49:36,280 --> 00:49:40,600 Speaker 1: the fight for justice is one that every generation has 786 00:49:40,680 --> 00:49:45,400 Speaker 1: to wage. Uh, and you know, 787 00:49:45,480 --> 00:49:49,480 Speaker 1: it's not anything where people of goodwill can rest on 788 00:49:49,520 --> 00:49:51,960 Speaker 1: their laurels. There are always new battles to fight, and 789 00:49:51,960 --> 00:49:54,520 Speaker 1: there are always forces that are trying to hold us back. 790 00:49:55,600 --> 00:49:58,480 Speaker 1: Let me ask you, you gave us some very depressing 791 00:49:58,800 --> 00:50:03,160 Speaker 1: sort of statistics and trends. So what do you do about this? 792 00:50:03,400 --> 00:50:07,279 Speaker 1: What do you do about the increase in hate speech? 793 00:50:07,320 --> 00:50:10,000 Speaker 1: How do you counter it?
I know that the State Department 794 00:50:10,040 --> 00:50:15,239 Speaker 1: has a whole Countering Violent Extremism department to try to 795 00:50:15,760 --> 00:50:20,879 Speaker 1: come up with alternative narratives for people online. But how 796 00:50:20,880 --> 00:50:22,840 Speaker 1: do you nip this in the bud? 797 00:50:24,480 --> 00:50:27,400 Speaker 1: And look, it takes a full-court press, right? You 798 00:50:27,480 --> 00:50:31,600 Speaker 1: have to. It takes, you know, work by people in schools, 799 00:50:31,840 --> 00:50:36,080 Speaker 1: people in churches, community leaders, political leaders. There's not 800 00:50:36,280 --> 00:50:40,719 Speaker 1: one magic bullet, you know. For us, again, it's reflected 801 00:50:40,760 --> 00:50:44,320 Speaker 1: in our organizational motto. We try to expose the people 802 00:50:44,400 --> 00:50:47,160 Speaker 1: who have hate in their heart. And 803 00:50:47,239 --> 00:50:49,000 Speaker 1: I think that some of the things I'm talking about 804 00:50:49,000 --> 00:50:52,480 Speaker 1: are Pollyannaish. But we're people who have hope and people 805 00:50:52,480 --> 00:50:55,319 Speaker 1: who believe that, you know, as Dr. King said, the 806 00:50:55,360 --> 00:50:57,440 Speaker 1: moral arc of the universe may be long, but it 807 00:50:57,480 --> 00:51:01,239 Speaker 1: bends towards justice. That doesn't mean that justice is inevitable. 808 00:51:01,280 --> 00:51:03,880 Speaker 1: That doesn't mean that it's going to happen without our efforts. 809 00:51:04,440 --> 00:51:06,600 Speaker 1: But I think, you know, I think you have to 810 00:51:06,640 --> 00:51:08,880 Speaker 1: have hope in order to continue this kind of struggle. 811 00:51:09,520 --> 00:51:11,680 Speaker 1: Richard Cohen, thanks so much for talking with us, and 812 00:51:11,760 --> 00:51:13,960 Speaker 1: thank you for the great work that all of you 813 00:51:14,040 --> 00:51:21,120 Speaker 1: do with the Southern Poverty Law Center. Thank you, Katie. 814 00:51:23,160 --> 00:51:27,439 Speaker 1: It's so nice to end such a depressing topic on 815 00:51:27,480 --> 00:51:30,920 Speaker 1: a positive note, because I do think people are basically 816 00:51:30,960 --> 00:51:33,680 Speaker 1: good and kind. There are just some bad apples that give 817 00:51:33,760 --> 00:51:40,279 Speaker 1: us all a bad name. Thanks to the whole team 818 00:51:40,280 --> 00:51:43,640 Speaker 1: at Earwolf, including Chris Bannon, Gretta Cohn, and the 819 00:51:43,880 --> 00:51:47,600 Speaker 1: Reverend John DeLore for helping to produce our show. Thanks 820 00:51:47,640 --> 00:51:50,839 Speaker 1: to Mark Phillips for our theme music, and thank you 821 00:51:50,880 --> 00:51:55,600 Speaker 1: for listening. So maybe you think we're kidding when we 822 00:51:55,680 --> 00:51:57,839 Speaker 1: say we want you to leave a message and tell 823 00:51:57,880 --> 00:52:00,640 Speaker 1: us what you think about the show. We're actually not kidding.
824 00:52:00,760 --> 00:52:04,919 Speaker 1: Katie has me here chained to the old-fashioned-style 825 00:52:04,960 --> 00:52:07,719 Speaker 1: answering machine, so it would just make my life a 826 00:52:07,800 --> 00:52:11,600 Speaker 1: lot better if you would call and leave a message 827 00:52:11,800 --> 00:52:15,560 Speaker 1: at nine to nine two two four four six three 828 00:52:15,640 --> 00:52:18,480 Speaker 1: seven and let us know what you think about anything 829 00:52:18,480 --> 00:52:20,360 Speaker 1: you've heard on the show, anything you want us to 830 00:52:20,400 --> 00:52:24,120 Speaker 1: talk about, questions for Katie, really, anything you want to 831 00:52:24,160 --> 00:52:26,880 Speaker 1: talk about at all. So give us a call, again, 832 00:52:27,000 --> 00:52:29,720 Speaker 1: nine to nine to two four four six three seven, 833 00:52:30,080 --> 00:52:33,960 Speaker 1: and also please subscribe, rate, and review the show. Number one, 834 00:52:34,000 --> 00:52:37,000 Speaker 1: it helps us to learn what you think, and number two, 835 00:52:37,120 --> 00:52:44,880 Speaker 1: it helps other listeners find the show. If it helps 836 00:52:45,200 --> 00:52:48,080 Speaker 1: to make fun of my calves so you can get 837 00:52:48,080 --> 00:52:49,719 Speaker 1: through the day, and it's a tough day, I'll be 838 00:52:49,800 --> 00:52:51,880 Speaker 1: that for you today. I know you're a good person. 839 00:52:52,280 --> 00:52:54,040 Speaker 1: And the other thing it does, too, is it makes 840 00:52:54,080 --> 00:52:56,200 Speaker 1: you want to train a little bit harder. So I 841 00:52:56,239 --> 00:52:58,640 Speaker 1: don't skip leg day, even though they think I do. 842 00:53:02,320 --> 00:53:05,800 Speaker 1: I'm Travon Free. You're now in the room where it's happening, 843 00:53:05,840 --> 00:53:09,080 Speaker 1: where we geek out about the best musical of all time, Hamilton. 844 00:53:09,480 --> 00:53:13,840 Speaker 1: Hamilton! We'll be talking about the lyrics, the history, the production, 845 00:53:14,160 --> 00:53:16,560 Speaker 1: and we've got some amazing guests. We have actress Christian 846 00:53:16,640 --> 00:53:22,120 Speaker 1: Chinnem with comedian John Hodgman. I ran home and I 847 00:53:22,160 --> 00:53:25,080 Speaker 1: said to my wife and two children, you have to 848 00:53:25,200 --> 00:53:29,240 Speaker 1: listen to this. Hamilton cast member Jasmine Cephas Jones. This 849 00:53:29,320 --> 00:53:34,239 Speaker 1: is amazing. Subscribe right now and you'll never miss an episode. 850 00:53:34,560 --> 00:53:44,120 Speaker 1: Join us in the room where it's happening, everybody. That's amazing.