1 00:00:00,200 --> 00:00:03,560 Speaker 1: Welcome to Worst Year Ever, a production of iHeartRadio. 2 00:00:06,720 --> 00:00:22,880 Speaker 1: Welcome Yet Everything, So don't Oh my gosh, everybody, welcome 3 00:00:22,920 --> 00:00:26,119 Speaker 1: back to the Worst Year Ever. This is part two 4 00:00:26,840 --> 00:00:31,440 Speaker 1: of the first two episodes, part two of episode one. 5 00:00:31,520 --> 00:00:34,800 Speaker 1: Oh my gosh. Yeah, you get two, you get two 6 00:00:34,880 --> 00:00:37,640 Speaker 1: on the first day. Isn't that exciting? Because we didn't 7 00:00:37,680 --> 00:00:41,279 Speaker 1: budget our time well. But it's a gift for you guys. Yeah, 8 00:00:41,360 --> 00:00:42,639 Speaker 1: we just have too much to say to each other. 9 00:00:43,880 --> 00:00:46,720 Speaker 1: I do. And speaking of gifts, oh wait, no, it's 10 00:00:46,720 --> 00:00:50,040 Speaker 1: not time for products and services. I am just a hack. 11 00:00:50,360 --> 00:00:54,200 Speaker 1: I'm so sorry. Okay, I accept your apology on behalf 12 00:00:54,200 --> 00:00:57,560 Speaker 1: of everyone listening. Right. We don't traditionally open with ads. 13 00:00:57,960 --> 00:01:02,120 Speaker 1: We actually... I have an addictive personality, and I have 14 00:01:02,160 --> 00:01:05,200 Speaker 1: gotten addicted to ad pivots, and it's a real problem. 15 00:01:05,280 --> 00:01:07,600 Speaker 1: If I don't do it twice a day, I just 16 00:01:07,640 --> 00:01:10,440 Speaker 1: get the shakes. I'll pull my car over in 17 00:01:10,440 --> 00:01:13,600 Speaker 1: the street and just shout at passing police officers. 18 00:01:16,600 --> 00:01:19,480 Speaker 1: It's Hims, like a diaper for dudes, an adult diaper. No, 19 00:01:19,520 --> 00:01:21,480 Speaker 1: it gets your dick hard. Oh, it's a hard pill. 20 00:01:21,800 --> 00:01:23,600 Speaker 1: It's a dick pill. It's a dick hard pill.
Is 21 00:01:23,640 --> 00:01:26,360 Speaker 1: it spelled like the word him, or the other one 22 00:01:26,480 --> 00:01:29,959 Speaker 1: but with an s, hymns? No, not like what, what 23 00:01:30,080 --> 00:01:34,240 Speaker 1: religious people do, although selling hymns has become my religion. 24 00:01:34,880 --> 00:01:37,560 Speaker 1: I mean, I'm sure that gets some people hard. Is 25 00:01:37,600 --> 00:01:42,000 Speaker 1: this podcast sponsored by HIMS? Maybe one day. We can 26 00:01:42,000 --> 00:01:47,400 Speaker 1: only... I'm horrible at this. Fingers crossed, everybody. Cody, 27 00:01:47,520 --> 00:01:49,640 Speaker 1: do you want to talk about the Washington Post? Yeah, 28 00:01:50,600 --> 00:01:53,160 Speaker 1: penises crossed. And then talking about the New York Times. 29 00:01:53,200 --> 00:01:55,040 Speaker 1: All right, I'm actually gonna talk about the New York 30 00:01:55,040 --> 00:01:57,920 Speaker 1: Times and Washington Post kind of together, because I think 31 00:01:57,920 --> 00:02:02,200 Speaker 1: they have kind of similar problems and kind of similar positives. 32 00:02:03,440 --> 00:02:06,520 Speaker 1: They are two of America's papers of record. The phrase 33 00:02:06,600 --> 00:02:08,880 Speaker 1: tends to mean that they're, you know, very rigorous. They 34 00:02:08,919 --> 00:02:12,120 Speaker 1: show attention to detail and have accountability in their editorial process, 35 00:02:12,639 --> 00:02:14,840 Speaker 1: and they have, like, a truly, like, national view 36 00:02:15,440 --> 00:02:17,280 Speaker 1: and attempt to be the best possible record of what 37 00:02:17,320 --> 00:02:19,280 Speaker 1: happened in the nation, not just in a region, for 38 00:02:19,360 --> 00:02:23,079 Speaker 1: the day.
And we could debate how much they do 39 00:02:23,120 --> 00:02:26,600 Speaker 1: that or not, um, but as, as I think we 40 00:02:26,800 --> 00:02:29,080 Speaker 1: all can guess or remember, the New York Times and the 41 00:02:29,080 --> 00:02:32,680 Speaker 1: Washington Post are both generally pro war. Another theme that 42 00:02:32,720 --> 00:02:34,440 Speaker 1: I think we all sort of get with a lot 43 00:02:34,480 --> 00:02:38,959 Speaker 1: of these mainstream news organizations, Cody. Unless it's war against 44 00:02:39,040 --> 00:02:41,880 Speaker 1: Nazi Germany in the late nineteen thirties, in which case 45 00:02:41,960 --> 00:02:44,040 Speaker 1: they really think we should see how this Hitler guy 46 00:02:44,040 --> 00:02:46,359 Speaker 1: plays out. Let's hear him out, let him, let him 47 00:02:46,360 --> 00:02:48,560 Speaker 1: do what he's gonna do, and we'll, we'll just 48 00:02:48,800 --> 00:02:53,720 Speaker 1: keep to ourselves. Yeah, it's just another, you know, 49 00:02:54,000 --> 00:02:59,800 Speaker 1: balanced viewpoint. What could America not second? Yeah, exactly. Um, whatever. 50 00:03:00,040 --> 00:03:04,000 Speaker 1: Whatever the synonym for not second is. That they were also specifically 51 00:03:04,120 --> 00:03:06,880 Speaker 1: pro the war in Iraq is interesting. I think that, 52 00:03:07,000 --> 00:03:09,880 Speaker 1: like, the war stance that draws claims of bias is 53 00:03:10,240 --> 00:03:13,360 Speaker 1: the anti war stance. No one really accuses anybody of 54 00:03:13,400 --> 00:03:17,160 Speaker 1: bias if they're pro war. Interesting thing about our culture.
55 00:03:17,520 --> 00:03:19,880 Speaker 1: Both the New York Times and Washington Post have also been 56 00:03:19,919 --> 00:03:22,600 Speaker 1: heavily criticized for their reporting on the Obama administration's drone 57 00:03:22,639 --> 00:03:26,559 Speaker 1: strike program, their general lack of reporting on civilian casualties 58 00:03:26,880 --> 00:03:29,920 Speaker 1: for that program. Which civilian casualties are, I guess one 59 00:03:29,919 --> 00:03:32,200 Speaker 1: would describe it as, a problem with drone strikes. Yeah, 60 00:03:32,240 --> 00:03:35,200 Speaker 1: I certainly wouldn't. I think that's fair. And to, to 61 00:03:35,280 --> 00:03:37,360 Speaker 1: quote a study from Jeff Bachman from the School of 62 00:03:37,360 --> 00:03:40,400 Speaker 1: International Service at American University: the willingness of The New 63 00:03:40,480 --> 00:03:43,720 Speaker 1: York Times and Washington Post to state positively that only 64 00:03:43,800 --> 00:03:46,240 Speaker 1: legitimate targets were killed in the immediate aftermath of drone 65 00:03:46,280 --> 00:03:49,960 Speaker 1: strikes in Yemen and Pakistan is especially problematic, because exactly 66 00:03:49,960 --> 00:03:53,320 Speaker 1: who was killed in drone strikes is difficult to immediately confirm. Uh, 67 00:03:53,360 --> 00:03:56,600 Speaker 1: the study found. Not always, not always. Not always difficult? 68 00:03:56,840 --> 00:03:59,760 Speaker 1: Always. Yeah. The other site, the site that I write 69 00:03:59,800 --> 00:04:02,560 Speaker 1: for, Bellingcat, did a fun investigation last year where they 70 00:04:02,560 --> 00:04:04,560 Speaker 1: traced back one of the missiles that we sent Saudi 71 00:04:04,600 --> 00:04:06,640 Speaker 1: Arabia that was used to blow up a school bus 72 00:04:06,680 --> 00:04:11,119 Speaker 1: with a couple of dozen small children on it. Fun.
73 00:04:11,200 --> 00:04:16,360 Speaker 1: Legitimate targets like a school bus, hospitals, weddings, things 74 00:04:16,400 --> 00:04:20,640 Speaker 1: of, things of that nature. It's a war, you guys. Hey, okay. Um, 75 00:04:20,680 --> 00:04:24,320 Speaker 1: it's a very old, long war. Um. This, this, this 76 00:04:24,400 --> 00:04:28,320 Speaker 1: study found that civilian deaths were reported in stories 77 00:04:28,800 --> 00:04:32,599 Speaker 1: in nine percent of the cases where civilian deaths were actually proven 78 00:04:32,640 --> 00:04:36,200 Speaker 1: to have occurred, so they did not report on them. Um, 79 00:04:36,279 --> 00:04:38,640 Speaker 1: neither the New York Times nor the Washington Post have corrected 80 00:04:38,680 --> 00:04:41,560 Speaker 1: their errors in failing to accurately report on the effects 81 00:04:41,560 --> 00:04:44,400 Speaker 1: of drone strikes. Uh, and the editorial departments of both 82 00:04:44,400 --> 00:04:46,520 Speaker 1: papers have said that it is either not up to 83 00:04:46,640 --> 00:04:49,240 Speaker 1: them or not in the papers' policy to run corrections 84 00:04:49,279 --> 00:04:54,599 Speaker 1: on such stories. So. Papers of record. Yeah, I love 85 00:04:54,920 --> 00:04:59,559 Speaker 1: that response. Um. Another fun fact: the New York Times's 86 00:04:59,640 --> 00:05:02,839 Speaker 1: nickname is the Gray Lady, because the entire 87 00:05:02,880 --> 00:05:08,920 Speaker 1: staff are huge fans of the erotic novel Shades of Gray. Yeah. Yeah, yeah, 88 00:05:09,080 --> 00:05:12,080 Speaker 1: very, very cool. I wasn't going to include that because 89 00:05:12,080 --> 00:05:14,440 Speaker 1: it seemed a little blue, you know, a little, little 90 00:05:14,520 --> 00:05:17,159 Speaker 1: rough around the edges for our very clean podcast. But 91 00:05:17,320 --> 00:05:20,560 Speaker 1: it sounds pretty basic to me. Well, you know, not 92 00:05:20,640 --> 00:05:24,119 Speaker 1: for, not for the Paper of Record.
So I actually 93 00:05:24,160 --> 00:05:26,840 Speaker 1: hate describing things as basic, because it just describes something 94 00:05:26,880 --> 00:05:30,800 Speaker 1: that's popular and that people like. Yeah. We moved on from 95 00:05:30,800 --> 00:05:32,440 Speaker 1: the drone strikes, but I think it is hard to 96 00:05:32,520 --> 00:05:37,600 Speaker 1: overstate the detrimental effect of a misinformed public getting sort 97 00:05:37,640 --> 00:05:41,920 Speaker 1: of, like, normalized to civilian deaths from drone strikes. Like, 98 00:05:42,040 --> 00:05:44,640 Speaker 1: we've talked about normalization a lot, the normalization of violence, 99 00:05:44,680 --> 00:05:47,560 Speaker 1: but like, if you have this president who's not really 100 00:05:47,560 --> 00:05:51,280 Speaker 1: criticized for a lot, especially stuff like this, and you're 101 00:05:51,320 --> 00:05:53,640 Speaker 1: not even reporting on the civilian deaths, you're sort 102 00:05:53,680 --> 00:05:56,359 Speaker 1: of creating this idea that drone strikes are actually, like, 103 00:05:56,560 --> 00:05:59,520 Speaker 1: very safe and effective, when they're totally not, and they 104 00:05:59,640 --> 00:06:03,800 Speaker 1: kill people, uh, pretty indiscriminately, um, with a lot of 105 00:06:03,800 --> 00:06:08,760 Speaker 1: collateral damage. It's bad, I would say. Yeah, we'll talk 106 00:06:08,800 --> 00:06:11,800 Speaker 1: about that a bit with me and... okay? Yeah, my 107 00:06:12,080 --> 00:06:16,559 Speaker 1: theory is that, that it's bad. Yeah, I concur. My theory 108 00:06:16,640 --> 00:06:21,080 Speaker 1: is that that pod is going to, in fact, save America. Well, okay, 109 00:06:21,120 --> 00:06:25,960 Speaker 1: will you, in that? Alright. Well, I thought they were, 110 00:06:26,000 --> 00:06:28,800 Speaker 1: but I guess not.
They do good reporting too. Like, 111 00:06:29,080 --> 00:06:33,680 Speaker 1: the Washington Post has a great, like, investigative team, um, 112 00:06:33,920 --> 00:06:37,120 Speaker 1: people like David Fahrenthold. Like, they do a lot of... Yeah, 113 00:06:39,440 --> 00:06:41,560 Speaker 1: I tend to like, stuff, a lot of things that 114 00:06:41,640 --> 00:06:43,880 Speaker 1: the Washington Post specifically puts out, and that's why I 115 00:06:43,880 --> 00:06:46,320 Speaker 1: want to make that clear. Like, they do good reporting. 116 00:06:46,440 --> 00:06:48,080 Speaker 1: And I mean, the New York Times was the first 117 00:06:48,080 --> 00:06:50,880 Speaker 1: to publish the Pentagon Papers. They got, they got sued 118 00:06:50,920 --> 00:06:53,560 Speaker 1: for it. Like, they do good things. That's a long 119 00:06:53,560 --> 00:06:56,679 Speaker 1: time ago. But actual reporters at the New York Times 120 00:06:56,680 --> 00:06:58,400 Speaker 1: tend to be very, I've worked with a number of them, 121 00:06:58,400 --> 00:07:00,440 Speaker 1: tend to be very, very, very, very good 122 00:07:00,440 --> 00:07:04,200 Speaker 1: at their jobs. Like, the actual investigative journalists and stuff 123 00:07:04,200 --> 00:07:05,960 Speaker 1: who are doing the work. Like, most of what people 124 00:07:05,960 --> 00:07:08,119 Speaker 1: get pissed off at is like the op ed section 125 00:07:08,160 --> 00:07:10,920 Speaker 1: and stuff. Like, there's a lot of really good original 126 00:07:10,960 --> 00:07:12,760 Speaker 1: reporting done by the Times, and I would never want 127 00:07:12,760 --> 00:07:15,040 Speaker 1: to pretend like there isn't. Yes, exactly. I want to 128 00:07:15,040 --> 00:07:17,000 Speaker 1: make that clear. Um, and we'll, yeah, we're gonna get 129 00:07:17,000 --> 00:07:19,200 Speaker 1: to the op ed stuff. Yeah.
But again, like, sort 130 00:07:19,200 --> 00:07:21,560 Speaker 1: of going with the theme, I think, uh, what is 131 00:07:21,640 --> 00:07:24,960 Speaker 1: generally true is that they're very pro war, 132 00:07:25,360 --> 00:07:29,640 Speaker 1: uh, and, uh, not questioning of the military, things like that. Um. 133 00:07:29,920 --> 00:07:32,720 Speaker 1: The Washington Post, by their own admission, ran 140 134 00:07:32,840 --> 00:07:36,120 Speaker 1: editorials and articles supporting Bush's wars. Um, they at one 135 00:07:36,160 --> 00:07:39,600 Speaker 1: point apologized for their practice of burying articles. But the 136 00:07:39,640 --> 00:07:42,360 Speaker 1: problem wasn't just that they buried stories; it was that 137 00:07:42,840 --> 00:07:45,280 Speaker 1: the anti war stories were vastly outnumbered by the pro 138 00:07:45,280 --> 00:07:48,840 Speaker 1: war stories, so they did not do the reporting. Um. 139 00:07:49,040 --> 00:07:51,240 Speaker 1: One study posited that there was just a sort of general 140 00:07:51,240 --> 00:07:54,480 Speaker 1: attitude among the editors that boiled down to the idea that, like, 141 00:07:54,600 --> 00:07:58,640 Speaker 1: the war was inevitable, so why publish these contrary voices? 142 00:07:59,360 --> 00:08:03,120 Speaker 1: Interesting that you want balance, but not when it comes 143 00:08:03,160 --> 00:08:07,560 Speaker 1: to maybe saying, like, no to war, you know. I mean, 144 00:08:07,760 --> 00:08:09,760 Speaker 1: is it possible that war is a good thing, though, 145 00:08:10,200 --> 00:08:12,320 Speaker 1: for their paper? I haven't considered it. I guess I 146 00:08:12,360 --> 00:08:16,000 Speaker 1: should read... There are many, many articles about how... Why 147 00:08:16,000 --> 00:08:19,880 Speaker 1: don't, why don't you start with that? Do some research? Okay, 148 00:08:19,920 --> 00:08:26,040 Speaker 1: I will read their articles about how Bush is, um...
So they 149 00:08:26,040 --> 00:08:29,640 Speaker 1: have that problem. Um. But like Robert, you alluded to 150 00:08:29,800 --> 00:08:32,120 Speaker 1: what I think we really should talk about: the 151 00:08:32,160 --> 00:08:35,000 Speaker 1: Washington Post and New York Times columnists and fact checkers. 152 00:08:35,800 --> 00:08:38,120 Speaker 1: Um, on Some More News we actually recently did an 153 00:08:38,120 --> 00:08:42,160 Speaker 1: episode, written by Creature Feature's own Katie Golden, about Bret Stephens 154 00:08:42,200 --> 00:08:45,320 Speaker 1: specifically, he's a New York Times columnist, and how he's 155 00:08:45,320 --> 00:08:49,280 Speaker 1: basically used his columns for his own petty bullshit combined 156 00:08:49,320 --> 00:08:51,760 Speaker 1: with climate denial. Um. And that's why the New York 157 00:08:51,760 --> 00:08:54,200 Speaker 1: Times op ed section seems to be a place for out 158 00:08:54,200 --> 00:08:57,439 Speaker 1: of touch elites to sort of air their petty personal grievances, 159 00:08:58,360 --> 00:09:01,120 Speaker 1: mostly involving how kids on college campuses are totalitarian. 160 00:09:01,679 --> 00:09:04,800 Speaker 1: In one column by Bari Weiss, as her only evidence 161 00:09:04,840 --> 00:09:08,079 Speaker 1: of this totalitarianism, she linked to a fake Twitter account. 162 00:09:09,200 --> 00:09:11,920 Speaker 1: Recently, columnist Bret Stephens, after being called a bedbug by 163 00:09:11,960 --> 00:09:14,840 Speaker 1: a Jewish professor in a tweet that got five likes.
164 00:09:15,559 --> 00:09:18,000 Speaker 1: Bret wrote an entire column that likened him to 165 00:09:18,080 --> 00:09:21,440 Speaker 1: a Nazi, and as his evidence shared a link to Google 166 00:09:21,480 --> 00:09:24,600 Speaker 1: Books with the search field reading Jews as bedbugs, which 167 00:09:24,679 --> 00:09:27,319 Speaker 1: led to him finding a quote about bedbugs that 168 00:09:27,440 --> 00:09:30,920 Speaker 1: completely undermined his point, because the quote was about literal bedbugs. 169 00:09:32,360 --> 00:09:35,400 Speaker 1: So it seems, it seems like maybe there's, like, a 170 00:09:35,480 --> 00:09:39,240 Speaker 1: complete lack of oversight and editorial rigor, um, which a 171 00:09:39,280 --> 00:09:43,600 Speaker 1: paper of record should probably have. It also seems like if 172 00:09:43,640 --> 00:09:47,080 Speaker 1: they fired Bret Stephens, for his salary they could afford 173 00:09:47,120 --> 00:09:51,160 Speaker 1: to pay a couple of dozen really talented journalists to 174 00:09:51,280 --> 00:09:55,080 Speaker 1: investigate, like, six to ten really complicated stories a year 175 00:09:55,160 --> 00:09:59,439 Speaker 1: each, and provide, like, more than weekly content that was 176 00:09:59,480 --> 00:10:04,200 Speaker 1: incredibly useful and groundbreaking. But because they don't, we get 177 00:10:04,240 --> 00:10:08,720 Speaker 1: to know that Bret Stephens googled Jews as bedbugs 178 00:10:09,760 --> 00:10:13,080 Speaker 1: and did not read the quote that he found. He 179 00:10:13,120 --> 00:10:17,040 Speaker 1: didn't even... Unbelievable. Why, why would he? He found it 180 00:10:17,080 --> 00:10:20,360 Speaker 1: in a book, therefore what that professor did to him 181 00:10:20,400 --> 00:10:23,360 Speaker 1: was the same as the Holocaust. Exactly, Q fucking E D. 182 00:10:24,360 --> 00:10:28,360 Speaker 1: You just got blown up, random Jewish professor, who I 183 00:10:28,400 --> 00:10:30,880 Speaker 1: think is a Nazi.
Now it's just a, it's just a 184 00:10:31,040 --> 00:10:35,439 Speaker 1: real problem. Yeah. And so this actually kind of started 185 00:10:35,640 --> 00:10:38,520 Speaker 1: due to new leadership from James Bennet. He was put 186 00:10:38,520 --> 00:10:42,520 Speaker 1: in charge of the editorial page in, um, an interesting year 187 00:10:42,720 --> 00:10:45,600 Speaker 1: for this to start. I don't know if anyone's familiar 188 00:10:45,600 --> 00:10:48,200 Speaker 1: with that year and how good it was. Everybody. Yeah. 189 00:10:48,800 --> 00:10:52,360 Speaker 1: So James hired a bunch of Bret bugs too, 190 00:10:52,400 --> 00:10:54,800 Speaker 1: who, as he claims, were looking to challenge our own and 191 00:10:54,880 --> 00:10:57,600 Speaker 1: our readers' assumptions. Uh, never mind that the paper only 192 00:10:57,600 --> 00:10:59,800 Speaker 1: seems to be challenging assumptions from the right, and 193 00:11:00,000 --> 00:11:03,280 Speaker 1: any of the readers' assumptions about climate change are true, 194 00:11:03,720 --> 00:11:06,480 Speaker 1: so they don't need to be challenged by Bret fucking Stephens. 195 00:11:07,040 --> 00:11:10,320 Speaker 1: But in general, the spectrum of challenge is pretty narrow. 196 00:11:10,400 --> 00:11:14,280 Speaker 1: As Glenn Greenwald points out, it spans the small gap 197 00:11:14,360 --> 00:11:19,960 Speaker 1: from establishment centrist Democrats to establishment centrist Republicans. Again, just 198 00:11:20,040 --> 00:11:24,040 Speaker 1: sort of all these viewpoints that are approved, and none 199 00:11:24,080 --> 00:11:28,120 Speaker 1: that are, like, war bad, maybe more taxes, I don't know, 200 00:11:28,480 --> 00:11:32,800 Speaker 1: things like that. This all reminds me of the liberal consensus.
201 00:11:32,880 --> 00:11:34,959 Speaker 1: This is something I think about a lot, and I 202 00:11:35,000 --> 00:11:37,040 Speaker 1: think it helps to understand some of what we're seeing 203 00:11:37,080 --> 00:11:40,080 Speaker 1: here, and on the internet, and culturally in general. In 204 00:11:40,120 --> 00:11:44,080 Speaker 1: the seventies, there was something developing called the liberal consensus. Uh, 205 00:11:44,160 --> 00:11:46,199 Speaker 1: one can make an argument that this is because reality 206 00:11:46,240 --> 00:11:49,080 Speaker 1: sort of leans liberal, but basically there were 207 00:11:49,080 --> 00:11:51,560 Speaker 1: these nonpartisan organizations like the New York Times and the 208 00:11:51,559 --> 00:11:55,560 Speaker 1: Brookings Institute, and conservatives saw them as incredibly biased. Like, 209 00:11:55,600 --> 00:11:58,480 Speaker 1: the Brookings Institute would do studies, and they would study 210 00:11:59,000 --> 00:12:02,440 Speaker 1: issues and do research and gather facts and sort of 211 00:12:02,520 --> 00:12:07,240 Speaker 1: come to conclusions and propose policy. It was a nonpartisan 212 00:12:07,360 --> 00:12:10,760 Speaker 1: organization meant to do that, and a lot of their 213 00:12:10,800 --> 00:12:14,640 Speaker 1: policies were liberal leaning, right, but it was the result 214 00:12:14,679 --> 00:12:17,240 Speaker 1: of a nonpartisan effort to research and understand an issue. 215 00:12:17,600 --> 00:12:19,760 Speaker 1: So in response to this, people were very mad that 216 00:12:19,840 --> 00:12:22,880 Speaker 1: Nixon was sort of buying into the liberal consensus. So 217 00:12:23,440 --> 00:12:25,600 Speaker 1: we started to see things pop up like the American 218 00:12:25,720 --> 00:12:29,120 Speaker 1: Enterprise Institute and the Heritage Foundation.
Uh, these sort of 219 00:13:29,120 --> 00:13:32,000 Speaker 1: conservative think tanks that were responding to what they saw 220 00:13:32,040 --> 00:13:36,760 Speaker 1: as bias. So they're saying, you're biased, therefore we're going 221 00:13:36,800 --> 00:13:39,400 Speaker 1: to be biased, right? So you have, like, all these 222 00:13:39,400 --> 00:13:42,680 Speaker 1: institutions basically admitting their own bias, their goal 223 00:13:42,720 --> 00:13:45,839 Speaker 1: being to find facts and information that they can use 224 00:13:45,960 --> 00:13:49,480 Speaker 1: to prove a conservative policy that's worth pursuing. We see 225 00:13:49,480 --> 00:13:51,200 Speaker 1: this a lot, or shades of it at least, in 226 00:13:51,320 --> 00:13:55,040 Speaker 1: people like Ben Shapiro. Prager University is great at this. 227 00:13:55,120 --> 00:13:57,240 Speaker 1: You'll notice that a lot of their speakers are actually 228 00:13:57,240 --> 00:14:00,520 Speaker 1: from the American Enterprise Institute. So you start with a conclusion, 229 00:14:00,559 --> 00:14:03,079 Speaker 1: you find facts that can support it, as opposed to finding facts, 230 00:14:03,080 --> 00:14:05,439 Speaker 1: doing research, and then analyzing it to arrive at a conclusion. 231 00:14:05,800 --> 00:14:07,720 Speaker 1: So there's always been this sort of push against the 232 00:14:07,720 --> 00:14:11,560 Speaker 1: liberal consensus and liberal media. In the sixties, the first 233 00:14:11,600 --> 00:14:14,800 Speaker 1: instance of it was used by George Wallace, and it 234 00:14:14,920 --> 00:14:18,439 Speaker 1: was used to basically discredit the civil rights movement, saying 235 00:14:18,520 --> 00:14:21,079 Speaker 1: the media only supported the civil rights movement because of 236 00:14:21,120 --> 00:14:23,680 Speaker 1: their liberal bias.
So it's been going on, it's 237 00:14:23,720 --> 00:14:25,520 Speaker 1: been going on for a while, and it's this sort 238 00:14:25,520 --> 00:14:28,640 Speaker 1: of general idea that has been building and building over 239 00:14:28,679 --> 00:14:32,079 Speaker 1: the years. Um, we're becoming more and more polarized, right? 240 00:14:32,200 --> 00:14:35,679 Speaker 1: It's this push against the liberal consensus, liberal media, and 241 00:14:35,760 --> 00:14:38,040 Speaker 1: saying that, like, well, now we're gonna do it, even 242 00:14:38,040 --> 00:14:41,559 Speaker 1: though they weren't doing it, right? It's like Fox News: 243 00:14:42,040 --> 00:14:44,440 Speaker 1: they go full far right conspiracy nut job as a 244 00:14:44,440 --> 00:14:47,000 Speaker 1: response to what they perceive as bias, and then we 245 00:14:47,080 --> 00:14:49,160 Speaker 1: get to sort of where we are today. I think 246 00:14:49,160 --> 00:14:51,199 Speaker 1: we're still grappling with it. Um, there's more to say 247 00:14:51,240 --> 00:14:53,959 Speaker 1: about, like, the Powell memo that sort of pushed more 248 00:14:53,960 --> 00:14:57,120 Speaker 1: think tanks, more lobbying in favor of big business, regardless 249 00:14:57,160 --> 00:15:00,000 Speaker 1: of the facts. So I think there's always this 250 00:15:00,040 --> 00:15:04,319 Speaker 1: seeming call for balance when balance isn't warranted, um, which 251 00:15:04,360 --> 00:15:06,200 Speaker 1: again is sort of what we've been talking about, where 252 00:15:06,200 --> 00:15:09,120 Speaker 1: you have fifty articles about climate change and then fifty 253 00:15:09,160 --> 00:15:12,720 Speaker 1: articles denying climate change. But that's not balance, that's the 254 00:15:12,760 --> 00:15:17,160 Speaker 1: illusion of balance in the face of what reality actually is, right, 255 00:15:17,320 --> 00:15:20,960 Speaker 1: because denying climate change is just lying.
Well, you're not 256 00:15:21,040 --> 00:15:24,440 Speaker 1: being balanced at all. You're, you're putting something out there 257 00:15:24,520 --> 00:15:26,720 Speaker 1: that isn't true. Yeah. And so you have these, I 258 00:15:26,720 --> 00:15:30,080 Speaker 1: think, a lot of these institutions, like the New York Times, 259 00:15:30,320 --> 00:15:34,680 Speaker 1: who are being accused of this bias, and they're like, well, 260 00:15:34,720 --> 00:15:36,040 Speaker 1: you don't have any of these voices and any of 261 00:15:36,080 --> 00:15:38,560 Speaker 1: these voices, so they're trying to create a balance that 262 00:15:38,600 --> 00:15:42,600 Speaker 1: doesn't actually exist. They're getting all these columnists to write 263 00:15:42,600 --> 00:15:44,600 Speaker 1: about how we need more nukes, and how climate change 264 00:15:44,600 --> 00:15:46,960 Speaker 1: isn't real, how all the, all the college campuses are 265 00:15:47,000 --> 00:15:48,800 Speaker 1: just a bunch of lib snowflakes, while I'm going to 266 00:15:48,920 --> 00:15:53,000 Speaker 1: accuse some critic of being Hitler. And that's why we 267 00:15:53,200 --> 00:15:56,800 Speaker 1: are sort of here now. It seems more criticizing their 268 00:15:56,920 --> 00:16:02,080 Speaker 1: articles about war, less criticizing liberal... Yeah. And like, again, 269 00:16:02,120 --> 00:16:03,800 Speaker 1: like, I'm not gonna say there's no liberal bias and 270 00:16:03,840 --> 00:16:07,320 Speaker 1: things like that, but this is sort of what we're 271 00:16:07,320 --> 00:16:09,960 Speaker 1: getting at. Well...
And I think also, like, a big 272 00:16:10,040 --> 00:16:12,880 Speaker 1: chunk of the liberal bias that, like, doesn't get written 273 00:16:12,920 --> 00:16:16,479 Speaker 1: down as bias is actually their bias towards wanting to 274 00:16:16,520 --> 00:16:19,080 Speaker 1: be that fucking kind of Cronkite voice, like, 275 00:16:19,080 --> 00:16:22,240 Speaker 1: like, that, that leads them to giving weight to sides 276 00:16:22,240 --> 00:16:25,480 Speaker 1: that ought not be given weight, just because they want 277 00:16:25,480 --> 00:16:28,240 Speaker 1: to be seen as being fair, because that's more important 278 00:16:28,240 --> 00:16:32,400 Speaker 1: to them than accurately reporting on the truth, because getting 279 00:16:32,400 --> 00:16:35,600 Speaker 1: attacked is, like, they just can't handle it. And I 280 00:16:35,600 --> 00:16:37,120 Speaker 1: think part of that has to do with 281 00:16:37,120 --> 00:16:39,640 Speaker 1: the fact that a lot of our most prominent voices 282 00:16:39,640 --> 00:16:43,480 Speaker 1: in mainstream journalism are rich kids who got the job 283 00:16:43,480 --> 00:16:45,800 Speaker 1: because they were able to take an unpaid internship, who 284 00:16:45,800 --> 00:16:47,720 Speaker 1: went to j school and learned all the things you 285 00:16:47,800 --> 00:16:50,920 Speaker 1: learn at j school, which often doesn't interact with, like, 286 00:16:50,960 --> 00:16:54,480 Speaker 1: what actually makes a good journalist, and whose family is 287 00:16:54,520 --> 00:16:57,480 Speaker 1: tied into, like, the media industry somehow. Like, it's, it's 288 00:16:57,520 --> 00:17:01,280 Speaker 1: a bunch of people who, when they actually get, like, attacked, 289 00:17:01,480 --> 00:17:04,320 Speaker 1: can't push back, because they don't have any kind of... 290 00:17:04,360 --> 00:17:08,040 Speaker 1: Like, they're, they're gormless at a fundamental level, and that 291 00:17:08,280 --> 00:17:10,760 Speaker 1: is not good. Yeah.
Right. As soon as they're accused 292 00:17:10,760 --> 00:17:12,120 Speaker 1: of that bias, they're like, oh, I guess, I got, 293 00:17:12,200 --> 00:17:14,720 Speaker 1: I got, I guess we, we have to fix that, 294 00:17:14,840 --> 00:17:17,800 Speaker 1: I gotta overcorrect. Yeah, it's all... Yeah, it's a lot 295 00:17:17,840 --> 00:17:21,040 Speaker 1: of overcorrection, and in general just saying, saying what your 296 00:17:21,080 --> 00:17:23,520 Speaker 1: narrative is. The head of the New York Times recently 297 00:17:23,960 --> 00:17:25,960 Speaker 1: said, like, oh, we focused on Russia for a couple 298 00:17:25,960 --> 00:17:27,480 Speaker 1: of years, and now we're not going to do that, 299 00:17:27,520 --> 00:17:29,080 Speaker 1: because that, you know, that didn't work out, so now 300 00:17:29,080 --> 00:17:31,680 Speaker 1: we're gonna focus on race. And this idea of, like, 301 00:17:32,440 --> 00:17:35,480 Speaker 1: choosing your narrative for, like, the next year and a half? Well, 302 00:17:35,520 --> 00:17:38,640 Speaker 1: that's not, that's not, that's not journalism. You, like, 303 00:17:39,040 --> 00:17:41,320 Speaker 1: you research, you react, you analyze and stuff. You don't 304 00:17:41,320 --> 00:17:43,640 Speaker 1: just say, like, this is, this is our narrative now. 305 00:17:43,960 --> 00:17:46,880 Speaker 1: Um, so I don't care for that. But speaking of 306 00:17:47,120 --> 00:17:50,280 Speaker 1: people airing personal grievances in papers of record, and wanting balance 307 00:17:50,320 --> 00:17:52,600 Speaker 1: except when it comes to the left: the Washington Post 308 00:17:52,720 --> 00:17:57,160 Speaker 1: today, Thursday, September twelfth, ran an op ed called I 309 00:17:57,280 --> 00:18:00,440 Speaker 1: Like Elizabeth Warren. Too Bad
She's a Hypocrite. And I'm 310 00:18:00,480 --> 00:18:03,840 Speaker 1: gonna, I'm gonna briefly read this thing, and I want 311 00:18:03,880 --> 00:18:05,840 Speaker 1: you to stop me when you think that I've made 312 00:18:05,920 --> 00:18:10,639 Speaker 1: my point. Warren attacked former Vice President Joe Biden, off 313 00:18:10,640 --> 00:18:12,320 Speaker 1: to a good start, Liz, personally, this is me saying, 314 00:18:12,600 --> 00:18:17,000 Speaker 1: good, for holding a kickoff fundraiser in Philadelphia in April, 315 00:18:17,160 --> 00:18:20,639 Speaker 1: which she criticized as a swanky, private fundraiser for wealthy 316 00:18:20,720 --> 00:18:23,760 Speaker 1: donors in an email to supporters the next day. Well, 317 00:18:24,280 --> 00:18:28,480 Speaker 1: I helped organize that affair, and I thought her attack 318 00:18:28,680 --> 00:18:32,040 Speaker 1: was extremely hypocritical, because nearly twenty of us who attended 319 00:18:32,040 --> 00:18:34,040 Speaker 1: the Biden fundraiser had also given her two thousand dollars or 320 00:18:34,119 --> 00:18:38,480 Speaker 1: more at closed door fundraisers in swanky locations. You didn't stop me, 321 00:18:38,840 --> 00:18:41,400 Speaker 1: but if you didn't catch that: well, I helped organize 322 00:18:41,440 --> 00:18:43,840 Speaker 1: that affair. This is a person who, just, like, they 323 00:18:44,000 --> 00:18:46,320 Speaker 1: organized the thing and they're upset, so they're writing an 324 00:18:46,400 --> 00:18:49,280 Speaker 1: article about it. And like, we will, I'm sure, in 325 00:18:49,359 --> 00:18:52,760 Speaker 1: other episodes talk about Elizabeth Warren and her donors, but like, 326 00:18:53,000 --> 00:18:58,640 Speaker 1: come on, Washington Post. Uh, that's crazy, right? Like, it's, 327 00:18:58,680 --> 00:19:01,840 Speaker 1: it's this person who's totally involved, and at least they 328 00:19:01,880 --> 00:19:04,719 Speaker 1: mentioned it. Um.
They don't always do that, but they 329 00:18:04,800 --> 00:18:07,600 Speaker 1: seemingly have a huge bias against anything on the left 330 00:18:07,880 --> 00:18:11,919 Speaker 1: of the establishment Democratic Party, especially Bernard Sanders. In twenty sixteen they 331 00:18:11,960 --> 00:18:16,639 Speaker 1: ran sixteen negative articles about him in sixteen hours. Yeah, recently, 332 00:18:16,760 --> 00:18:19,639 Speaker 1: but I bet they ran sixteen positive articles about him 333 00:18:19,720 --> 00:18:24,679 Speaker 1: in sixteen hours. I bet they did. Yeah, I bet 334 00:18:24,760 --> 00:18:29,880 Speaker 1: that proved their balance. Recently, Glenn Kessler gave Bernie Sanders 335 00:18:29,960 --> 00:18:33,040 Speaker 1: three Pinocchios over a claim that five hundred thousand people 336 00:18:33,080 --> 00:18:36,399 Speaker 1: go bankrupt every year because they can't pay outrageous medical bills. Um, 337 00:18:36,560 --> 00:18:39,280 Speaker 1: so he fact checked this, gave it three Pinocchios. The 338 00:18:39,400 --> 00:18:44,680 Speaker 1: claim was proven to be true within Glenn's article. His 339 00:18:44,880 --> 00:18:47,600 Speaker 1: source for the fact check said that it was true. 340 00:18:48,280 --> 00:18:52,160 Speaker 1: That's so, I mean, but it's kind of like, guys, 341 00:18:52,280 --> 00:18:56,520 Speaker 1: if you multiply negative one times negative one times negative one, 342 00:18:56,920 --> 00:19:00,119 Speaker 1: you get positive three. I don't understand math, but I 343 00:19:00,200 --> 00:19:02,680 Speaker 1: think that means that three Pinocchios equals truth. Well, 344 00:19:02,840 --> 00:19:07,680 Speaker 1: he he lost a couple of Pinocchios just because they 345 00:19:08,280 --> 00:19:11,600 Speaker 1: don't like it, you know, something about Bernie, just 346 00:19:11,720 --> 00:19:14,760 Speaker 1: like knock those Pinocchios off, but also it's Bernie, so 347 00:19:15,119 --> 00:19:17,480 Speaker 1: take two off. Yeah.
Maybe that's something to do with 348 00:19:17,520 --> 00:19:19,280 Speaker 1: the Washington Post being owned by the richest man on 349 00:19:19,320 --> 00:19:24,760 Speaker 1: the planet, Jeff Bezos, somebody Bernie's criticizing all the time. 350 00:19:24,920 --> 00:19:28,800 Speaker 1: I don't know. Maybe that's like suggesting that I skew 351 00:19:28,960 --> 00:19:32,040 Speaker 1: my coverage in favor of dick pills because they support 352 00:19:32,119 --> 00:19:36,480 Speaker 1: my podcast. What else did they support? Yeah, they support 353 00:19:36,880 --> 00:19:43,760 Speaker 1: a healthy erectile balance, they support blood flow. That's unbiased. 354 00:19:44,240 --> 00:19:46,520 Speaker 1: There are claims that Bezos is pretty hands off with the 355 00:19:46,560 --> 00:19:49,520 Speaker 1: Washington Post. I don't know. Maybe he is. Um, that 356 00:19:49,600 --> 00:19:53,920 Speaker 1: doesn't mean that there is not this sort of push. Again, 357 00:19:53,960 --> 00:19:55,359 Speaker 1: we see it. We see the New York Times too, 358 00:19:55,359 --> 00:19:58,120 Speaker 1: we see the established media in general. One quick note. 359 00:19:58,160 --> 00:20:01,760 Speaker 1: Also today, September twelve, twenty nineteen, the Washington Post published 360 00:20:01,760 --> 00:20:05,680 Speaker 1: the last issue of their Express publication. Because you know, 361 00:20:06,040 --> 00:20:09,600 Speaker 1: money, digital media, sort of, we're all going under. The cover, 362 00:20:09,720 --> 00:20:12,840 Speaker 1: it says hope you enjoy your stinking phones, um, as if 363 00:20:12,920 --> 00:20:14,800 Speaker 1: the Washington Post isn't owned by the richest man on 364 00:20:14,840 --> 00:20:18,040 Speaker 1: the planet, right. But that's the gist. I just think 365 00:20:18,080 --> 00:20:20,399 Speaker 1: we should be wary of things like that, of 366 00:20:20,640 --> 00:20:24,800 Speaker 1: specifically their editorial columns. You'll see a lot of disclaimers.
367 00:20:25,000 --> 00:20:26,720 Speaker 1: One thing I've noticed a lot of There are a 368 00:20:26,760 --> 00:20:29,479 Speaker 1: lot of articles that are like, oh, the Democrats are 369 00:20:29,480 --> 00:20:31,080 Speaker 1: going too far left for me to vote for them. 370 00:20:31,760 --> 00:20:34,480 Speaker 1: But then you're like, oh, it's written by like Dick Cheney, 371 00:20:34,920 --> 00:20:37,520 Speaker 1: Ari Fleischer or something like that. Like, were these 372 00:20:37,680 --> 00:20:40,080 Speaker 1: people going to vote for them anyway? It's this sort 373 00:20:40,119 --> 00:20:42,280 Speaker 1: of like taking a moderate or a centrist or a 374 00:20:42,359 --> 00:20:46,600 Speaker 1: Republican and saying we're going nuts, which I think is 375 00:20:46,640 --> 00:20:49,760 Speaker 1: a bit of a dishonest framing. I agree with you completely. Uh, 376 00:20:49,840 --> 00:20:53,320 Speaker 1: we got to take a quick break for things. Okay, 377 00:20:53,760 --> 00:20:58,200 Speaker 1: you appear to be done speaking. I thought you said 378 00:20:58,200 --> 00:21:00,720 Speaker 1: I love crepes, but we're not love crepes. I love 379 00:21:00,760 --> 00:21:02,520 Speaker 1: crepes too, but that's not I assume that's not what 380 00:21:02,600 --> 00:21:14,280 Speaker 1: we're selling. I'm selling that right now. Buy crepes together everything, 381 00:21:14,840 --> 00:21:23,440 Speaker 1: don't don't We have returned from break now. Wow, those 382 00:21:23,480 --> 00:21:27,119 Speaker 1: were great ads. I loved them. I bought them all again. 383 00:21:27,680 --> 00:21:30,960 Speaker 1: You're gonna run out of money on them, but that money 384 00:21:31,080 --> 00:21:33,719 Speaker 1: is coming back to you in a way. So all right, 385 00:21:33,880 --> 00:21:35,680 Speaker 1: we got a lot to get through.
I'm gonna be 386 00:21:35,800 --> 00:21:37,919 Speaker 1: pretty quick on what I'm here to talk about now, 387 00:21:38,200 --> 00:21:42,040 Speaker 1: which is Pod Save America. You guys are familiar. That's the 388 00:21:42,119 --> 00:21:44,119 Speaker 1: goal of this podcast, right? Are we, are 389 00:21:44,160 --> 00:21:46,359 Speaker 1: we trying to save or destroy America? I have I 390 00:21:46,480 --> 00:21:49,040 Speaker 1: have forgotten. I think destroy America so that we can 391 00:21:49,080 --> 00:21:50,760 Speaker 1: build it back up again in the way that we 392 00:21:50,960 --> 00:21:54,760 Speaker 1: prefer if we if like, if we have time, I mean, 393 00:21:54,800 --> 00:21:57,200 Speaker 1: I guess like, well, well, well we'll pencil that in 394 00:21:57,560 --> 00:22:00,159 Speaker 1: after destroying it, like if we if we wind up 395 00:22:00,280 --> 00:22:02,879 Speaker 1: you know, you know, all podcasts are an evolution, so 396 00:22:03,000 --> 00:22:05,480 Speaker 1: we'll see what happens if we want to rebuild it 397 00:22:05,560 --> 00:22:07,960 Speaker 1: back up. Yeah, because part of me is thinking that 398 00:22:08,040 --> 00:22:11,199 Speaker 1: after we destroy America, that's probably the time for us 399 00:22:11,280 --> 00:22:14,760 Speaker 1: to launch our podcast about Frasier where we break down 400 00:22:14,840 --> 00:22:18,680 Speaker 1: every episode in exhaustive, forty five minute detail. I love 401 00:22:18,800 --> 00:22:22,280 Speaker 1: that idea. Yeah, so maybe we'll destroy America. We'll talk 402 00:22:22,320 --> 00:22:25,240 Speaker 1: about Frasier for three years, and then we'll rebuild it. Okay, 403 00:22:25,440 --> 00:22:27,919 Speaker 1: that's a good plan. You guys on board? All right, 404 00:22:28,160 --> 00:22:30,159 Speaker 1: I'm gonna push for saving America. I don't want to.
405 00:22:30,160 --> 00:22:31,639 Speaker 1: I don't want to rock the boat here, but I 406 00:22:31,760 --> 00:22:34,240 Speaker 1: think I'm going to convince you both by the end 407 00:22:34,280 --> 00:22:37,720 Speaker 1: of the year that we should save it. Robert, Okay, 408 00:22:37,880 --> 00:22:40,879 Speaker 1: Cody's out. All right, let's see, can we, can we 409 00:22:41,320 --> 00:22:45,399 Speaker 1: marketplace ideas? Huh? Can we get the ghost of the 410 00:22:45,520 --> 00:22:49,560 Speaker 1: dog that played Eddie on Frasier? All right, he'll 411 00:22:49,600 --> 00:22:52,440 Speaker 1: be a perfect fit. Um. All right, we're gonna 412 00:22:52,440 --> 00:22:58,840 Speaker 1: talk about America specifically. Sophie's right here, Sophie. Oh, Sophie's 413 00:22:58,920 --> 00:23:01,560 Speaker 1: right here. I thought you would have hired Anderson. Wait are 414 00:23:01,600 --> 00:23:04,360 Speaker 1: you saying Sophie is the ghost of the dog from Fraser? 415 00:23:04,600 --> 00:23:05,960 Speaker 1: I was just offended that you would hire a 416 00:23:06,080 --> 00:23:10,480 Speaker 1: dog that isn't my own, mine. Sophie, it's the ghost 417 00:23:10,560 --> 00:23:13,119 Speaker 1: of a dog. I know, but your sibling is in 418 00:23:13,240 --> 00:23:15,119 Speaker 1: this room and you're not going to hire him. And the 419 00:23:15,800 --> 00:23:22,399 Speaker 1: dog has a lot of star power. But come on, 420 00:23:23,080 --> 00:23:26,879 Speaker 1: I've never read employment law or talked to an employment lawyer, 421 00:23:26,960 --> 00:23:29,440 Speaker 1: but I do think we're legally bound to hire ghosts 422 00:23:29,520 --> 00:23:32,159 Speaker 1: before living dogs. But we'll check into that, and we'll 423 00:23:32,200 --> 00:23:35,920 Speaker 1: circle back. Okay, we won't go stack of two 424 00:23:36,040 --> 00:23:43,520 Speaker 1: thousand four. All right, America, Pod Save Bros.
Because they're bros, right, 425 00:23:45,680 --> 00:23:53,919 Speaker 1: common phrase, God Save America? Yeah, alright, okay, that's another. 426 00:23:54,240 --> 00:23:56,800 Speaker 1: In the first episode, we talked about Jake Tapper writing 427 00:23:56,880 --> 00:23:59,919 Speaker 1: himbo and then calling it a day. They were brain 428 00:24:00,080 --> 00:24:03,480 Speaker 1: storming names, and somebody said, what about Pod Save America? 429 00:24:03,560 --> 00:24:07,280 Speaker 1: And that was it, the day's work was done. I know. Okay, 430 00:24:07,880 --> 00:24:11,800 Speaker 1: Uh so Pod Save America is comprised of Jon Favreau, not 431 00:24:11,960 --> 00:24:16,520 Speaker 1: that Jon Favreau, John Lovett, not that John Lovett, Tommy Vietor, 432 00:24:16,680 --> 00:24:21,639 Speaker 1: and Dan Pfeiffer during the, wait, did 433 00:24:21,680 --> 00:24:26,080 Speaker 1: you say dog Pfeiffer? Yes, mad dog Pfeiffer, you know, 434 00:24:26,200 --> 00:24:29,480 Speaker 1: as we like to call him. Uh. During the election, 435 00:24:29,520 --> 00:24:33,080 Speaker 1: they hosted Keeping It Sixteen Hundred, also not a great name. 436 00:24:33,680 --> 00:24:37,520 Speaker 1: But then after after Trump won, they rebranded, as you said, 437 00:24:37,880 --> 00:24:41,160 Speaker 1: you guessed it, Pod Save America, and started their own 438 00:24:41,240 --> 00:24:45,960 Speaker 1: podcast network called Crooked Media. Yes, that is a quippy 439 00:24:46,480 --> 00:24:50,920 Speaker 1: Hillary reference. Very cool. Um. I think it's important to 440 00:24:51,080 --> 00:24:54,760 Speaker 1: distinguish that they do have some interesting shows on that network. 441 00:24:55,160 --> 00:24:57,720 Speaker 1: Pod Save the People is hosted by civil rights activist 442 00:24:57,760 --> 00:25:01,119 Speaker 1: DeRay Mckesson, and that's very well done.
And doesn't 443 00:25:01,160 --> 00:25:04,240 Speaker 1: have the same uh you know, kind of bro-y perspective 444 00:25:04,560 --> 00:25:08,200 Speaker 1: of the Pod Save guys. Uh, With Friends Like These 445 00:25:08,280 --> 00:25:10,800 Speaker 1: dives into understanding people on either side of the aisle. 446 00:25:11,000 --> 00:25:12,679 Speaker 1: And I've been known to listen to the all female 447 00:25:12,760 --> 00:25:17,359 Speaker 1: panel show Hysteria, which How Stuff Works host Dana Schwartz 448 00:25:17,520 --> 00:25:19,680 Speaker 1: is a frequent guest on. So they do have some 449 00:25:19,800 --> 00:25:22,560 Speaker 1: good programs to offer. But we're here to talk about 450 00:25:22,560 --> 00:25:26,040 Speaker 1: Pod Save America specifically. There's too many of them to go 451 00:25:26,160 --> 00:25:29,240 Speaker 1: into all of their backgrounds, but suffice to say that 452 00:25:29,280 --> 00:25:32,480 Speaker 1: they all met while working for the Obama administration. Pfeiffer 453 00:25:32,600 --> 00:25:35,880 Speaker 1: was the communications director, Vietor was the assistant press secretary 454 00:25:35,960 --> 00:25:39,120 Speaker 1: and national security spokesperson. Favreau and Lovett were both 455 00:25:39,160 --> 00:25:42,880 Speaker 1: speechwriters. Uh. One little thing that I found interesting, uh, 456 00:25:43,359 --> 00:25:45,800 Speaker 1: was I guess at one point Lovett secretly officiated 457 00:25:45,880 --> 00:25:50,240 Speaker 1: the first gay wedding at the White House, but because 458 00:25:50,560 --> 00:25:53,240 Speaker 1: it was during a time when Obama's views on same 459 00:25:53,280 --> 00:25:56,280 Speaker 1: sex marriage were still ambiguous, he straight up did not 460 00:25:56,520 --> 00:25:59,600 Speaker 1: have permission, like they told him not to. In several interviews, 461 00:25:59,680 --> 00:26:02,000 Speaker 1: Lovett has been quoted as saying, we were very nervous.
462 00:26:02,040 --> 00:26:03,720 Speaker 1: They were nervous because they were getting married, and I 463 00:26:03,800 --> 00:26:05,560 Speaker 1: was nervous because I snuck into my boss's house to 464 00:26:05,600 --> 00:26:08,680 Speaker 1: perform a wedding against his wishes in his backyard. Um, 465 00:26:09,240 --> 00:26:11,400 Speaker 1: that's actually pretty cool. I know, it is pretty cool. 466 00:26:12,600 --> 00:26:14,920 Speaker 1: I actually did the same thing in Jack O'Brien's back 467 00:26:15,520 --> 00:26:17,840 Speaker 1: but the people who got married were unaware of the 468 00:26:17,880 --> 00:26:22,480 Speaker 1: wedding too, so doubling down. So you're really nervous. It 469 00:26:22,600 --> 00:26:24,600 Speaker 1: was the same but different. How are they going to 470 00:26:24,680 --> 00:26:28,000 Speaker 1: react when I tell them that they're married? Now? One 471 00:26:28,320 --> 00:26:31,480 Speaker 1: tidbit that I found less fun slightly is it? Apparently 472 00:26:31,560 --> 00:26:35,240 Speaker 1: at one point, Jon Favreau posted a picture on Facebook 473 00:26:35,520 --> 00:26:38,439 Speaker 1: of him posing with a cardboard cutout of Hillary Clinton 474 00:26:38,520 --> 00:26:42,760 Speaker 1: and grabbing her boob. As scandals go, that's not a 475 00:26:42,840 --> 00:26:46,360 Speaker 1: big one, but it is pretty BROI And again, that's 476 00:26:46,400 --> 00:26:49,919 Speaker 1: the thesis that I keep coming back to about. I mean, 477 00:26:49,960 --> 00:26:52,080 Speaker 1: this is just a general bob. That's my opinion listening 478 00:26:52,119 --> 00:26:54,680 Speaker 1: to the show. It's why I don't listen to it anymore. 479 00:26:55,359 --> 00:26:57,680 Speaker 1: So I don't know how if they've evolved a bit 480 00:26:57,840 --> 00:27:02,680 Speaker 1: since then, But that's kind of a consensus that people have. Um, 481 00:27:02,880 --> 00:27:04,879 Speaker 1: but people love to tune in here their perspective. 
I mean, 482 00:27:04,920 --> 00:27:07,359 Speaker 1: they've got a really unique one. They they've been inside 483 00:27:07,440 --> 00:27:09,399 Speaker 1: the party for a long time, They've seen what a 484 00:27:09,480 --> 00:27:12,600 Speaker 1: functioning White House looks like. Um. But they get a 485 00:27:12,640 --> 00:27:16,000 Speaker 1: lot of criticism, uh, based on the very valid perception 486 00:27:16,080 --> 00:27:19,080 Speaker 1: that because of this access that they have, you know, 487 00:27:19,280 --> 00:27:21,639 Speaker 1: they have been in the Democratic Party for 488 00:27:21,720 --> 00:27:24,480 Speaker 1: a long time, they have access, that they, that 489 00:27:24,560 --> 00:27:28,040 Speaker 1: they have some blind spots criticizing the establishment and the 490 00:27:28,119 --> 00:27:31,680 Speaker 1: people in the establishment, or that they don't want to 491 00:27:31,800 --> 00:27:35,040 Speaker 1: lose that access, you know, by pushing too hard, by 492 00:27:35,080 --> 00:27:39,480 Speaker 1: pushing too many buttons. Um. For example, when you listen 493 00:27:39,520 --> 00:27:41,800 Speaker 1: to Pod Save America, you're gonna get no real criticisms 494 00:27:41,880 --> 00:27:45,160 Speaker 1: of Obama. They were a part of the Obama administration, 495 00:27:45,680 --> 00:27:50,640 Speaker 1: even during like we're in a period right now where 496 00:27:50,680 --> 00:27:54,320 Speaker 1: we're seeing these mass child detentions and that's the policy 497 00:27:54,400 --> 00:27:57,560 Speaker 1: that did start under the Obama administration. Yes, Trump has 498 00:27:57,600 --> 00:28:00,280 Speaker 1: made it worse and changed it and, like, actually 499 00:28:00,280 --> 00:28:03,760 Speaker 1: ramped this whole thing up.
But there's a real conversation 500 00:28:03,880 --> 00:28:08,280 Speaker 1: that needs to happen about the Democratic Party, 501 00:28:08,320 --> 00:28:11,320 Speaker 1: Obama's administration, what role they played in all 502 00:28:11,400 --> 00:28:13,439 Speaker 1: of this. Um. But whenever something like that comes up, 503 00:28:13,480 --> 00:28:16,800 Speaker 1: they dodge the issue. Same thing with talking about drone strikes. 504 00:28:17,920 --> 00:28:21,480 Speaker 1: The drone program that, you know, started under Obama, 505 00:28:21,760 --> 00:28:23,840 Speaker 1: it continues to this day, and nobody talks about it. 506 00:28:24,520 --> 00:28:29,159 Speaker 1: And it's just, and that's very frustrating. And 507 00:28:29,200 --> 00:28:31,400 Speaker 1: that's something that's frustrating to me in general when we're 508 00:28:31,440 --> 00:28:35,160 Speaker 1: going through this whole primary process. I think it's 509 00:28:35,200 --> 00:28:41,040 Speaker 1: fair to criticize Obama. It doesn't make him a completely 510 00:28:41,120 --> 00:28:43,520 Speaker 1: bad person. It doesn't mean that everything he did was bad. 511 00:28:43,600 --> 00:28:45,800 Speaker 1: But you have to be able to have some self reflection. 512 00:28:46,040 --> 00:28:47,880 Speaker 1: And that is the main thing for me that's missing 513 00:28:48,240 --> 00:28:52,640 Speaker 1: from Pod Save America. Um, you know, the whole thing. 514 00:28:52,720 --> 00:28:56,080 Speaker 1: Like just this week, Cody shared this with me. Uh. 515 00:28:56,520 --> 00:28:59,400 Speaker 1: But Jon Favreau tweeted a glowing endorsement of former UN 516 00:28:59,400 --> 00:29:01,600 Speaker 1: Ambassador Samantha Power's new book, The Education of 517 00:29:01,640 --> 00:29:04,800 Speaker 1: an Idealist, which is, I guess, a memoir that takes 518 00:29:04,800 --> 00:29:07,480 Speaker 1: a look at US foreign policy from the inside.
It 519 00:29:07,600 --> 00:29:10,760 Speaker 1: apparently talks a lot about Bashar al Assad and Russia, 520 00:29:10,840 --> 00:29:15,760 Speaker 1: but fails to acknowledge anything about Obama's drone program or 521 00:29:15,880 --> 00:29:19,600 Speaker 1: how the US armed al Qaeda in Syria, you know, 522 00:29:20,200 --> 00:29:23,160 Speaker 1: or about Yemen, you know, the stuff that's going on 523 00:29:23,200 --> 00:29:29,560 Speaker 1: in Yemen with America's support. So it's again just whitewashing, 524 00:29:29,720 --> 00:29:32,360 Speaker 1: you know, highlighting what they want and like ignoring the 525 00:29:32,560 --> 00:29:36,160 Speaker 1: actual issues that we have. And someone who helped, yeah, 526 00:29:36,400 --> 00:29:38,840 Speaker 1: a situation like Yemen, like she was involved with that, 527 00:29:39,040 --> 00:29:41,520 Speaker 1: and just being like, hey, check out this book. I 528 00:29:41,600 --> 00:29:45,080 Speaker 1: don't know. It's, it shouldn't be a source 529 00:29:45,320 --> 00:29:47,560 Speaker 1: you go to for news. It's a source where, like, 530 00:29:47,640 --> 00:29:49,880 Speaker 1: obviously, as you pointed out, they have some unique 531 00:29:49,920 --> 00:29:53,400 Speaker 1: perspective that's not not valuable. Like there's certainly value in 532 00:29:53,480 --> 00:29:56,320 Speaker 1: hearing from people who've worked inside a functioning White House. Absolutely, 533 00:29:56,440 --> 00:29:59,600 Speaker 1: but they're not journalists. They're a propaganda arm of the 534 00:29:59,640 --> 00:30:04,920 Speaker 1: Obama administration.
Yes, yes, that's why I originally did 535 00:30:05,080 --> 00:30:07,440 Speaker 1: listen to their show and liked it because most of 536 00:30:07,520 --> 00:30:10,320 Speaker 1: what they talked about was like related to like their 537 00:30:10,360 --> 00:30:12,160 Speaker 1: experience in the White House and like how the White 538 00:30:12,200 --> 00:30:14,640 Speaker 1: House functions, and it was interesting to get that perspective. 539 00:30:15,400 --> 00:30:18,680 Speaker 1: But then the election happened and they kept sort of 540 00:30:19,120 --> 00:30:21,680 Speaker 1: being wrong and like doubling down and like not sort 541 00:30:21,720 --> 00:30:24,280 Speaker 1: of reflecting on that, and it became way less, well, 542 00:30:24,320 --> 00:30:28,120 Speaker 1: but didn't didn't they keep it sixteen hundred? Oh no, 543 00:30:28,440 --> 00:30:31,680 Speaker 1: oh no, they didn't. They didn't. That might mean the 544 00:30:31,760 --> 00:30:36,720 Speaker 1: pod won't save America. I don't know, it won't. Uh yeah. 545 00:30:36,840 --> 00:30:39,720 Speaker 1: The Pod Save America team has also supported the United 546 00:30:39,760 --> 00:30:42,080 Speaker 1: States of Care. Um, I'm gonna let Cody talk a 547 00:30:42,080 --> 00:30:43,960 Speaker 1: bit about this because he knows more about it. But basically, 548 00:30:44,000 --> 00:30:46,880 Speaker 1: the United States of Care is an initiative that aims 549 00:30:46,920 --> 00:30:51,360 Speaker 1: to put healthcare over politics and to change the conversation. 550 00:30:51,640 --> 00:30:55,080 Speaker 1: I mean, health care is politics, but okay.
Also, the 551 00:30:55,120 --> 00:30:57,400 Speaker 1: council working on this initiative is comprised of a ton 552 00:30:57,440 --> 00:31:00,160 Speaker 1: of healthcare executives, people who have spent their lives ripping people 553 00:31:00,200 --> 00:31:03,160 Speaker 1: off and charging them for their lives, and they claim to 554 00:31:03,200 --> 00:31:06,280 Speaker 1: be a safe space to discuss common sense solutions. Uh. 555 00:31:06,640 --> 00:31:09,640 Speaker 1: And their mission statement says, the United States of Care 556 00:31:09,760 --> 00:31:12,480 Speaker 1: is a new movement to ensure that every single American 557 00:31:12,600 --> 00:31:17,080 Speaker 1: has access to quality, affordable healthcare, regardless of health status, 558 00:31:17,320 --> 00:31:21,920 Speaker 1: social need, or income. And that's a pretty conservative talking point. 559 00:31:22,200 --> 00:31:27,000 Speaker 1: Access is very different than having it, um, and there's been 560 00:31:27,000 --> 00:31:29,200 Speaker 1: a lot of pushback online from that, and they've waffled, 561 00:31:29,240 --> 00:31:31,200 Speaker 1: you know, they've come out as saying they're for Medicare 562 00:31:31,280 --> 00:31:34,120 Speaker 1: for all. But then you know, they start, you know, 563 00:31:34,160 --> 00:31:37,440 Speaker 1: supporting this, or they're tweeting, I'm a Medicare for all advocate. 564 00:31:37,520 --> 00:31:39,080 Speaker 1: This is from Jon Favreau, by the way. I'm a 565 00:31:39,120 --> 00:31:41,200 Speaker 1: Medicare for All advocate. I also want to make sure 566 00:31:41,200 --> 00:31:43,640 Speaker 1: that there's a path to get there, which involves building 567 00:31:43,640 --> 00:31:46,120 Speaker 1: a movement and persuading people who don't currently agree. So 568 00:31:46,200 --> 00:31:48,920 Speaker 1: it's a different approach. It's more of a centrist uh 569 00:31:49,240 --> 00:31:52,440 Speaker 1: you know, yeah, I mean it's yeah, super centrist.
I 570 00:31:53,320 --> 00:31:55,760 Speaker 1: find issue with a lot of this framing a lot 571 00:31:56,120 --> 00:31:58,920 Speaker 1: from, like, Favreau, where he's like, you know, and it 572 00:31:59,000 --> 00:32:02,640 Speaker 1: involves building a movement, persuading people. Well, Jon, I don't 573 00:32:02,640 --> 00:32:06,280 Speaker 1: know if you're familiar, but in twenty sixteen there was a movement. 574 00:32:06,960 --> 00:32:10,520 Speaker 1: You dismissed it, and now that it's a popular idea, 575 00:32:10,680 --> 00:32:12,720 Speaker 1: you're adopting it but still sort of couching it in 576 00:32:12,760 --> 00:32:17,120 Speaker 1: this language of access. Like, it's just like a movement happens 577 00:32:17,160 --> 00:32:19,480 Speaker 1: and a candidate says for two years healthcare is a 578 00:32:19,560 --> 00:32:21,520 Speaker 1: human right, and he said it so many times and 579 00:32:21,600 --> 00:32:25,640 Speaker 1: people now think that because it's true. Um, and now 580 00:32:25,720 --> 00:32:28,760 Speaker 1: they're sort of like piggybacking on that movement, but also 581 00:32:29,800 --> 00:32:32,520 Speaker 1: like to the detriment of the movement by, like, aligning 582 00:32:32,680 --> 00:32:35,880 Speaker 1: with, like, former Senator Bill Frist is involved in 583 00:32:35,920 --> 00:32:38,479 Speaker 1: this organization. A lot of private insurance companies are involved, 584 00:32:38,560 --> 00:32:41,920 Speaker 1: and you still have, like, Lovett doing these sorts 585 00:32:41,960 --> 00:32:44,200 Speaker 1: of rants about how we got to keep the private 586 00:32:44,320 --> 00:32:46,040 Speaker 1: health insurance if you like it and things like that, 587 00:32:46,320 --> 00:32:51,280 Speaker 1: which stops the actual momentum of the movement and persuading 588 00:32:51,360 --> 00:32:53,520 Speaker 1: people that this is a better path.
If you're saying 589 00:32:53,560 --> 00:32:54,840 Speaker 1: like we gotta do this, we gotta do this in 590 00:32:54,880 --> 00:32:57,080 Speaker 1: this incremental change, and like we gotta protect this, then 591 00:32:57,080 --> 00:32:59,440 Speaker 1: you're not actually persuading people of the thing that you 592 00:32:59,480 --> 00:33:02,920 Speaker 1: actually want, which is Medicare for all. So saying you're 593 00:33:02,920 --> 00:33:04,640 Speaker 1: for Medicare for all but doing all this other stuff 594 00:33:04,960 --> 00:33:07,520 Speaker 1: tells me that you're not, really. Another thing that people 595 00:33:07,560 --> 00:33:09,880 Speaker 1: find frustrating about Pod Save America is, again, that they have 596 00:33:09,960 --> 00:33:12,040 Speaker 1: access to all these candidates, but they rarely ask them 597 00:33:12,680 --> 00:33:17,720 Speaker 1: hard hitting questions, and when they do, they just 598 00:33:17,840 --> 00:33:19,680 Speaker 1: let the person, you know, get off the hook with 599 00:33:19,720 --> 00:33:21,840 Speaker 1: a pretty slippery answer.
I'm sure that has to do 600 00:33:22,000 --> 00:33:24,320 Speaker 1: again with the access question of not wanting to burn bridges, 601 00:33:24,400 --> 00:33:25,880 Speaker 1: wanting to be a place that people want to go, 602 00:33:26,320 --> 00:33:28,800 Speaker 1: but you're squandering an opportunity and they're not journalists, and 603 00:33:28,800 --> 00:33:33,360 Speaker 1: they're not journalists. Like, um, it takes, like, speaking as 604 00:33:33,400 --> 00:33:35,960 Speaker 1: someone who's who's done a lot of interviews, like there's 605 00:33:35,960 --> 00:33:38,120 Speaker 1: a thing you have to get over when you're kind 606 00:33:38,160 --> 00:33:40,360 Speaker 1: of new to it, where like we're trying to be 607 00:33:40,440 --> 00:33:44,200 Speaker 1: kind of non confrontational, um, just in civilized society, and 608 00:33:44,320 --> 00:33:46,440 Speaker 1: like you have to be willing to say things to 609 00:33:46,600 --> 00:33:50,280 Speaker 1: someone's face that will make them hate you. And if 610 00:33:50,320 --> 00:33:53,880 Speaker 1: you're not a journalist and not experienced doing that, you're 611 00:33:53,920 --> 00:33:55,760 Speaker 1: just not going to do it in a conversation. Like 612 00:33:55,840 --> 00:33:59,440 Speaker 1: you're just not, especially in that community. Especially, I mean, they're 613 00:34:00,400 --> 00:34:03,080 Speaker 1: friends, and like, having been there, I mean, sure, in 614 00:34:03,160 --> 00:34:05,200 Speaker 1: the interviews, but they also have big people come on. 615 00:34:05,240 --> 00:34:07,280 Speaker 1: They've had the presidential candidates coming on.
Love It or 616 00:34:07,360 --> 00:34:11,440 Speaker 1: Leave It, John Lovett's own show, does this, like, 617 00:34:11,760 --> 00:34:14,000 Speaker 1: Queen for a Day type segment where they'll ask them 618 00:34:14,000 --> 00:34:16,919 Speaker 1: a bunch of really intense questions like, as president, would 619 00:34:16,920 --> 00:34:19,120 Speaker 1: you recognize a hot dog as a sandwich? And like, 620 00:34:20,000 --> 00:34:23,839 Speaker 1: groundbreaking. Part 621 00:34:23,880 --> 00:34:26,680 Speaker 1: of me is curious about seeing how, uh, these very 622 00:34:26,760 --> 00:34:31,000 Speaker 1: serious people are in a less formal situation, but not really. 623 00:34:31,080 --> 00:34:33,279 Speaker 1: What's more important to me is to like take that 624 00:34:33,400 --> 00:34:36,960 Speaker 1: time to ask Amy Klobuchar something important. Right now, 625 00:34:37,160 --> 00:34:40,560 Speaker 1: they're there, you're there, you're there. Why do you, why 626 00:34:40,560 --> 00:34:44,480 Speaker 1: are you squandering this opportunity? Um, it's like having Donald 627 00:34:44,520 --> 00:34:47,239 Speaker 1: Trump on your show and like tousling his toupee 628 00:34:47,560 --> 00:34:50,719 Speaker 1: and like trying to humanize him. It is ignoring the 629 00:34:50,800 --> 00:34:53,440 Speaker 1: fact that, like he's calling a huge chunk of the 630 00:34:53,520 --> 00:34:58,359 Speaker 1: country vermin. Jimmy Fallon might as well be a, um. 631 00:34:58,680 --> 00:35:01,160 Speaker 1: And one little thing I want to say, this is 632 00:35,01,200 --> 00:35:05,520 Speaker 1: according to Cody. Oh, Lovett is the one that 633 00:35:05,600 --> 00:35:08,920 Speaker 1: suggested to Elizabeth Warren that she take the DNA test 634 00:35:09,400 --> 00:35:12,680 Speaker 1: that proved that she is one percent that bitch who 635 00:35:12,760 --> 00:35:15,200 Speaker 1: doesn't have very much Native American blood at all.
I 636 00:35:15,320 --> 00:35:16,960 Speaker 1: just wanted to do a Lizzo joke. I'm sorry, I 637 00:35:17,000 --> 00:35:19,960 Speaker 1: did hear that. It was a really bad call, and 638 00:35:20,040 --> 00:35:22,160 Speaker 1: it was the kind of really bad call you'd make 639 00:35:22,360 --> 00:35:24,799 Speaker 1: if, like... One of the things that makes these guys 640 00:35:24,920 --> 00:35:28,200 Speaker 1: so bad at actually analyzing a real presidential election is 641 00:35:28,280 --> 00:35:31,440 Speaker 1: that their only experience getting a guy elected was quote 642 00:35:31,480 --> 00:35:35,399 Speaker 1: unquote helping to get the most charismatic politician of any 643 00:35:35,440 --> 00:35:37,880 Speaker 1: of our lifetimes elected. Like, Barack Obama didn't need that 644 00:35:38,000 --> 00:35:43,200 Speaker 1: much help. Um, so, I don't trust that these 645 00:35:43,280 --> 00:35:45,560 Speaker 1: guys actually have good advice on how to get a 646 00:35:45,640 --> 00:35:49,799 Speaker 1: president elected because their previous experience is the most charismatic 647 00:35:49,920 --> 00:35:54,240 Speaker 1: man alive in the country got elected and they were around. Also, 648 00:35:54,520 --> 00:35:57,080 Speaker 1: Lovett didn't, Lovett didn't join the administration until 649 00:35:57,080 --> 00:36:01,239 Speaker 1: after. He was Hillary's, didn't get her elected either. Sure 650 00:36:01,320 --> 00:36:03,960 Speaker 1: did not, neither time. They also all worked for John Kerry, 651 00:36:04,280 --> 00:36:06,440 Speaker 1: So maybe not all of them, but most of them did. 652 00:36:06,920 --> 00:36:09,799 Speaker 1: I might say they are the worst people to ask 653 00:36:09,880 --> 00:36:11,680 Speaker 1: about how to get a president elected, and they might 654 00:36:11,719 --> 00:36:17,280 Speaker 1: be the worst people to try and save America.
Yeah, yeah, yeah, yeah. Okay, we're gonna take a real quick break again, and then we're gonna come back and hear what Robert has to say about stuff. We're gonna talk about Nate Silver. But first, enjoy these ads for silver that you put up your butt to cure diseases. We're selling butt silver, right, Sophie? That's... that's one of our sponsors. She's nodding yes. Okay, because she's confirmed. She's not just... confirmed it. Products! Together! Everything! We're back. Now, we have to get through this last bit quickly, so we're gonna have fewer interjections, which is why this is a good time for me to talk about Nate Silver, a man who Cody Johnston personally finds deeply engaging and, dare I say, powerfully erotic. Onto the story. If you were a politically engaged person who feared the rise of Donald Trump in two thousand sixteen, there is a better than even chance you spent a lot of time refreshing the website for Nate Silver's poll aggregating thingamajig, five thirty eight. Presumably many folks did the same thing throughout the campaign, sighing in ease as Trump's chances fell and cringing as they rose.
On the surface, it seems like a totally down to business, numbers focused endeavor that can be relied upon for objective analysis of the polls. The reality, of course, is murkier. Nathaniel Read Silver was born in January of nineteen seventy eight. He fell in love with baseball from an early age. If you aren't a fan of the sport, this may come as a surprise to you, but baseball is essentially Dungeons and Dragons with an optional component where human beings get out in the field and do stuff. To many, many fans, baseball is about the numbers: calculating run averages and turning players into collections of statistics. In a two thousand six Chicago Tribune profile, William Hageman wrote this about Silver, quote: Silver caught the baseball bug when he was six, growing up in East Lansing, Michigan. It was nineteen eighty four, the year the Detroit Tigers won the World Series. The Tigers became his team, and baseball his sport. And if there's anything that goes hand in glove with baseball, it's numbers, another of Silver's childhood interests. It was always more interesting to apply it to batting averages than algebra class,
Silver said. Now, as a high school student, Nate was an avid debater and a state level champion. He was not a big fan of public speaking, but he loved delving into research and crafting arguments. His childhood in East Lansing, Michigan was pretty normal, outside of his love for baseball, which is objectively a red flag. He wrote for his high school newspaper and showed an interest in both journalism and economics. Nate went to the University of Chicago, where he thrived and eventually got a degree in economics. He spent some time working as a consultant for a gigantic accounting firm in the early aughts, but his true love remained baseball. He spent most of his free time crafting a system for projecting the performance of professional baseball players, which he named the Player Empirical Comparison and Optimization Test Algorithm, or PECOTA, a reference to a relatively obscure infielder named Bill Pecota. Yeah, he's a huge nerd. Um, huge nerd. Also, you said he was not a big fan of public speaking. That's why he loves Twitter.
In that Tribune article, Major League Baseball consultant Gary Huckabay described how PECOTA worked. Quote: Nate isolated five components of how a player provides value. Then PECOTA looks through the history to find similar players who have had similar performance shapes at the same age. It takes a look at all those guys in the past who have been similar, and takes a look at what their careers did after a certain point. Keep that in mind, because it will be relevant to Nate's later work in politics. Now, PECOTA was successful enough that Silver eventually sold it to a site called Baseball Prospectus. He got a job there as a managing editor and did that for a while. As George W. Bush's presidency wound down and the two thousand eight election campaign started winding up, Nate Silver found himself focusing more and more on national politics. He started an anonymous blog under the name Poblano and provided his analysis of the polls, which often differed from the conventional wisdom of the pundit class. Nate's hot takes were based in large part on applying the lessons he learned analyzing sports to politics.
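The comparable-players idea Huckabay describes can be sketched in a few lines. This is a toy illustration, not PECOTA's real math; every number below is made up, and "five components" is reduced to a generic five-dimensional performance profile:

```python
import numpy as np

# Toy sketch of a comparable-players projection, as described above.
# All data here is synthetic and for illustration only.
rng = np.random.default_rng(0)
history = rng.normal(size=(200, 5))          # 200 past players x 5 performance components
future_avg = rng.uniform(0.220, 0.320, 200)  # how each of those players hit afterward

def project(player, history, future_avg, k=10):
    """Project a player by averaging what the k most similar
    historical players (by Euclidean distance) went on to do."""
    dists = np.linalg.norm(history - player, axis=1)
    nearest = np.argsort(dists)[:k]
    return future_avg[nearest].mean()

print(round(project(np.zeros(5), history, future_avg), 3))
```

The key move, which carries over to the political forecasting discussed next, is projecting an individual's future from the recorded futures of historically similar cases.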
I'm going to quote from a New York magazine profile now: Sports and politics offer several obvious parallels. Both involve a competition, essentially, between two teams. Both involve reams of statistical data available for devotees to sort through, or, more commonly, for intermediary experts to sort through, analyze, and then interpret for you. In baseball, these stats track player performance: how many hits a player gets, and when, and against what kind of pitchers. While in politics, the data tracks voter preferences: who do you like, and why? What kind of choice are you going to make on election day? These stats on their face seem pretty straightforward. If a hitter hits three hundred, he's valuable. If Obama opens up a six point national lead, he's in good shape. So in May of two thousand eight, Nate had his first major win. Pollsters had been predicting Hillary Clinton would win big in Indiana and only lose by eight points in North Carolina. This was seen as evidence that her campaign was finally recovering from the temporary lead Obama had accrued over the past few months.
Silver, blogging as Poblano, only got around eight hundred daily visitors to his site at the time. That changed when he published an in depth breakdown of Clinton's polling numbers, which suggested she would lose by sixteen points in North Carolina and only win by two in Indiana. The final numbers were a one point win for Clinton in Indiana and a fifteen point loss in North Carolina. So Nate had been almost exactly right, very close, and every major pundit had been wrong. Nate went on to correctly predict forty nine out of fifty states in the two thousand eight election, or at least that's how things were reported. Time put him on their list of the hundred most influential people for doing that. Two years later, Nate's new website, five thirty eight, was bought by The New York Times, netting him a pile of money. Now, the reality of that award he got for predicting the outcomes of forty nine of the fifty states was that Nate had not, in fact, predicted anything.
His model had calculated what it saw as the most likely outcomes for all fifty states, and in forty nine of those cases, the end result had jelled with that calculation. There's a difference, though. Nate had not said, Obama is going to win all these states. Instead, he'd said, my analysis of these polls shows Obama ahead in all these states. And the media had sort of translated that into, Nate Silver correctly predicts the results. Yeah, this is not hair splitting. There's a very important difference between the two. Yeah, yeah. This actually happens, uh, quite a lot with, like, him. And even, like, on the reverse side, everyone often says, like, everyone got it wrong, Trump won, everyone got it wrong. Nate didn't get it wrong. If you say, uh, someone has a chance of winning and they win, you weren't wrong. Like, that's how... that's how this works. Yeah, that's... that's how numbers work. Yeah. So Nate builds this blog, and as he did, he was again sort of taking the ideas that had first earned him success in the realm of baseball and applying them to politics.
There is a lot of logic to this system. Uh, here's New York Magazine, quote: In concocting five thirty eight, Silver decided the best way to read the polls was to put them all together, with the idea that averaging ten polls would give you a better result than trying to pick out the best one. Again, he wasn't the first person to do this. Other sites like Real Clear Politics and Pollster offered the same service. But as Silver told me, sometimes the answer is in looking at other alternatives that exist in the market and saying, they have the right idea, but they're not doing it the right way. So he came up with a system that predicts a pollster's future performance based on how good it's been in the past. In finding his average, Silver weighs each poll differently, ranking them according to his own statistic, PIE, pollster introduced error, based on a number of factors, including its track record and its methodology. One advantage of this system is that during the primaries, the system actually got smarter, because each time a poll performed well in the primary, its ranking improved. So that's what Nate does.
And, uh, you know, for years, Nate Silver was considered to be something very close to a prophet by many, in particular on sort of the liberal side of things. Uh, in the two thousand twelve election, five thirty eight was lauded for correctly predicting the results of all fifty states. It won a Webby Award in two thousand thirteen and won another, and Nate sold the site to ESPN and became its editor in chief. Or, I guess, the New York Times sold it. So, yeah. Five thirty eight relaunched as something more akin to a major news site in two thousand fourteen, covering a wide variety of stories with the perspective of data journalism. And then came the two thousand sixteen election. It would be a stretch to say this election was Nate's undoing, but it was a blow to his reputation. Like everyone else, five thirty eight predicted a Clinton win as most likely. However, that bird's eye view alone is not fair. Nate Silver actually took a huge amount of flak from the left for, you know, giving Donald Trump a thirty percent chance of victory.
On November five, two thousand sixteen, the Huffington Post's Ryan Grim published a very dumb article titled Nate Silver Is Unskewing Polls, All of Them, in Trump's Direction. Now, that unskewing line is a reference to a bit of political history some of you may remember. Back in two thousand twelve, a moron grew convinced that all of the polls were biased against Mitt Romney. Using nonsense math, he unskewed them to predict a massive victory for Mitt. This did not happen. So while outlets like HuffPo gave Trump like a one to two percent chance of victory, five thirty eight showed his campaign as having a serious fighting chance, and Nate Silver, to his credit, repeatedly outlined that Trump had a very viable route to an electoral college victory. I'm gonna quote from Ryan Grim's article now: The short version is that Silver is changing the results of polls to fit where he thinks the polls truly are, rather than simply entering the poll numbers into his model and crunching them. Silver calls this unskewing, thank you, a trend line adjustment.
He compares a poll to previous polls conducted by the same polling firm, makes a series of assumptions, runs a regression analysis, and gets a new poll number. That's the number he sticks in his model, not the original number. He may end up being right, but he's just guessing. So this guy basically said that, like, Trump benefited unfairly from the unskewing, and of course ate his words hardcore when Trump won. And like, you know, I think it's actually unfair that Silver took flak for, like, not giving Trump more of a chance of victory, because he really did repeatedly point out he could totally win. If the things that happened happened, there's a chance he will win. Yeah. Yeah, Silver gave him a solid chance of winning, and he won, and you gotta give the dude credit for that. Also, like, that he may end up right, but he's just guessing line... everyone's guessing. Yeah, everyone, homeboy. Yeah, none of you know. Yeah.
Now, I just gave Nate a lot of credit for his performance in the two thousand sixteen election, so it's also fair that I criticize him for the things that he was wrong about in that election. In two thousand fifteen, five thirty eight published this article under Nate Silver's byline: Donald Trump Is Winning the Polls and Losing the Nomination. Like, that is just guessing. Now, in that article, Nate predicted that Trump would lose the nomination, and his evidence as to why was largely not based on hard data science but on history. Quote: Twelve years ago, in two thousand three, Joe Lieberman led in most polls of the Democratic primary. Eight years ago, in August two thousand seven, Rudy Giuliani maintained a clear lead in polls of Republicans, while Hillary Clinton led polls of the Democratic nomination contest. Four years ago, in August two thousand eleven, Mitt Romney began with a lead in the polls of Republican voters, but he would be surpassed by the end of the month by Rick Perry, the first of four Republican rivals who would at some point overtake Romney in the national polling averages.
Lieberman, Clinton, Giuliani, Perry, as you probably gathered, are not the faces atop Mount Rushmore. Only Clinton came close to winning the nomination. Following that, Silver included one of his own Twitter posts in the article, which makes him look pretty dumb in retrospect. So this is a tweet from August nine, two thousand fifteen, by Nate Silver, and he's doing it as, like, a little script. Media: Trump's doing great! Nerds: No, those polls don't mean what you think. Media: A new poll shows Trump doing great! Proved you wrong. Yeah, he's... uh, yeah. How can you be that not self aware? Like, that's just parody. Also, it's interesting that he's done this sort of, if you look at history, early polls actually change, thing, but he doesn't... he doesn't do that for Joe Biden, does he? No, he does not. Interesting. Interesting. Very interesting. Interesting. It's September and he has not... has not made... Maybe he's learned his lesson. Yeah. Now, at this point, the problem with Nate Silver is pretty obvious. He's never gotten out of the habit of treating political predictions like sports.
In baseball, appealing to history makes sense, because there are thousands upon thousands of games and thousands upon thousands of players, and you have a lot of data with which to draw useful conclusions. American politics does not work that way. There are, comparatively speaking, relatively few congresspeople and elections in recorded history, and even fewer presidents. Outliers then matter a lot more, and making accurate predictions based on history is much harder. But even in that article, Nate fell directly back on his baseball history to help justify his opinion that Trump had no chance. Quote: So should you ignore those national polls entirely? In a literal sense, no. They do have some correlation with election outcomes, even this far out. A candidate near the top of the polls is a somewhat better bet to win the nomination than one near the bottom. But that's like projecting a major league pitcher's numbers from high school stats.
Sure, you'd rather draft a random seventeen year old with a 2.14 ERA than another one with a 3.31 ERA, if that's all the information you have to go by. But that data doesn't reveal very much, and its predictive power tends to be swamped by other indicators. So wild. Like, yeah, it's really dumb. Like, an actual... like, you can look at a player's, like, actual performance over their entire career and, like, judge them based off of that. How do you... how do you... how do you bring that logic to politics? I mean, it worked for a while. You gotta give him that, for a while. But, like, even the logic of it isn't right. Yeah, it's not, um... Yes, a lot of it is luck. And in that article, Nate stated that, quote, our emphatic prediction is simply that Trump will not win the nomination. It's not even clear that he's trying to do so. Um, so... I don't know that Trump was trying to actually win. That is a fair point. Now, I'm obviously far from the first person to criticize Nate for treating politics like baseball.
Among the community of people who understand statistics, he has quite a few detractors. One of them is Nassim Taleb. Nassim is one of a galaxy of thinkers who got famous thanks to Malcolm Gladwell. He's an investor with a unique philosophy on risk that I don't really understand, and a very successful history of making money via predicting bad economic shit and then capitalizing on it. I understand that. Nassim Taleb does not like Nate. Nate does not like Nassim. And starting in two thousand fifteen or so, the two started shit fighting on Twitter. Their disagreements are fairly nerdy, and I am not competent to get into the weeds of their deeper statistical slap fights or to say who's right. What's important is that you know that this was not a friendly rivalry, and Nassim took to commenting on Silver's methods on Twitter.
He regularly chimed in on comments made by other individuals criticizing five thirty eight. And then, in April of two thousand eighteen, this happened. So a guy named Joachim Marnet posts, like, a criticism of five thirty eight's now-cast, and Nassim Taleb comments on that: Yes, the now-cast came out late, and it's BS. And Nate Silver weighs in: Nassim, it's pathetic that for the last two years you favorited and retweeted hundreds of comments from random Twitter nobodies that take your side of the argument. Maybe you should save some of your dignity and debate me. To which Nassim responds: Silver, you're a quack. Do not engage me. Just broadcast to your followers, three million or three billion, doesn't make a difference. I write formal papers. To which Nate responds: LOL, I can't believe all it took was for someone to stand up to your bullying by trolling you back to turn you into such a cuck. He calls him a cuck. Nate Silver. Yeah. Classy.
I love it. And, ironically, calling someone a cuck like that... and, like, Nassim comes off as a bit of a dick in that too. Like, they both seem like unpleasant people I wouldn't want to have dinner with. They're having a moment online. I highly doubt that Nate actually laughed out loud. I love that his response... like, Nassim comes out with the incredibly arrogant... like, in any other argument, somebody, like, trying to argue that they're right by saying I write formal papers would be the person I sympathize with least in that argument. Nate follows it up calling him a cuck. And, end of argument. Oh, it's amazing. He just... no one looks good here. Yeah, it's incredible. So in December of that year, Isaac Faber wrote a really interesting breakdown of the spat between Nassim and Nate on the Medium blog Towards Data Science. It is a very wonky dissection of five thirty eight, and I will not pretend to understand every piece of the math that this guy brings up in it.
Faber does not call five thirty 982 00:53:09,239 --> 00:53:12,520 Speaker 1: eight useless and certainly seems to see value in Nate's analysis 983 00:53:12,560 --> 00:53:15,680 Speaker 1: of the polls, but he also criticizes Silver for accepting 984 00:53:15,719 --> 00:53:18,040 Speaker 1: praise for his not-a-prediction predictions of the two 985 00:53:18,080 --> 00:53:21,440 Speaker 1: thousand eight election results. Quote: he should not have accepted 986 00:53:21,480 --> 00:53:23,520 Speaker 1: the honor if he didn't call a winner in any 987 00:53:23,600 --> 00:53:26,040 Speaker 1: of the states. Which is a fair point. If he 988 00:53:26,080 --> 00:53:30,759 Speaker 1: didn't predict anything, don't. Faber points out that there are 989 00:53:30,800 --> 00:53:34,360 Speaker 1: two types of uncertainty in any prediction: aleatory and epistemic. 990 00:53:34,600 --> 00:53:37,440 Speaker 1: If aleatory uncertainty is the probability of, say, rolling a 991 00:53:37,520 --> 00:53:40,160 Speaker 1: six on a standard die, epistemic would have more to 992 00:53:40,239 --> 00:53:42,680 Speaker 1: do with the uncertainty involved in getting certain results in 993 00:53:42,760 --> 00:53:45,040 Speaker 1: a specific game, or with specific other types of dice 994 00:53:45,160 --> 00:53:46,600 Speaker 1: or whatever. Like, it has more to do with 995 00:53:46,680 --> 00:53:49,360 Speaker 1: the system. I'm going to quote from his write-up 996 00:53:49,400 --> 00:53:52,600 Speaker 1: now. Quote: Bespoke models like five thirty eight's only report 997 00:53:52,640 --> 00:53:56,280 Speaker 1: to the public aleatory uncertainty as it concerns their statistical outputs. 998 00:53:56,600 --> 00:53:59,880 Speaker 1: The trouble is that epistemic uncertainty is very difficult, sometimes 999 00:54:00,120 --> 00:54:03,000 Speaker 1: impossible, to estimate.
For example, why didn't five thirty eight's 1000 00:54:03,160 --> 00:54:05,919 Speaker 1: model incorporate, before it happened, a chance that Comey would 1001 00:54:05,920 --> 00:54:09,719 Speaker 1: reopen his investigation into the Clinton emails? Sports, like other 1002 00:54:09,840 --> 00:54:12,680 Speaker 1: games of chance, have very well defined mechanisms which lend 1003 00:54:12,719 --> 00:54:16,759 Speaker 1: themselves to statistical analysis. On the other hand, highly nonlinear 1004 00:54:16,840 --> 00:54:20,400 Speaker 1: events like contested elections may not. With fewer data points, 1005 00:54:20,480 --> 00:54:23,280 Speaker 1: you can see the variation of the Senate predictions is enormous. 1006 00:54:23,600 --> 00:54:26,080 Speaker 1: Gauging the performance of models on these types of events 1007 00:54:26,160 --> 00:54:29,080 Speaker 1: becomes doubly difficult. It isn't clear if a prediction is 1008 00:54:29,160 --> 00:54:31,600 Speaker 1: wrong owing to the quality of the model, epistemic, or 1009 00:54:31,719 --> 00:54:35,719 Speaker 1: just luck, aleatory. Because there is so much uncertainty around 1010 00:54:35,760 --> 00:54:39,200 Speaker 1: nonlinear events like an election, it could reasonably be considered 1011 00:54:39,239 --> 00:54:43,000 Speaker 1: frivolous to report early stage forecasts. The only conceivable reason 1012 00:54:43,080 --> 00:54:45,760 Speaker 1: to do so is to capture and monetize the interest 1013 00:54:45,840 --> 00:54:47,680 Speaker 1: of a public which is hungry to know the future. 1014 00:54:47,880 --> 00:54:50,000 Speaker 1: I will not go into the technical arguments. Taleb 1015 00:54:50,040 --> 00:54:52,000 Speaker 1: has written and published a paper on the key issues, 1016 00:54:52,040 --> 00:54:54,880 Speaker 1: with a solution.
Here, we can say with some confidence 1017 00:54:55,000 --> 00:54:58,520 Speaker 1: that five thirty eight predictions are not always reliable probabilities. 1018 00:55:00,320 --> 00:55:05,680 Speaker 1: Seems fair. Solid, solid math slap-down on Nate there. Yeah. Now, 1019 00:55:06,120 --> 00:55:08,160 Speaker 1: if the last several months are anything to judge by, 1020 00:55:08,360 --> 00:55:10,680 Speaker 1: Nate Silver still suffers from the problem of weighing his 1021 00:55:10,760 --> 00:55:12,880 Speaker 1: own biases too heavily in his predictions, even when the 1022 00:55:12,960 --> 00:55:15,680 Speaker 1: numbers don't back him up. Back in August two thousand nineteen, 1023 00:55:15,800 --> 00:55:19,000 Speaker 1: Nate tweeted about an analysis of campaign finance data published 1024 00:55:19,040 --> 00:55:21,759 Speaker 1: by the Washington Post. This analysis looked at the people 1025 00:55:21,800 --> 00:55:24,800 Speaker 1: who had donated to Democratic presidential campaigns and concluded that 1026 00:55:24,880 --> 00:55:27,760 Speaker 1: Sanders had the most loyal base of any campaign, because 1027 00:55:27,800 --> 00:55:30,279 Speaker 1: more than eighty percent of his donors gave just to him. 1028 00:55:30,760 --> 00:55:34,160 Speaker 1: Here's how Nate interpreted that data on Twitter: so basically, 1029 00:55:34,239 --> 00:55:38,480 Speaker 1: Bernie fans only like Bernie, and only Bernie fans like Bernie. Now, 1030 00:55:38,640 --> 00:55:40,400 Speaker 1: a number of people pointed out that this was not 1031 00:55:40,520 --> 00:55:42,600 Speaker 1: a fair analysis. I found a good article on the 1032 00:55:42,760 --> 00:55:45,520 Speaker 1: very biased Splinter news website that nonetheless hits at some 1033 00:55:45,600 --> 00:55:47,960 Speaker 1: of Nate's core issues. Quote.
First of all, it should 1034 00:55:48,000 --> 00:55:51,200 Speaker 1: not be surprising whatsoever that people who have donated 1035 00:55:51,239 --> 00:55:53,759 Speaker 1: to Sanders' anti-establishment campaign did not then turn around 1036 00:55:53,800 --> 00:55:56,640 Speaker 1: and give money to former Vice President Joe Biden. Sanders's 1037 00:55:56,680 --> 00:55:58,560 Speaker 1: base is loyal, and there are a few other candidates 1038 00:55:58,560 --> 00:56:00,839 Speaker 1: who truly share his goals, like Medicare for All. Plus, 1039 00:56:00,920 --> 00:56:02,920 Speaker 1: many of his donors have given in small amounts, so 1040 00:56:03,000 --> 00:56:04,440 Speaker 1: it makes sense to think that they may not have 1041 00:56:04,600 --> 00:56:07,600 Speaker 1: money to give elsewhere. But the chart and the story 1042 00:56:07,680 --> 00:56:10,880 Speaker 1: clearly showed, despite Silver's suggestion, that roughly twenty percent of Sanders 1043 00:56:10,880 --> 00:56:14,080 Speaker 1: supporters gave to other campaigns. Sanders and Senator Elizabeth Warren 1044 00:56:14,120 --> 00:56:17,560 Speaker 1: share sixty thousand donors. Sanders supporters also gave to Rep. Tulsi 1045 00:56:17,600 --> 00:56:21,000 Speaker 1: Gabbard in significant amounts. Most overlapping donors gave more money 1046 00:56:21,040 --> 00:56:24,040 Speaker 1: to Sanders. Now, a number of people also pointed out 1047 00:56:24,080 --> 00:56:26,600 Speaker 1: that while Sanders donors tended to be loyal, a number 1048 00:56:26,680 --> 00:56:30,120 Speaker 1: of donors to other Democratic candidates also donated to Sanders, 1049 00:56:30,160 --> 00:56:33,200 Speaker 1: which would suggest that supporters of other candidates don't hate Bernie, 1050 00:56:33,239 --> 00:56:35,920 Speaker 1: even if they do prefer Kamala Harris or Mayor Pete.
1051 00:56:36,360 --> 00:56:38,560 Speaker 1: More to the point is the fact that since Sanders 1052 00:56:38,600 --> 00:56:40,879 Speaker 1: has so many small dollar donors who only gave to him, 1053 00:56:41,080 --> 00:56:44,040 Speaker 1: this suggests something else very important that Nate missed entirely. 1054 00:56:44,560 --> 00:56:47,080 Speaker 1: Malaika Jabali, who writes for the Guardian, The Intercept, and 1055 00:56:47,120 --> 00:56:49,760 Speaker 1: The Root, pointed this out in a tweet. Quote: today, 1056 00:56:49,800 --> 00:56:51,960 Speaker 1: we learned that Sanders gets support from people who may 1057 00:56:52,040 --> 00:56:55,320 Speaker 1: not normally be drawn to Democratic candidates. Something tells me 1058 00:56:55,440 --> 00:56:59,560 Speaker 1: that might come in handy in November. Now, that's all 1059 00:56:59,560 --> 00:57:02,440 Speaker 1: pretty clear evidence that Nate, like everyone, lets his personal 1060 00:57:02,480 --> 00:57:04,840 Speaker 1: opinions cloud his analysis, even when he claims to just 1061 00:57:04,920 --> 00:57:07,520 Speaker 1: be looking at the numbers. Data can be interpreted in 1062 00:57:07,640 --> 00:57:10,080 Speaker 1: numerous ways, and it's clear to me that Silver has 1063 00:57:10,120 --> 00:57:13,799 Speaker 1: an anti-Bernie bias. His actual articles on five thirty eight are 1064 00:57:14,000 --> 00:57:16,880 Speaker 1: fairer about Sanders. In July he published an article about 1065 00:57:16,880 --> 00:57:19,280 Speaker 1: how Bernie was quote ahead of the pack on healthcare, 1066 00:57:19,400 --> 00:57:22,160 Speaker 1: and in February, one of his editors, Claire Malone, wrote 1067 00:57:22,200 --> 00:57:25,520 Speaker 1: an article about how Bernie could win the nomination.
But 1068 00:57:25,640 --> 00:57:28,560 Speaker 1: even in his supposedly unbiased coverage, Nate has a tendency 1069 00:57:28,640 --> 00:57:31,240 Speaker 1: to reveal himself, as best embodied by this June six, 1070 00:57:31,440 --> 00:57:35,240 Speaker 1: two thousand nineteen headline: Bernie Sanders has the highest floor, 1071 00:57:35,520 --> 00:57:40,120 Speaker 1: and it's pretty damn low. Okay. Where he's like pointing 1072 00:57:40,160 --> 00:57:42,720 Speaker 1: out that Sanders has like the highest floor of 1073 00:57:42,800 --> 00:57:45,040 Speaker 1: support of any of the Democratic candidates, but it's low, 1074 00:57:45,160 --> 00:57:48,919 Speaker 1: so it's not a big... come on, dude. Like, yeah, ah, 1075 00:57:49,040 --> 00:57:51,960 Speaker 1: he does that all the time. Yes. Sorry. We were 1076 00:57:52,000 --> 00:57:54,120 Speaker 1: supposed to write brief summaries of these people, and I 1077 00:57:54,160 --> 00:57:56,720 Speaker 1: wrote nine pages, and I apologize. But you got through 1078 00:57:56,760 --> 00:57:59,760 Speaker 1: that so fast it was hard to... I was biting 1079 00:57:59,800 --> 00:58:02,919 Speaker 1: my tongue so much because I knew we had limited time. 1080 00:58:03,200 --> 00:58:06,600 Speaker 1: I didn't want to interject. It was interesting. Yeah, that 1081 00:58:06,800 --> 00:58:08,360 Speaker 1: confirmed a lot of a lot of my already held 1082 00:58:08,400 --> 00:58:11,040 Speaker 1: opinions about Nate Silver. I mean, we could go through 1083 00:58:11,080 --> 00:58:12,480 Speaker 1: every one of his tweets and be like, well, here's 1084 00:58:12,480 --> 00:58:15,920 Speaker 1: how you're misrepresenting it. We're not going to. I will 1085 00:58:15,960 --> 00:58:18,560 Speaker 1: not be reading all of my responses to Nate Silver's tweets.
1086 00:58:18,960 --> 00:58:21,720 Speaker 1: He's got some singers in there, got some singers, there's 1087 00:58:21,720 --> 00:58:24,800 Speaker 1: some hey folks, there's some good tweets in there. All right, um, 1088 00:58:25,520 --> 00:58:30,280 Speaker 1: but boy, he uh yeah, just just take take take 1089 00:58:30,320 --> 00:58:34,600 Speaker 1: him with a grain of salt. Yeah, yeah, take everybody, 1090 00:58:36,320 --> 00:58:41,160 Speaker 1: everybody with a grain of salt. But again, particularly people 1091 00:58:41,200 --> 00:58:44,400 Speaker 1: who claim to have some sort of like that their 1092 00:58:44,400 --> 00:58:48,240 Speaker 1: analysis is based purely on the numbers, because there's always 1093 00:58:48,320 --> 00:58:52,920 Speaker 1: more to it than that. And and again, predictions like 1094 00:58:53,000 --> 00:58:54,760 Speaker 1: one of the most valuable things I think about that. 1095 00:58:54,920 --> 00:58:59,240 Speaker 1: Um that, Uh. The quote that I read from that 1096 00:58:59,400 --> 00:59:04,160 Speaker 1: like stick histical wank medium blog was the point that, um, you, 1097 00:59:04,640 --> 00:59:08,040 Speaker 1: when you're analyzing sports, there are rules to baseball. You 1098 00:59:08,160 --> 00:59:10,520 Speaker 1: know that a game, if baseball, is not going to 1099 00:59:10,680 --> 00:59:13,520 Speaker 1: be impacted by one of the players pulling out a 1100 00:59:13,720 --> 00:59:16,520 Speaker 1: gun and shooting the ball out of the air, because 1101 00:59:16,560 --> 00:59:18,880 Speaker 1: then that that that guy would be ejected and that 1102 00:59:18,920 --> 00:59:20,600 Speaker 1: would not be part of the game because it's not 1103 00:59:20,680 --> 00:59:22,520 Speaker 1: part of the rules of baseball. You can't shoot a 1104 00:59:22,560 --> 00:59:25,000 Speaker 1: ball out of the air with a gun. 
You can't 1105 00:59:25,080 --> 00:59:27,800 Speaker 1: do the same thing with politics, because all sorts of 1106 00:59:27,840 --> 00:59:30,640 Speaker 1: shit happens, and there really aren't rules. And so at 1107 00:59:30,720 --> 00:59:34,520 Speaker 1: any moment, the FBI director can reopen an investigation into 1108 00:59:34,560 --> 00:59:38,080 Speaker 1: a candidate's emails, or a candidate could pull his dick out, 1109 00:59:38,720 --> 00:59:43,040 Speaker 1: or like, like, anything. There are variables you cannot account for. 1110 00:59:43,720 --> 00:59:47,000 Speaker 1: The idea that like you can treat politics the way 1111 00:59:47,040 --> 00:59:53,200 Speaker 1: you treat sports is fundamentally asinine. Also dangerous, I think. Dangerous, right, 1112 00:59:53,280 --> 00:59:56,200 Speaker 1: it sort of like reinforces this sort of like the 1113 00:59:56,600 --> 00:59:59,720 Speaker 1: horse race sort of narrative, like how politics go. But also 1114 00:59:59,800 --> 01:00:04,440 Speaker 1: like, I mean, baseball, like, you can observe somebody's actual 1115 01:00:04,520 --> 01:00:07,640 Speaker 1: skill level and what they accomplish, and like translating that 1116 01:00:07,760 --> 01:00:11,200 Speaker 1: to what people think of a candidate... that's not how 1117 01:00:11,400 --> 01:00:15,120 Speaker 1: that's not how it is. It's completely different. Yeah. And 1118 01:00:15,200 --> 01:00:17,320 Speaker 1: there would be like, if we if if Donald Trump 1119 01:00:17,440 --> 01:00:20,120 Speaker 1: was the thousandth president, Nate would have more of a 1120 01:00:20,200 --> 01:00:22,120 Speaker 1: leg to stand on, because then there would be enough 1121 01:00:22,200 --> 01:00:24,440 Speaker 1: numbers that like it would kind of boil out some 1122 01:00:24,560 --> 01:00:27,120 Speaker 1: of like the weird fringe chances of things happening, just 1123 01:00:27,160 --> 01:00:29,840 Speaker 1: because there would be so much data. But there's not.
1124 01:00:30,520 --> 01:00:33,680 Speaker 1: You haven't even had fifty Yeah, I don't think he 1125 01:00:33,680 --> 01:00:40,320 Speaker 1: would uh that sample size? Yeah, got a ways to 1126 01:00:40,400 --> 01:00:44,280 Speaker 1: go before we had any Before we go, I wanted 1127 01:00:44,320 --> 01:00:47,560 Speaker 1: to give Washington Post its to do. Yeah. In regards 1128 01:00:47,720 --> 01:00:51,200 Speaker 1: to Bernie Sanders coverage, um and share with you this 1129 01:00:51,520 --> 01:00:54,479 Speaker 1: this headline from I want to you both to guess 1130 01:00:54,560 --> 01:00:58,520 Speaker 1: who you think wrote it. People came to see Bernie 1131 01:00:58,560 --> 01:01:00,880 Speaker 1: Sanders in Boston. Why aren't we talking more about it? 1132 01:01:01,800 --> 01:01:07,840 Speaker 1: Fair question? Who wrote it? Oh? God, I don't know, Chris. Yeah, 1133 01:01:08,920 --> 01:01:11,160 Speaker 1: I was wrong. Yeah, I didn't even know he wrote 1134 01:01:11,200 --> 01:01:14,640 Speaker 1: for the Washington Post. I didn't either. Unbelievable, good take Chris, 1135 01:01:14,760 --> 01:01:18,520 Speaker 1: you're one good take. Well this has been great. Yeah, 1136 01:01:18,560 --> 01:01:20,880 Speaker 1: we know what to read and listen to and what 1137 01:01:21,200 --> 01:01:26,560 Speaker 1: you guys still clear on where to get your information? Yes, 1138 01:01:26,960 --> 01:01:30,880 Speaker 1: Breitbart Right dot com. Yeah, yeah, that's where we landed. Good. 1139 01:01:33,960 --> 01:01:42,360 Speaker 1: DA's great. The Daily Stormer some really hot takes. Yeah, 1140 01:01:42,800 --> 01:01:47,520 Speaker 1: Daily Mail, all the Daily The Daily Mail is when 1141 01:01:47,560 --> 01:01:50,480 Speaker 1: The Daily Mail reposted the entirety of a mass shooters 1142 01:01:50,560 --> 01:01:55,400 Speaker 1: manifesto uncritically, just like plastered it on. I didn't know 1143 01:01:55,480 --> 01:01:57,760 Speaker 1: they did that. What the hell? 
Yeah, they did that 1144 01:01:57,920 --> 01:02:02,080 Speaker 1: because they're responsible journalists, Cody. Oh, that's why. That's what 1145 01:02:02,320 --> 01:02:06,240 Speaker 1: that looks like. That's why. Yes. So stay tuned for 1146 01:02:06,360 --> 01:02:09,520 Speaker 1: the next episode of this podcast, wherein I read Anders 1147 01:02:09,560 --> 01:02:15,880 Speaker 1: Breivik's manifesto with no commentary. Um, you mean you're gonna 1148 01:02:15,880 --> 01:02:21,240 Speaker 1: read Twelve Rules for Life. God. Not to say that 1149 01:02:21,320 --> 01:02:24,400 Speaker 1: they're the same, but um, I feel like I feel 1150 01:02:24,440 --> 01:02:27,320 Speaker 1: like maybe Jordan Peterson read a little little Anders and 1151 01:02:27,480 --> 01:02:29,280 Speaker 1: was like, oh man, he's kind of right about a 1152 01:02:29,280 --> 01:02:32,760 Speaker 1: lot of stuff. He had that decision we all have, 1153 01:02:32,920 --> 01:02:36,200 Speaker 1: where you have to choose between being a rich political 1154 01:02:36,280 --> 01:02:39,440 Speaker 1: personality who gets paid to lecture at colleges and shooting 1155 01:02:39,520 --> 01:02:46,200 Speaker 1: dozens of children. Uh, and thankfully he chose... for him. 1156 01:02:46,440 --> 01:02:48,520 Speaker 1: I feel like the root of the problem is still there, 1157 01:02:48,720 --> 01:02:52,840 Speaker 1: but yeah, how the tree grew um is a better... Yes, 1158 01:02:52,960 --> 01:02:56,440 Speaker 1: absolutely, better tree. So stay tuned for more of the 1159 01:02:56,520 --> 01:03:00,280 Speaker 1: Worst Year Ever. Yeah. Also stay tuned for the literal 1160 01:03:00,400 --> 01:03:03,320 Speaker 1: Worst Year Ever. And yeah, the actual year and the 1161 01:03:03,320 --> 01:03:07,400 Speaker 1: show about the year. We'll be ratcheting this up yeah later. Yeah, 1162 01:03:07,480 --> 01:03:09,320 Speaker 1: not every episode.
It's going to be us talking about 1163 01:03:09,600 --> 01:03:12,760 Speaker 1: media sources. It's gonna be talking about a lot of 1164 01:03:12,800 --> 01:03:14,960 Speaker 1: other stuff. But I feel good about this. I think 1165 01:03:15,000 --> 01:03:17,640 Speaker 1: this is a great place for us to start. Primarily, 1166 01:03:18,880 --> 01:03:24,560 Speaker 1: trust only us. That's the takeaway that this two-parter 1167 01:03:24,920 --> 01:03:29,480 Speaker 1: is to give you: only us. And uh, you know, 1168 01:03:29,800 --> 01:03:33,320 Speaker 1: that that way you'll just kind of uncritically accept it when 1169 01:03:33,400 --> 01:03:38,320 Speaker 1: we start pushing our secret dark horse candidate, Theodore Kaczynski. 1170 01:03:39,920 --> 01:03:41,760 Speaker 1: I thought you were going to go with Vermin Supreme, 1171 01:03:41,920 --> 01:03:47,360 Speaker 1: but you went with... Okay, okay, okay, um. Real quick 1172 01:03:47,360 --> 01:03:49,360 Speaker 1: before you go, make sure to check out our website, 1173 01:03:50,360 --> 01:03:56,760 Speaker 1: www dot worst year pod dot com, and on both 1174 01:03:56,800 --> 01:04:02,240 Speaker 1: Instagram and Twitter, we're at Worst Year Pod. Simple, ease 1175 01:04:02,360 --> 01:04:04,959 Speaker 1: to remember. We all just followed it. I said ease 1176 01:04:05,240 --> 01:04:10,840 Speaker 1: because it was easier than saying easy. Saying... I'm really 1177 01:04:10,960 --> 01:04:14,960 Speaker 1: good at ending podcasts. So like and subscribe, and in 1178 01:04:15,040 --> 01:04:18,840 Speaker 1: the meantime, maybe stockpile some dried food, maybe buy a 1179 01:04:18,880 --> 01:04:21,120 Speaker 1: machete and some bolt cutters. Maybe pick up, you know, 1180 01:04:21,200 --> 01:04:24,960 Speaker 1: a couple of months extra of your prescription, whatever that 1181 01:04:25,040 --> 01:04:27,680 Speaker 1: happens to be. Maybe hoard insulin, maybe buy, um, oh, 1182 01:04:27,760 --> 01:04:33,760 Speaker 1: whatever sounds good.
Order all of that from Amazon dot com. 1183 01:04:35,520 --> 01:04:38,360 Speaker 1: Choose one-day delivery to put lots of people's lives 1184 01:04:38,400 --> 01:04:41,800 Speaker 1: at risk. Right, isn't that a thing? And stay tuned 1185 01:04:41,840 --> 01:04:44,200 Speaker 1: for our next episode on why Medicare for All just 1186 01:04:44,480 --> 01:04:50,560 Speaker 1: isn't very practical. Hard agree, I think. I think good 1187 01:04:50,640 --> 01:04:57,080 Speaker 1: things are impossible. We have a new tagline: good things 1188 01:04:57,120 --> 01:05:03,480 Speaker 1: are impossible. What was the one... failing... just like, yeah, 1189 01:05:03,680 --> 01:05:07,120 Speaker 1: from the last episode? Failing spectacularly, something like that. 1190 01:05:08,680 --> 01:05:14,000 Speaker 1: I guess. Sure. Guys, we did it, don't you know? What? 1191 01:05:14,640 --> 01:05:24,480 Speaker 1: Goodbye, everything, everything's so dumb. It's... got it again. I 1192 01:05:24,600 --> 01:05:30,480 Speaker 1: tried. Daniel? Lovely. Worst Year Ever is a production of 1193 01:05:30,520 --> 01:05:33,280 Speaker 1: I Heart Radio. For more podcasts from I Heart Radio, 1194 01:05:33,400 --> 01:05:36,320 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 1195 01:05:36,400 --> 01:05:37,720 Speaker 1: you listen to your favorite shows.