1 00:00:06,440 --> 00:00:09,800 Speaker 1: Hi, and welcome back to the Karol Markowicz Show on iHeartRadio. 2 00:00:10,160 --> 00:00:11,920 Speaker 2: My guest today is Jacob Siegel. 3 00:00:12,320 --> 00:00:14,880 Speaker 1: Jacob is a columnist for Tablet and the author of 4 00:00:14,920 --> 00:00:18,000 Speaker 1: the new book The Information State: Politics and the Age 5 00:00:18,040 --> 00:00:21,640 Speaker 1: of Total Control, published by Henry Holt, out now anywhere 6 00:00:21,680 --> 00:00:24,920 Speaker 1: you buy your books. He also co-hosts the Manifesto 7 00:00:25,000 --> 00:00:28,440 Speaker 1: podcast with the novelist Phil Klay. Hi, Jacob, so nice 8 00:00:28,480 --> 00:00:28,920 Speaker 1: to have you on. 9 00:00:29,720 --> 00:00:29,920 Speaker 3: Hi. 10 00:00:31,640 --> 00:00:33,560 Speaker 2: So I had your brother, Harry 11 00:00:33,280 --> 00:00:36,800 Speaker 1: Siegel, on just a month or two ago, and I 12 00:00:36,880 --> 00:00:40,280 Speaker 1: told him he has the very best New York accent, 13 00:00:40,400 --> 00:00:42,839 Speaker 1: because it feels so familiar to me, and it's like 14 00:00:43,320 --> 00:00:47,360 Speaker 1: my part of South Brooklyn accent. And you sound exactly 15 00:00:47,440 --> 00:00:49,760 Speaker 1: like him. 16 00:00:49,080 --> 00:00:52,839 Speaker 3: You know, personally, I like that Marine Park, like the 17 00:00:52,960 --> 00:00:54,200 Speaker 3: Irish accent. 18 00:00:54,640 --> 00:00:57,480 Speaker 2: Yeah, that's your favorite. Okay, how does that sound? 19 00:00:57,600 --> 00:01:02,480 Speaker 3: Really, it's like a higher quality to it, almost like 20 00:01:03,160 --> 00:01:05,760 Speaker 3: old James Cagney gangster pictures. 21 00:01:06,319 --> 00:01:10,600 Speaker 1: Right, you guys sound like annoyed Jews, which is what 22 00:01:10,680 --> 00:01:11,720 Speaker 1: I would be going for. 23 00:01:13,120 --> 00:01:15,880 Speaker 3: Okay, annoyed Jews, not angry.
24 00:01:16,120 --> 00:01:18,120 Speaker 2: Not angry. No, no, you're not angry. 25 00:01:18,120 --> 00:01:20,640 Speaker 1: You guys are so happy. You're like, you know, completely 26 00:01:20,680 --> 00:01:24,679 Speaker 1: happy warriors. But you know, you didn't get your 27 00:01:24,760 --> 00:01:27,720 Speaker 1: gefilte fish the right way this morning, and I'll tell you. 28 00:01:28,800 --> 00:01:31,280 Speaker 1: So tell me about your new book. What is the 29 00:01:31,319 --> 00:01:32,280 Speaker 1: Information State? 30 00:01:33,880 --> 00:01:38,000 Speaker 3: So I started writing this book really through this long 31 00:01:38,160 --> 00:01:41,039 Speaker 3: essay I wrote for Tablet Magazine called A Guide to 32 00:01:41,120 --> 00:01:44,240 Speaker 3: Understanding the Hoax of the Century: Thirteen Ways of Looking 33 00:01:44,319 --> 00:01:48,400 Speaker 3: at Disinformation, which was my attempt to understand how this 34 00:01:48,640 --> 00:01:55,600 Speaker 3: relatively unknown phenomenon, disinformation, had gone from being a kind 35 00:01:55,600 --> 00:01:59,320 Speaker 3: of relic of the Cold War, spoken about in the 36 00:01:59,360 --> 00:02:03,120 Speaker 3: context of spycraft and in the context of the Soviet 37 00:02:03,240 --> 00:02:08,600 Speaker 3: Union, which no longer existed, to suddenly becoming totally ubiquitous 38 00:02:08,800 --> 00:02:13,160 Speaker 3: in the American political landscape, coincident with the rise of 39 00:02:13,200 --> 00:02:17,120 Speaker 3: Donald Trump in twenty fifteen.
In twenty sixteen, 40 00:02:17,280 --> 00:02:19,680 Speaker 3: it seemed evident to me as I started looking at this, 41 00:02:20,400 --> 00:02:22,880 Speaker 3: not only that there was a lot of, let's say, 42 00:02:23,200 --> 00:02:28,040 Speaker 3: political malfeasance and kind of partisan exploitation of this concept 43 00:02:28,160 --> 00:02:33,200 Speaker 3: to serve what were just nakedly political ends, right, to 44 00:02:33,240 --> 00:02:36,680 Speaker 3: sort of discredit Trump. And whatever you thought of Trump, 45 00:02:36,960 --> 00:02:39,760 Speaker 3: it was obvious that there were these sort of inflated 46 00:02:39,840 --> 00:02:44,560 Speaker 3: charges about the power of disinformation that were being used 47 00:02:44,600 --> 00:02:49,280 Speaker 3: to sort of paint all Trump supporters as if they 48 00:02:49,320 --> 00:02:55,360 Speaker 3: were either unwitting dupes of Kremlin propaganda, or perhaps they 49 00:02:55,400 --> 00:02:58,320 Speaker 3: were even, you know, something more malevolent, like, you know, 50 00:02:58,480 --> 00:03:03,520 Speaker 3: actual foreign agents doing active measures. And then when I 51 00:03:03,560 --> 00:03:09,400 Speaker 3: looked even deeper and sort of applied a more critical lens, 52 00:03:09,480 --> 00:03:13,320 Speaker 3: what I realized was that this phenomenon that I was seeing, 53 00:03:13,560 --> 00:03:19,280 Speaker 3: disinformation, or the war against disinformation specifically, had all of 54 00:03:19,320 --> 00:03:23,239 Speaker 3: these resonances with experiences I had had as a US 55 00:03:23,400 --> 00:03:28,920 Speaker 3: Army officer and an intelligence officer in Afghanistan specifically.
I 56 00:03:28,960 --> 00:03:31,320 Speaker 3: was in Iraq also, earlier, in two thousand and six, 57 00:03:31,360 --> 00:03:34,720 Speaker 3: two thousand and seven, but I had a different, more 58 00:03:35,040 --> 00:03:38,880 Speaker 3: kinetic role in Iraq. And so I started to recognize 59 00:03:39,640 --> 00:03:45,280 Speaker 3: these kinds of tactics, strategies, even, at a deeper level, 60 00:03:45,800 --> 00:03:50,800 Speaker 3: philosophies of information warfare and information control creeping into the 61 00:03:50,840 --> 00:03:55,200 Speaker 3: American political system. And finally, to wrap it up, this 62 00:03:55,360 --> 00:03:58,680 Speaker 3: brought me to this idea of the information state as 63 00:03:58,720 --> 00:04:04,720 Speaker 3: a distinctive form of political regime, which attempts to replace 64 00:04:05,480 --> 00:04:12,560 Speaker 3: the earlier structures of constitutional representative democracy based on the 65 00:04:12,600 --> 00:04:16,920 Speaker 3: procedures of law, the sovereignty of individual voters, the power 66 00:04:16,960 --> 00:04:20,839 Speaker 3: of democratic institutions. It attempts to sweep all of these 67 00:04:20,880 --> 00:04:26,320 Speaker 3: away and replace them with a form of mass information control, 68 00:04:26,600 --> 00:04:32,400 Speaker 3: whereby the powers that control the algorithms that structure our 69 00:04:32,480 --> 00:04:37,760 Speaker 3: perceptions of reality have the ability to engineer political outcomes 70 00:04:37,800 --> 00:04:41,680 Speaker 3: and social outcomes, for that matter. And so power moves 71 00:04:41,800 --> 00:04:46,640 Speaker 3: away from individual voters and their votes, let's say.
72 00:04:47,080 --> 00:04:52,679 Speaker 4: And it moves up into these obscure, somewhat mysterious kinds 73 00:04:52,720 --> 00:04:58,360 Speaker 4: of algorithmic and informational control, where the ability to determine 74 00:04:58,839 --> 00:05:03,480 Speaker 4: what information somebody sees and what information they don't see 75 00:05:04,279 --> 00:05:07,400 Speaker 4: is how you shape the vote that they will cast. 76 00:05:08,960 --> 00:05:12,560 Speaker 4: So that, in the kind of grand sense, is the 77 00:05:12,560 --> 00:05:15,520 Speaker 4: information state. But the more deeply I looked at it, also, 78 00:05:15,520 --> 00:05:19,400 Speaker 4: the more I discovered that this was something that's been 79 00:05:19,440 --> 00:05:23,600 Speaker 4: built really over centuries, in the sense that these ideas 80 00:05:23,800 --> 00:05:28,760 Speaker 4: date back. Ideas about information control as a sort of 81 00:05:29,800 --> 00:05:33,960 Speaker 4: technique for mastering the physical universe date back to the 82 00:05:34,000 --> 00:05:38,200 Speaker 4: Scientific Revolution, and then show up in American politics really 83 00:05:38,240 --> 00:05:42,440 Speaker 4: with the progressive and technocratic movements, and then really in 84 00:05:42,560 --> 00:05:46,240 Speaker 4: full force with the administration of Woodrow Wilson in the 85 00:05:46,240 --> 00:05:47,120 Speaker 4: First World War. 86 00:05:48,200 --> 00:05:52,839 Speaker 1: Were you surprised at kind of more sophisticated people falling 87 00:05:53,120 --> 00:05:57,520 Speaker 1: for various hoaxes, who you wouldn't imagine would fall for 88 00:05:57,560 --> 00:05:58,320 Speaker 1: those hoaxes? 89 00:06:00,240 --> 00:06:02,479 Speaker 3: I mean, I think the only honest answer to that 90 00:06:02,640 --> 00:06:05,479 Speaker 3: is to say that I was only surprised after I 91 00:06:05,520 --> 00:06:12,559 Speaker 3: stopped falling for them.
Initially, in twenty sixteen, let's say, 92 00:06:13,560 --> 00:06:18,760 Speaker 3: I remember reading the accounts of Trump's ties 93 00:06:18,760 --> 00:06:19,440 Speaker 5: to Putin and 94 00:06:21,400 --> 00:06:26,720 Speaker 3: hearing the reports coming from politicians like Adam Schiff, who 95 00:06:26,760 --> 00:06:31,000 Speaker 3: I didn't trust necessarily, but who I assumed were 96 00:06:31,040 --> 00:06:36,080 Speaker 3: working within the bounds of certain constraints of 97 00:06:36,120 --> 00:06:40,279 Speaker 3: the American political system, that you couldn't simply invent a 98 00:06:40,360 --> 00:06:42,040 Speaker 3: conspiracy about 99 00:06:41,760 --> 00:06:48,120 Speaker 1: the president being a Russian agent, right. Yeah. 100 00:06:47,120 --> 00:06:52,600 Speaker 3: You could exaggerate, but where there's smoke there's fire. And then 101 00:06:52,640 --> 00:06:55,120 Speaker 3: at a certain point, after, I don't know what it was, 102 00:06:55,160 --> 00:06:59,120 Speaker 3: the sixth or seventh time that I heard Schiff promise 103 00:06:59,200 --> 00:07:02,760 Speaker 3: an imminent smoking gun revelation that was going to wrap 104 00:07:02,800 --> 00:07:06,080 Speaker 3: all this up and establish it definitively, I saw 105 00:07:06,120 --> 00:07:08,840 Speaker 3: that not only was this not delivered, not only did 106 00:07:08,839 --> 00:07:13,960 Speaker 3: this smoking gun never materialize, but for the MSNBC hosts, 107 00:07:14,040 --> 00:07:18,080 Speaker 3: for the CNN hosts, the New York Times, the Washington Post, it 108 00:07:18,120 --> 00:07:24,280 Speaker 3: didn't seem to in any way disturb their sense of 109 00:07:24,400 --> 00:07:27,040 Speaker 3: Schiff's credibility. They just reported the same story 110 00:07:26,800 --> 00:07:27,320 Speaker 5: over and over.
111 00:07:27,920 --> 00:07:32,360 Speaker 6: So it was only when I, you know, sort of 112 00:07:32,440 --> 00:07:35,480 Speaker 6: woke up to that, which probably took a 113 00:07:35,560 --> 00:07:37,600 Speaker 6: year or so. I mean, I think it was twenty 114 00:07:37,720 --> 00:07:43,560 Speaker 6: seventeen before I really fully realized, okay, there's nothing 115 00:07:43,680 --> 00:07:46,480 Speaker 6: here and nothing of what he's promising is here. 116 00:07:47,440 --> 00:07:51,160 Speaker 3: Then I could be surprised that other people, by whatever 117 00:07:51,200 --> 00:07:54,800 Speaker 3: it was, twenty twenty, twenty twenty three, were still so 118 00:07:54,920 --> 00:07:55,680 Speaker 3: wrapped up in this. 119 00:07:56,520 --> 00:07:59,560 Speaker 1: Yeah, it was interesting, because I was not a Trump 120 00:07:59,560 --> 00:08:01,280 Speaker 1: fan at all in twenty sixteen. 121 00:08:01,360 --> 00:08:02,840 Speaker 2: I was a third party voter. 122 00:08:03,040 --> 00:08:06,240 Speaker 1: I just, I just didn't like the guy. And I 123 00:08:06,280 --> 00:08:09,200 Speaker 1: still thought the people who were posting that he was, 124 00:08:09,280 --> 00:08:12,480 Speaker 1: you know, basically a Russian spy, right, or compromised by Russia, 125 00:08:12,800 --> 00:08:15,120 Speaker 1: I thought that was so far out. It was like, 126 00:08:15,680 --> 00:08:18,960 Speaker 1: it was like calling somebody a Nazi. It had stopped mattering. 127 00:08:19,240 --> 00:08:22,280 Speaker 1: It no longer registered, because it was such a crazy 128 00:08:22,320 --> 00:08:25,400 Speaker 1: thing to say that I couldn't take it seriously.
And 129 00:08:25,920 --> 00:08:27,880 Speaker 1: I was surprised that kind of people who I thought 130 00:08:27,960 --> 00:08:34,160 Speaker 1: were more sophisticated purveyors of news and information would fall 131 00:08:34,200 --> 00:08:36,960 Speaker 1: for it. And it does make me afraid that we 132 00:08:37,000 --> 00:08:40,959 Speaker 1: would fall for other things, that en masse we could 133 00:08:41,000 --> 00:08:45,960 Speaker 1: be lied to often, and we probably are. Are you 134 00:08:46,280 --> 00:08:49,640 Speaker 1: worried at all about the fact that if people could 135 00:08:49,640 --> 00:08:52,120 Speaker 1: buy that, they could buy a whole bunch of other 136 00:08:52,160 --> 00:08:54,839 Speaker 1: things too? 137 00:08:54,800 --> 00:08:58,480 Speaker 3: Well, first of all, you know, you're a lot wiser and more astute 138 00:08:58,679 --> 00:09:00,960 Speaker 3: to have really caught 139 00:09:00,720 --> 00:09:04,920 Speaker 6: that, because I think you were probably in the minority 140 00:09:05,280 --> 00:09:06,160 Speaker 6: at that point. 141 00:09:06,280 --> 00:09:09,000 Speaker 3: But am I worried about it? I'm so worried I 142 00:09:09,000 --> 00:09:09,920 Speaker 3: wrote a whole book about it. 143 00:09:10,200 --> 00:09:14,640 Speaker 6: Yeah, yeah. Not only am I worried about it, 144 00:09:15,320 --> 00:09:18,560 Speaker 6: but when you peel back the layers a bit, you 145 00:09:18,679 --> 00:09:24,640 Speaker 6: realize historically that the premise that the masses
146 00:09:24,360 --> 00:09:30,520 Speaker 3: can have their public opinion shaped, that public opinion is a 147 00:09:30,720 --> 00:09:35,560 Speaker 3: sculptable resource, is more than a century old, and there 148 00:09:35,600 --> 00:09:41,439 Speaker 3: has been a continuous refinement of techniques applied towards sculpting 149 00:09:41,520 --> 00:09:44,600 Speaker 3: public opinion for more than one hundred years, not only 150 00:09:44,640 --> 00:09:47,720 Speaker 3: in the United States but across the world. This 151 00:09:47,920 --> 00:09:53,839 Speaker 3: technology of sculpting public opinion on a mass scale has 152 00:09:53,880 --> 00:09:58,760 Speaker 3: a number of identifiable features. One, it always accelerates dramatically 153 00:09:58,800 --> 00:10:02,480 Speaker 3: in times of war, and so if you can manufacture 154 00:10:02,520 --> 00:10:05,480 Speaker 3: a kind of false pretext of war, like an invasion 155 00:10:05,520 --> 00:10:08,959 Speaker 3: of Russian trolls, for instance, you can use that as 156 00:10:08,960 --> 00:10:13,839 Speaker 3: a way of accelerating this process of mass opinion formation. Secondarily, 157 00:10:14,720 --> 00:10:18,960 Speaker 3: it never manages to achieve the thing that it's supposed 158 00:10:19,000 --> 00:10:23,160 Speaker 3: to achieve. So human beings turn out to be remarkably 159 00:10:23,400 --> 00:10:30,040 Speaker 3: difficult to enslave en masse through techniques of mind control, 160 00:10:30,360 --> 00:10:35,240 Speaker 3: no matter how sophisticated the technologies. It doesn't matter if 161 00:10:35,240 --> 00:10:38,120 Speaker 3: you're talking about the radio or AI. 162 00:10:39,200 --> 00:10:40,960 Speaker 5: The human being remains a 163 00:10:41,040 --> 00:10:44,880 Speaker 3: very stubborn subject to control in this way.
It doesn't 164 00:10:44,920 --> 00:10:50,400 Speaker 3: mean that you can't create massively destructive effects in a 165 00:10:50,480 --> 00:10:54,120 Speaker 3: society through the effort. It doesn't mean that you can't 166 00:10:54,120 --> 00:10:56,400 Speaker 3: even convince some people. But I think what tends to 167 00:10:56,480 --> 00:11:03,120 Speaker 3: happen is this kind of ongoing onslaught of information warfare 168 00:11:03,559 --> 00:11:08,360 Speaker 3: drives people crazy, makes them vulnerable to all kinds of nonsense, 169 00:11:09,040 --> 00:11:14,600 Speaker 3: but rarely produces the sort of neat, predictable waves of 170 00:11:15,800 --> 00:11:18,959 Speaker 3: correct opinion that it's supposed to accomplish. 171 00:11:19,000 --> 00:11:20,960 Speaker 1: We're going to take a quick break and be right 172 00:11:21,000 --> 00:11:22,520 Speaker 1: back on the Karol Markowicz Show. 173 00:11:25,600 --> 00:11:28,079 Speaker 2: Where does the AI revolution fall into this? 174 00:11:28,240 --> 00:11:31,040 Speaker 1: Because, like, so for example, Adam Schiff, you know, says, 175 00:11:31,080 --> 00:11:33,680 Speaker 1: you know, smoking gun any minute now, and smoking 176 00:11:33,679 --> 00:11:34,280 Speaker 1: gun any 177 00:11:34,120 --> 00:11:36,000 Speaker 2: minute now, and it never happens. 178 00:11:36,080 --> 00:11:38,840 Speaker 1: And then, you know, people, I assume, look at 179 00:11:38,840 --> 00:11:40,800 Speaker 1: Adam Schiff and say, I'm not going to believe what 180 00:11:40,840 --> 00:11:44,080 Speaker 1: he says anymore. And then you can have an AI 181 00:11:45,040 --> 00:11:48,720 Speaker 1: person say it. You can have somebody who doesn't exist 182 00:11:48,920 --> 00:11:52,360 Speaker 1: lie to people on a large scale and, you know, 183 00:11:52,440 --> 00:11:57,199 Speaker 1: make her very attractive and show her in bikinis telling 184 00:11:57,240 --> 00:11:57,880 Speaker 1: you stories.
185 00:11:59,320 --> 00:12:00,480 Speaker 2: People could fall for that. 186 00:12:00,400 --> 00:12:02,400 Speaker 1: And then there is nobody to be held accountable. There's 187 00:12:02,400 --> 00:12:05,160 Speaker 1: nobody to say, okay, I'm not going to believe you anymore. 188 00:12:05,559 --> 00:12:12,600 Speaker 3: Ultimately, in these cases, there's a political system to fall 189 00:12:12,640 --> 00:12:16,640 Speaker 3: back on, where power is supposed to reside in some sense, even 190 00:12:16,720 --> 00:12:22,360 Speaker 3: as AI technologies make this question of ultimate sovereignty 191 00:12:22,559 --> 00:12:27,160 Speaker 3: more and more opaque and mysterious. How powerful is 192 00:12:27,480 --> 00:12:31,080 Speaker 3: Elon Musk within the American political system at this point? 193 00:12:31,240 --> 00:12:36,079 Speaker 3: How powerful are the people determining X's algorithms, or Google's 194 00:12:36,160 --> 00:12:38,960 Speaker 3: for that matter? No one ever talks about Google, because 195 00:12:39,000 --> 00:12:45,160 Speaker 3: Google is, very smartly, you know, being much more circumspect 196 00:12:45,200 --> 00:12:50,240 Speaker 3: about the ways in which ideological bias and, you know, 197 00:12:50,480 --> 00:12:54,720 Speaker 3: other even more subtle forms of kind of political control 198 00:12:54,760 --> 00:12:59,000 Speaker 3: are just baked into the operating structure of Google. But 199 00:12:59,280 --> 00:13:04,760 Speaker 3: we can't, in a representative constitutional system like the United States, 200 00:13:04,880 --> 00:13:07,880 Speaker 3: affect the composition of Google. 201 00:13:08,640 --> 00:13:12,959 Speaker 5: There's still a way to vote and
202 00:13:12,920 --> 00:13:18,079 Speaker 3: to exercise sovereignty through the political system, which is ultimately 203 00:13:18,640 --> 00:13:22,320 Speaker 3: what I'm saying is, like, the only way to affect this. 204 00:13:23,480 --> 00:13:29,040 Speaker 3: And if it's not done that way, through the political system, 205 00:13:30,200 --> 00:13:33,640 Speaker 3: then I think what you're gonna witness is just the 206 00:13:33,880 --> 00:13:40,080 Speaker 3: gradual leaching away of the meaning of rights like voting 207 00:13:40,320 --> 00:13:49,000 Speaker 3: until they're essentially worthless, because the technology, AI, is going 208 00:13:49,040 --> 00:13:53,160 Speaker 3: to just slowly... It's like money being transferred out of 209 00:13:53,160 --> 00:13:55,560 Speaker 3: a bank account a penny at a time, right. Like, 210 00:13:55,679 --> 00:14:02,240 Speaker 3: political freedom, sovereignty, is being leached away out of individual 211 00:14:02,360 --> 00:14:08,400 Speaker 3: citizens and the organizations, the specific associations, that they form. It's 212 00:14:08,440 --> 00:14:11,920 Speaker 3: being transferred out of those and into these sort of 213 00:14:12,480 --> 00:14:18,400 Speaker 3: mass state tech configurations at a very rapid pace. 214 00:14:19,640 --> 00:14:23,600 Speaker 1: Is The Information State a pessimistic book? Like, do you 215 00:14:23,760 --> 00:14:25,520 Speaker 1: think this can be improved upon? 216 00:14:25,840 --> 00:14:27,440 Speaker 2: Or are we really in trouble? 217 00:14:28,880 --> 00:14:30,240 Speaker 5: We're really in trouble. 218 00:14:29,920 --> 00:14:32,360 Speaker 3: But I'm not a pessimist. Or, at a very, 219 00:14:32,520 --> 00:14:37,440 Speaker 3: very deep level, I'm not a pessimist.
And, you know, 220 00:14:37,480 --> 00:14:41,920 Speaker 3: I look at an organization like Chabad and their attitude 221 00:14:42,000 --> 00:14:48,880 Speaker 3: toward technology, which is that technology is part of the 222 00:14:48,880 --> 00:14:51,920 Speaker 3: process of the redemption of the entire world, that there's 223 00:14:51,960 --> 00:14:53,840 Speaker 3: a reason why the Internet arrived. 224 00:14:55,080 --> 00:14:56,280 Speaker 5: And yet even 225 00:14:56,040 --> 00:15:01,240 Speaker 3: though they associate the developments in the technological sphere 226 00:15:01,760 --> 00:15:09,360 Speaker 3: with the coming of Moshiach, they still exercise incredibly smart, in 227 00:15:09,440 --> 00:15:13,880 Speaker 3: my opinion, and deliberate controls about how they interact with 228 00:15:13,880 --> 00:15:17,200 Speaker 3: those technologies. So I would say it's not a pessimistic 229 00:15:17,200 --> 00:15:21,720 Speaker 3: book, because I'm not a pessimist. I'm, I don't know, 230 00:15:22,080 --> 00:15:24,560 Speaker 3: I'm not sure optimist would be the right word, 231 00:15:24,600 --> 00:15:26,800 Speaker 3: but I'm a believer at 232 00:15:26,800 --> 00:15:31,960 Speaker 3: a very deep level. However, we're in an extraordinarily dangerous 233 00:15:32,000 --> 00:15:38,040 Speaker 3: moment for anyone who cares about matters like human liberty 234 00:15:38,680 --> 00:15:42,320 Speaker 3: or, for that matter, the American constitutional system. 235 00:15:42,640 --> 00:15:46,160 Speaker 1: A question I ask all of my guests is what 236 00:15:46,200 --> 00:15:47,960 Speaker 1: are you most proud of in your life? 237 00:15:50,320 --> 00:15:52,640 Speaker 5: Such a question. I had to think about this. Can 238 00:15:52,680 --> 00:15:53,240 Speaker 5: I tell you? 239 00:15:54,320 --> 00:15:56,800 Speaker 3: Can I go through some options that I thought about?
240 00:15:56,840 --> 00:16:02,360 Speaker 1: Okay, please. So there's no wrong answer here. You know, 241 00:16:02,520 --> 00:16:04,880 Speaker 1: you can just say, I'm most proud of my book, 242 00:16:04,920 --> 00:16:05,960 Speaker 1: The Information State. 243 00:16:06,040 --> 00:16:06,760 Speaker 2: It could be anything. 244 00:16:07,360 --> 00:16:09,600 Speaker 3: No, I'm not most proud of my book. I like my book. 245 00:16:09,920 --> 00:16:11,640 Speaker 3: I'm very proud of my book. 246 00:16:12,000 --> 00:16:14,560 Speaker 5: But it wouldn't, I don't think my book 247 00:16:14,400 --> 00:16:19,800 Speaker 3: would crack the top five. No. I completed Ranger School, 248 00:16:20,400 --> 00:16:26,680 Speaker 3: which is an especially difficult infantry school in the 249 00:16:26,760 --> 00:16:29,320 Speaker 3: US Army, and I did it when I was not 250 00:16:31,160 --> 00:16:33,000 Speaker 3: in the active-duty Army; I was in 251 00:16:33,000 --> 00:16:35,880 Speaker 3: the National Guard. I wasn't really young anymore; I 252 00:16:35,920 --> 00:16:40,240 Speaker 3: was twenty nine when I went through. And so I 253 00:16:40,240 --> 00:16:43,800 Speaker 3: had a sort of, and I was a chain smoker, I 254 00:16:43,920 --> 00:16:48,360 Speaker 3: quit right before I went to Ranger School. But in 255 00:16:48,400 --> 00:16:52,000 Speaker 3: other words, like, I had to train myself to get 256 00:16:52,000 --> 00:16:53,320 Speaker 3: through Ranger School. 257 00:16:53,720 --> 00:16:54,920 Speaker 5: And most of, 258 00:16:56,480 --> 00:16:59,440 Speaker 3: most graduates of Ranger School, which has a very high 259 00:16:59,520 --> 00:17:02,640 failure rate because it's so intense and difficult.
You know, the 260 00:17:02,720 --> 00:17:08,760 Speaker 3: average soldier going through is either straight out of Ranger Battalion, 261 00:17:08,840 --> 00:17:14,040 Speaker 3: so like nineteen, twenty, just like hard-charging young men, 262 00:17:14,240 --> 00:17:18,160 Speaker 3: or they're infantry officers, right out of the infantry basic course, 263 00:17:18,200 --> 00:17:23,040 Speaker 3: where they've gone through essentially a four-month preparatory period 264 00:17:23,080 --> 00:17:25,040 Speaker 3: where the Army is just training 265 00:17:24,640 --> 00:17:25,879 Speaker 5: them to get ready for Ranger 266 00:17:25,920 --> 00:17:28,680 Speaker 3: School, whereas I at the time was a chain-smoking 267 00:17:28,720 --> 00:17:32,760 Speaker 3: freelance writer living in Prospect Heights, Brooklyn. 268 00:17:32,880 --> 00:17:34,639 Speaker 2: What made you do it? 269 00:17:36,080 --> 00:17:38,040 Speaker 3: What made me do it? I had something to prove 270 00:17:38,080 --> 00:17:42,280 Speaker 3: to myself. And I had already, I had already 271 00:17:42,320 --> 00:17:47,800 Speaker 3: done a deployment in Iraq at that point, and one 272 00:17:47,840 --> 00:17:51,280 Speaker 3: of my soldiers in Iraq was a Ranger Battalion veteran, 273 00:17:52,160 --> 00:17:55,320 Speaker 3: and it made an impression on me. And then many 274 00:17:55,359 --> 00:17:59,960 Speaker 3: of the soldiers and officers I most respected had a 275 00:18:00,040 --> 00:18:03,040 Speaker 3: Ranger tab, and it meant a lot. You know, it 276 00:18:06,160 --> 00:18:11,200 Speaker 3: sort of established like a baseline of resilience, toughness, confidence. 277 00:18:11,520 --> 00:18:15,639 Speaker 3: So I wanted it, and it was difficult, but I graduated. 278 00:18:15,640 --> 00:18:17,919 Speaker 3: I got my tab.
So I thought about saying Ranger School, 279 00:18:19,440 --> 00:18:23,439 Speaker 3: but ultimately I decided that having my children is what 280 00:18:23,560 --> 00:18:30,159 Speaker 3: I'm most proud of, and raising them. But also, having 281 00:18:30,200 --> 00:18:37,359 Speaker 3: them, choosing to be a father, to commit 282 00:18:37,440 --> 00:18:42,840 Speaker 3: my life to my family in that way, which had 283 00:18:42,920 --> 00:18:47,160 Speaker 3: the most, you know, we're talking about, like, technologies to 284 00:18:47,200 --> 00:18:50,679 Speaker 3: shift people's opinions, but nothing in my life shifted my 285 00:18:50,720 --> 00:18:52,000 Speaker 3: way of thinking more 286 00:18:51,920 --> 00:18:52,879 Speaker 5: than having kids. 287 00:18:52,920 --> 00:19:01,400 Speaker 3: It was a radically consciousness-shifting experience. So I'm proudest 288 00:19:01,560 --> 00:19:03,320 Speaker 3: of being a 289 00:19:03,280 --> 00:19:04,239 Speaker 5: father, that much is true. 290 00:19:04,760 --> 00:19:06,879 Speaker 2: That's a great answer. I love it. 291 00:19:06,800 --> 00:19:09,200 Speaker 1: It could be a two-part answer. Anything goes here 292 00:19:09,280 --> 00:19:11,399 Speaker 1: on the Karol Markowicz Show. We're going to take a 293 00:19:11,480 --> 00:19:14,240 Speaker 1: quick break and be right back on the Karol Markowicz Show. 294 00:19:17,560 --> 00:19:20,240 Speaker 1: My other question that I ask all of my guests 295 00:19:20,280 --> 00:19:23,640 Speaker 1: is a forward-looking question. And I know you 296 00:19:24,000 --> 00:19:27,640 Speaker 1: said your book is not optimistic. It's not pessimistic. 297 00:19:27,680 --> 00:19:31,639 Speaker 1: Maybe it's just the way things are. But give us 298 00:19:31,640 --> 00:19:34,400 Speaker 1: a prediction about the next five years, and it could 299 00:19:34,440 --> 00:19:35,800 Speaker 1: be about anything at all.
300 00:19:35,840 --> 00:19:39,120 Speaker 2: It could actually veer into art or music or anything 301 00:19:38,840 --> 00:19:42,399 Speaker 3: you want. Oh, art or music. 302 00:19:42,960 --> 00:19:44,960 Speaker 2: It could be anything. I don't want to, I don't 303 00:19:44,960 --> 00:19:48,200 Speaker 2: want to lead you. You could do a five-year prediction about anything. 304 00:19:48,720 --> 00:19:52,280 Speaker 3: I think it all relates, actually, because one of 305 00:19:52,320 --> 00:19:56,160 Speaker 3: the things that's happening, and we're seeing this play out 306 00:19:56,640 --> 00:20:01,960 Speaker 3: in the kind of mob mentality around the Epstein story, 307 00:20:02,359 --> 00:20:07,160 Speaker 3: is the 308 00:20:05,640 --> 00:20:07,359 Speaker 5: arrival, the real 309 00:20:07,200 --> 00:20:14,200 Speaker 3: arrival, of digital technology as a kind of cosmology unto itself. 310 00:20:14,280 --> 00:20:20,880 Speaker 3: Any sufficiently powerful technology generates its own universe of consciousness, 311 00:20:21,920 --> 00:20:27,439 Speaker 3: its own affective dimension of feeling, thought, experience. This is what the 312 00:20:27,520 --> 00:20:31,639 Speaker 3: great media theorist Marshall McLuhan meant when he spoke 313 00:20:31,720 --> 00:20:35,640 Speaker 3: about the Gutenberg Galaxy: the printing press, the Gutenberg 314 00:20:36,040 --> 00:20:40,560 Speaker 3: printing press, wasn't just a means of, you know, utilizing 315 00:20:40,640 --> 00:20:45,280 Speaker 3: typeset to print books. It generated a new galaxy of 316 00:20:46,440 --> 00:20:51,640 Speaker 3: political forms, emotional forms, literary forms, artistic forms. 317 00:20:52,080 --> 00:20:53,920 Speaker 5: We're seeing that now with
318 00:20:54,400 --> 00:20:58,480 Speaker 3: digital technology as well, which, you know, has been around 319 00:20:58,880 --> 00:21:01,679 Speaker 3: for, I mean, digital technology itself has been around for 320 00:21:02,400 --> 00:21:06,040 Speaker 3: effectively a century at this point, a little less than 321 00:21:06,080 --> 00:21:13,200 Speaker 3: a century, but the effects of mass, universalized digital technology 322 00:21:13,320 --> 00:21:18,240 Speaker 3: as the essential medium through which we communicate, do politics, 323 00:21:18,720 --> 00:21:23,400 Speaker 3: do sports, do dating, do everything, digital technology as the 324 00:21:23,440 --> 00:21:31,480 Speaker 3: medium of existence, is resurrecting ways of life that are fundamentally, 325 00:21:31,520 --> 00:21:35,320 Speaker 3: at this stage, more medieval in their orientation than what 326 00:21:35,359 --> 00:21:38,280 Speaker 3: we might think of as modern. Certainly they're not print-age 327 00:21:38,359 --> 00:21:44,359 Speaker 3: modern. Print-age modernity was about reason and rationality 328 00:21:45,160 --> 00:21:51,159 Speaker 3: and the exercise of sovereign individual rights, and that's not 329 00:21:51,160 --> 00:21:54,240 Speaker 3: what the medieval world was structured around. Those are not 330 00:21:54,280 --> 00:21:59,160 Speaker 3: the values that the medieval world was structured around. Hierarchical 331 00:21:59,240 --> 00:22:05,159 Speaker 3: senses of order: it was structured around hierarchies of fealty and 332 00:22:05,280 --> 00:22:10,320 Speaker 3: duty that are quite different from the more individualistic, reason-based 333 00:22:10,400 --> 00:22:13,120 Speaker 3: world that we've been living in for the past 334 00:22:13,160 --> 00:22:17,480 Speaker 3: five hundred years, and which is coming apart at, you know, 335 00:22:18,400 --> 00:22:21,359 Speaker 3: nuclear speeds before our eyes.
So my prediction is that 336 00:22:22,119 --> 00:22:28,040 Speaker 3: the art will look a lot more medieval, as, you know, 337 00:22:28,520 --> 00:22:33,480 Speaker 3: the shorthand for it, meaning a cult based in 338 00:22:34,200 --> 00:22:40,120 Speaker 3: mystery and revelation rather than based in rationality and reason, 339 00:22:40,400 --> 00:22:45,360 Speaker 3: and that even technology and science will follow that same pattern. 340 00:22:45,640 --> 00:22:50,280 Speaker 3: So technology and science, AI is the perfect example. We 341 00:22:50,560 --> 00:22:55,919 Speaker 3: already have black box AI technologies that are inscrutable to 342 00:22:56,040 --> 00:22:59,879 Speaker 3: the people who designed them, right, the people who program 343 00:23:00,440 --> 00:23:04,840 Speaker 3: AIs don't understand how they work. It's functionally indistinguishable from magic, 344 00:23:05,480 --> 00:23:07,920 Speaker 3: and that, in a word, is medieval. 345 00:23:08,000 --> 00:23:08,760 Speaker 5: That's what I mean. 346 00:23:09,800 --> 00:23:10,800 Speaker 2: So interesting. 347 00:23:10,880 --> 00:23:13,320 Speaker 1: I can tell you I have definitely never gotten that 348 00:23:13,560 --> 00:23:17,440 Speaker 1: answer before, and some answers do repeat on the five 349 00:23:17,520 --> 00:23:21,560 Speaker 1: year prediction question. Jacob, it has been so nice talking 350 00:23:21,560 --> 00:23:23,480 Speaker 1: to you. I've been reading you for a long time. 351 00:23:23,600 --> 00:23:27,280 Speaker 1: I've been a big fan. Leave us here with your 352 00:23:27,320 --> 00:23:29,879 Speaker 1: best tip for my listeners on how they can improve 353 00:23:29,920 --> 00:23:31,680 Speaker 1: their lives. 354 00:23:35,200 --> 00:23:38,200 Speaker 3: Try to be patient with your children. 355 00:23:38,400 --> 00:23:44,119 Speaker 5: That's taken me some time to really learn.
356 00:23:44,840 --> 00:23:49,000 Speaker 3: I mean, patience, what it really means, is to 357 00:23:49,960 --> 00:23:54,040 Speaker 3: give your presence to your children. Like, we talk about 358 00:23:54,040 --> 00:23:56,879 Speaker 3: being present, but I think what that really means is 359 00:23:57,119 --> 00:23:59,919 Speaker 3: you can be present with yourself, but then you have 360 00:24:00,080 --> 00:24:04,360 Speaker 3: to give that presence to your children. One other thing quickly, 361 00:24:04,640 --> 00:24:07,000 Speaker 3: because I just learned this and it's amazing. There's a 362 00:24:07,400 --> 00:24:13,240 Speaker 3: famous book of, like, Mussar or Jewish ethics by this kabbalist 363 00:24:13,359 --> 00:24:18,040 Speaker 3: rabbi, the Ramchal. This book called Mesillat Yesharim, and he says 364 00:24:18,080 --> 00:24:21,760 Speaker 3: this amazing thing, which is, there's a lot of context, 365 00:24:21,840 --> 00:24:24,679 Speaker 3: but essentially he says that in order to be a 366 00:24:24,720 --> 00:24:29,560 Speaker 3: good person, to really serve God, you have to devote 367 00:24:30,520 --> 00:24:32,679 Speaker 3: I think it's an hour a day, he says, to 368 00:24:32,760 --> 00:24:36,720 Speaker 3: being watchful over yourself and your actions and evaluating how 369 00:24:36,720 --> 00:24:39,840 Speaker 3: you acted. Because it's not enough to want to be good. 370 00:24:39,920 --> 00:24:42,639 Speaker 3: You have to apply strategies to be good. But what 371 00:24:42,680 --> 00:24:44,760 Speaker 3: he says that makes it so brilliant is it can't 372 00:24:44,760 --> 00:24:48,360 Speaker 3: be any longer than an hour. In other words, if you. 373 00:24:48,320 --> 00:24:51,320 Speaker 1: Say, spend three hours lying awake at night, not not 374 00:24:51,440 --> 00:24:51,800 Speaker 1: the best.
375 00:24:51,840 --> 00:24:54,680 Speaker 3: For that? Not only is it not the best, but 376 00:24:55,160 --> 00:25:00,399 Speaker 3: it's not contributing to you being a better person. 377 00:25:00,640 --> 00:25:02,760 Speaker 3: It's taking you in the other direction. 378 00:25:03,359 --> 00:25:04,080 Speaker 5: So you have to. 379 00:25:04,119 --> 00:25:10,239 Speaker 3: Control, you have to subordinate that self critical mood to 380 00:25:10,320 --> 00:25:12,520 Speaker 3: the higher goal, right, which is. 381 00:25:14,160 --> 00:25:16,760 Speaker 5: To the service of God is the way the 382 00:25:16,840 --> 00:25:18,879 Speaker 5: Ramchal puts it. 383 00:25:19,400 --> 00:25:19,960 Speaker 2: I love it. 384 00:25:20,200 --> 00:25:23,199 Speaker 1: Only one hour of self criticism per day, friends, no 385 00:25:23,320 --> 00:25:23,880 Speaker 1: more than that. 386 00:25:24,160 --> 00:25:25,760 Speaker 2: I love it. He is Jacob Siegel. 387 00:25:25,880 --> 00:25:28,400 Speaker 1: Check him out at Tablet and get his new book, 388 00:25:28,440 --> 00:25:31,920 Speaker 1: The Information State, Politics and the Age of Total Control. 389 00:25:32,280 --> 00:25:34,159 Speaker 2: Thank you so much for coming on the show, Jacob. 390 00:25:35,040 --> 00:25:36,159 Speaker 5: Thanks Carol, it's great to be