Host: Welcome to It Could Happen Here, the show that is normally introduced by me shouting atonally, but today I did it like a professional, because today myself and my colleagues Garrison and Christopher are talking to someone I'm very excited to chat with: Mr. Cory Doctorow.

Cory: Well, hello! Thank you very much. It is my pleasure to be on the show. It's great to meet you all and to be talking to you today.

Host: Cory, you do a lot of writing about technology and surveillance and the cultural issues around those. You're also an author; you've written some great fiction. I think today we'll probably talk most around books like Attack Surface and Walkaway, but you've written a lot of wonderful stuff. And you've also worked with the EFF for years and years. We're going to be talking today broadly about surveillance and the future of the Internet, and we'll probably talk about some metaverse-y stuff.
What I love about the way in which you think and write about the future is that you're coming at it from a number of angles: as a tech industry journalist, as a fiction writer imagining the future, and as somebody who's waded in as an activist. And I'm kind of wondering, where do you see the greatest potential for actual change? Is it in lobbying and engaging as an activist, or is it in imagining, as a fiction writer, what might be?

Cory: So I see them as adjuncts, you know, diversity of tactics and all that stuff. The thing is that tech policy arguments are often very abstract, and they are only visceral for the people who would provide the kind of political will to do something about them.
Usually that comes when it's too late, right? People care about tech monopolies once the web has turned into five giant websites filled with screenshots of text from the other four, but not when Yahoo is on a buying spree of tech companies and we're saying, oh, that's how tech companies grow, and all tech companies will grow in the future by buying all their nascent competitors and rolling them up into a big vertically integrated monopoly, which is kind of how we got Facebook and Google and the rest of it. You need to be able to make policy arguments to policy people, but you also need to be able to put some sinew and muscle on the bones of that highly abstract kind of argument. And that's where fiction comes in. It's kind of like a fly-through, an emotional architect's rendering of what things might look like if we get it wrong, or if we change it. It preserves the sense of possibility, you know.
I think one of the great enemies of change is the inevitabilism of capitalist realism, the idea that there is no alternative. So if you can make people believe in an alternative, then they might work for one. And certainly the opposite is true: if people don't believe any alternative is possible, they won't work for one. Why would you? And so all of that together, I think, is part of how you mobilize people to care about stuff.

Host: Yeah, that makes total sense. And it's difficult, I think, because I first came into technology as a journalist, and it's very difficult to get people to care about stuff. Privacy in particular has been one of the most interesting cases of the thought leaders in an industry freaking out over something and people not really having an issue with it, because we kind of all agreed to hand over all of our data to a number of big sites. Not all of us, but... I don't know, I'm interested in your thoughts on that. I understand the idea that fiction is a much better way to try to get people to care about these things, because it makes them feel, as opposed to kind of boring them. I think people can get lost in the weeds of acquisitions and pivots and tech companies acquiring each other and whatnot.

Cory: Well, look, I think part of the problem with privacy, the reason that we were late to wake up and do something about it, is that it was obfuscated. You know, if you've ever seen the maps of how an ad tech stack works, the flow diagrams: there are some things that are hard to understand because they're complicated, and then there are some things that are made complicated so they will be hard to understand. And I think in the case of the surveillance industry, the latter is true. And it wasn't just that they were trying to play us for suckers; they were also playing their customers for suckers, right?
One of the reasons that the ad tech stack is such a snarled hairball is so that the people who buy ads and the publishers who run ads can't tell how badly they're being ripped off by their intermediaries. But this also has the side effect of making it very hard for us, as the kind of inputs to that system, to know how our own dignity and private lives and safety and integrity are being put at risk by these systems as well. And you know, it may be that people, if they had been well informed about what was going on, might have been indifferent anyway. But most people were very poorly informed, right? When all there was was a privacy discourse that said your personal information is being siphoned up, with no specifics on how it was being used, how it was being done, and how it might bring you to harm.
It's not clear that you can say the reason they were indifferent is that they were fully informed and didn't care, if you know that they weren't fully informed, if you know that they were barely informed.

Host: Mm, yeah, I think you're absolutely right, because when the Cambridge Analytica scandal broke, which was I think one of the first times that there was a really huge international story that made clear some of the consequences of all this, it did provoke a lot of anger. Do you worry at all that there's a degree to which, because people got tricked, or however you want to frame it, the financialization of people's private data, of people's personal information, has gone so far that there's a risk people are just kind of inured to it?

Cory: Yeah, well, that kind of gets to my theory of change here, which is that there is always going to be a point of maximum indifference, peak indifference.
You know, if you think about something like being a smoker: the likelihood that you care about cancer goes up the longer you smoke and the more health effects you feel. And certainly there will come a point in your life when you will only ever grow more worried about the effects of smoking on your life. But there's also a point of no return, right? If the point at which your concern grows to where you're actually going to do something about smoking is the day you get diagnosed with stage four lung cancer, then that denialism can slide into nihilism. You can say, why bother, right? It's too late. It's like if we spend years arguing about the crashing population of rhinos, and then finally there's only one left, and you say, you're right, there was a problem. You might as well say, why don't we eat him and find out what he tastes like? It's not like the rhinos are ever going to come back, right?

And so for me, so much of the work is about shifting the point of peak indifference to the left of the point of no return on the timeline, so that people actually start to care earlier. Because if you have a genuine problem, right, like the overcollection of our private data, the mishandling of it, the abuse of it, that genuine problem will eventually produce tangible effects that are undeniable; our ability to ignore it just goes monotonically down. It's the thing about the climate emergency. You know, even if Shell had not, or Exxon had not, hidden the data it had on the role its products were playing in climate change in the seventies, it would have been hard to muster a sense of urgency in the seventies, right? Because the story was that in fifty years, something bad is going to happen. But here we are, fifty years on; something bad is really happening, and a lot of people are caring about it. They still don't seem to care about it enough, or maybe they've slid into nihilism.
There's certainly, I think on the part of the elites, a kind of nihilistic sense that maybe they can all retreat to mountaintops and build fortresses and ferry their children around by Harrier jet, you know. And that nihilism, I think, is what you get when the point of no return is passed before peak denial. The privacy catastrophe that is looming in our future is one we haven't quite reached yet; I mean, we've just had the first trickles of the dam breaking that's in our future. It hasn't been enough yet to shift people away from it, but we might be getting there, right? We might eventually be able to do something about it. And one of the things that will hasten that moment is restoring competition to those industries. One of the reasons that the industry that spies on us is able to foster denial and indifference is that it is a monopolized industry. Two companies, Google and Facebook, control eighty percent of the ad market, and as monopolists, they're able to extract huge monopoly rents.
They're among the most profitable companies in the history of the world. And some of those monopoly rents, rather than being returned to shareholders, can be mobilized to distort policy: to make us think that there's nothing wrong with the way that they collect data and use it, to forestall regulation, to pay Nick Clegg four million a year to go around Europe and the world and say, as the former Deputy Prime Minister of the United Kingdom, I'm here to tell you that Facebook is the friend of the democratic regimes of the world. And you know, if the anti-monopoly movement, which is a thing I've become very involved with, is able to go from strength to strength, and it's surging now, then one of the things that we might do is destroy the ammunition that's being used by these large monopolistic firms to distort our policy and harm us in these ways with impunity.
And then maybe we can actually take the nascent and natural alarm that people do feel about the invasions of their privacy and turn that into privacy policy that is meaningful in respect of these big companies, that actually reins them in.

Host: Yeah, and I like that you frame it as a privacy catastrophe, because what I just exhibited earlier in this episode is this tendency, which I certainly see in myself and in other people, to get kind of beaten down by the continued excesses of this industry and the continued failure of anything to be done to curb it. And I think you're right, it has to be viewed as a calamity. And nothing makes that clearer, I think, than watching some of the stuff Facebook in particular has put out about their plans for the metaverse, and thinking about all of these sensors they want to store in your house, all of the ways in which they want to map everything around you.
They kind of advertise it as, you'll be able to play basketball with somebody who's in a different state. But really what it is is you're giving Facebook access to every measurement of your body, the pulse of the beat of your heart, all this stuff that maybe we don't quite know what it would be useful for from a financialization standpoint, but it's unsettling to think that they'll have to find a way, because they'll have it. You know, I don't know what is to be done about that, other than, as you say, breaking up these monopolies.

Cory: Well, breaking up is one of the things we can do to monopolies, and it takes a long time, you know. AT&T: the first enforcement action against it happened sixty-nine years before it was broken up in 1982. I don't think we can wait that long. But there are a lot of intermediate steps, right? We can force them to do interoperability; we can block them from predatory acquisitions. We can force them to divest of companies and engage in structural separation. We can do all kinds of things. It actually looks like the United Kingdom is going to stop them from buying Giphy, which might seem trivial; after all, it's just animated GIFs. But what it actually is is surveillance beacons in every social media application, right? Because if you're hosting a GIF from Giphy in your message to someone else, Facebook has telemetry about that message. And so the... not the ICO, the Competition and Markets Authority in the UK was like, yeah, this is just going to strengthen your market power; that's why you're buying this company. You have too much market power already. We're not going to let you do it. It was almost the case that the Fitbit merger was blocked, Google's Fitbit merger; I think it's still not too late to roll it back. And Lina Khan, who's the new fire-breathing dragon in charge of the FTC, who is an astonishing person who was a law student three years ago, she has said: oh yeah, this one point three trillion dollars' worth of mergers and acquisitions that you're doing right now to get in under the wire before we start enforcing? Guess what, we're going to unwind those fucking mergers if it looks like they were anticompetitive. And not only are you going to lose all the money you spent on the M&A due diligence and the paperwork and the corporate stuff, but all that integration you're going to do between now and then, you're going to have to de-integrate those companies when we tell you that you don't have merger approval. And you're on notice; you can't come and complain later, right? You can either get in line and wait for us to tell you whether or not your merger is legal, or you can roll the dice. But I tell you what, if you come up snake eyes, guys, you are fucked. And that is amazing, right? That is a powerful change in American industrial policy that really makes a difference.
Host: Yeah, and that is a beautiful thing to think of being in place and actually hitting as hard as it could. Obviously, the concern is who will be picking the head of the FTC in three years and change, and how much influence is Peter Thiel going to have there?

Cory: Yeah, well, and Peter Thiel of course loves monopolies; he says competition is for losers. So you're right, obviously elections have consequences. But you know, one of the ways that you win elections is by making material differences in people's lives. And so, if personnel are policy, then one of the most important policies Biden has set so far is hiring Lina Khan, and her colleague Jonathan Kanter at the DOJ, and Tim Wu in the White House.

Host: Yeah, I mean, I would love nothing more than to see Facebook in particular reined in at this point, because I'm one of the casualties of the ad market crash that started in, like, 2016, 2017.
It feels like the odds of them being able to... I don't know. We've got three years where we know, theoretically, these policies will be in place. And the Republicans are talking a lot about regulating social media too, about even breaking up these companies, but they often tend to be talking about it in a very different way and with a very different end goal in mind. And I guess, you know, obviously Facebook knows that, right? They are well aware that this might be a wait-out-the-clock situation for them, and they have some arrows in that quiver.

Cory: That may be so, but also remember that Facebook's users are mostly outside of the US, and even a change in administration here won't put Margrethe Vestager, who's the Competition Commissioner in the EU, back in the bottle. And she's another fire-breather, right? She's another amazing person. And so, you know, I wouldn't be too quick to write that off.
I mean Facebook 310 00:16:45,120 --> 00:16:48,000 Speaker 1: needs its foreign markets, Yes, It's U S customers are 311 00:16:48,000 --> 00:16:49,800 Speaker 1: worth more to anyone else because we have the most 312 00:16:49,800 --> 00:16:52,960 Speaker 1: primitive privacy frameworks, so it can extract a lot more 313 00:16:53,000 --> 00:16:54,960 Speaker 1: data for like we're the we're the richest people with 314 00:16:55,000 --> 00:16:58,760 Speaker 1: a worse privacy, so that's that's um. You know, it's 315 00:16:58,800 --> 00:17:01,200 Speaker 1: a real home court advantage for Facebook, but it needs 316 00:17:01,200 --> 00:17:03,600 Speaker 1: that other eight percent of its users. It wouldn't be 317 00:17:03,640 --> 00:17:05,960 Speaker 1: what it is without them, and that makes it subject 318 00:17:06,000 --> 00:17:08,160 Speaker 1: to their jurisdiction. And you know, one of the things 319 00:17:08,200 --> 00:17:12,440 Speaker 1: about ad driven firms like Facebook, UM is that they 320 00:17:12,560 --> 00:17:16,840 Speaker 1: really need sales offices in country. Uh. So you know, 321 00:17:16,960 --> 00:17:20,200 Speaker 1: even before we we had the proliferation national firewalls, which 322 00:17:20,200 --> 00:17:23,440 Speaker 1: don't get me wrong, I don't think it's a good thing. Um. 
323 00:17:23,520 --> 00:17:29,720 Speaker 1: These large global firms that operated UM sales offices in country, 324 00:17:29,800 --> 00:17:33,159 Speaker 1: in every territory they worked in, were vulnerable to regulation 325 00:17:33,280 --> 00:17:36,679 Speaker 1: because if you have staff in a country, then you 326 00:17:36,680 --> 00:17:40,720 Speaker 1: have someone that can be arrested, right, And so it's 327 00:17:40,760 --> 00:17:43,880 Speaker 1: not like they can just be like I don't know, 328 00:17:43,920 --> 00:17:47,359 Speaker 1: like the Tour Project, which just you know, it has people, 329 00:17:48,280 --> 00:17:51,160 Speaker 1: um who who sit in hack on tour who are 330 00:17:51,160 --> 00:17:53,480 Speaker 1: close to lawyers who can defend people who sit on 331 00:17:53,520 --> 00:17:56,440 Speaker 1: hack and on tour. Uh. You know, if the Tour 332 00:17:56,520 --> 00:18:00,320 Speaker 1: Project had to have staff full time in Turkey and 333 00:18:00,520 --> 00:18:04,840 Speaker 1: China and Russia and Syria in order to operate, it 334 00:18:04,880 --> 00:18:08,520 Speaker 1: would be a very different project. But you know, Facebook 335 00:18:08,560 --> 00:18:11,000 Speaker 1: and Twitter and Google they all have staff in those 336 00:18:11,000 --> 00:18:14,880 Speaker 1: countries and it makes them vulnerable to regulation. And so, 337 00:18:15,000 --> 00:18:19,280 Speaker 1: you know, China is really interesting because because um she 338 00:18:19,480 --> 00:18:22,439 Speaker 1: Jin Ping, for his own reasons which are not my 339 00:18:22,600 --> 00:18:26,280 Speaker 1: reasons and distinct from the Democrats and the Republicans reasons, 340 00:18:26,680 --> 00:18:29,040 Speaker 1: is doing stuff to rein in big tech in China. 
341 00:18:29,520 --> 00:18:31,800 Speaker 1: And it's actually quite interesting, because you know the argument 342 00:18:31,800 --> 00:18:33,920 Speaker 1: that Nick Clegg makes when he says why we shouldn't 343 00:18:33,920 --> 00:18:37,119 Speaker 1: break up Facebook. He says, you know, China 344 00:18:37,240 --> 00:18:40,679 Speaker 1: is coming for your IP and 345 00:18:40,760 --> 00:18:45,159 Speaker 1: for your industrial competitiveness with its big tech giants, which 346 00:18:45,240 --> 00:18:48,439 Speaker 1: it treats as national champions that project soft power around 347 00:18:48,440 --> 00:18:52,000 Speaker 1: the world. Meanwhile, China is like, these tech giants? We 348 00:18:52,080 --> 00:18:55,040 Speaker 1: hate these tech giants. They present a countervailing force to 349 00:18:55,080 --> 00:18:57,439 Speaker 1: the hegemony of the Communist Party and 350 00:18:57,480 --> 00:18:59,879 Speaker 1: the executive branch that Xi Jinping sits at the 351 00:19:00,040 --> 00:19:02,440 Speaker 1: top of. We're gonna neuter them, and we're 352 00:19:02,440 --> 00:19:07,280 Speaker 1: gonna disappear their founders, like Jack Ma, to the fucking gulags. Right? Like, 353 00:19:07,480 --> 00:19:11,080 Speaker 1: they're like, we don't want national champions, because, 354 00:19:11,400 --> 00:19:14,080 Speaker 1: you know, Weibo and Alibaba and 355 00:19:14,200 --> 00:19:17,399 Speaker 1: Tencent... they're not, 356 00:19:17,480 --> 00:19:20,280 Speaker 1: they're not champions for China by any 357 00:19:20,359 --> 00:19:23,440 Speaker 1: stretch of the imagination. They don't give a shit about China.
358 00:19:23,840 --> 00:19:26,480 Speaker 1: And so, you know, all of these companies are 359 00:19:26,520 --> 00:19:30,800 Speaker 1: going to face regulatory pressure, anti-monopoly regulatory pressure, all 360 00:19:30,800 --> 00:19:36,160 Speaker 1: over the world. And you're so much more, um, optimistic, 361 00:19:36,200 --> 00:19:38,840 Speaker 1: I guess, about the potential for that to bite 362 00:19:39,000 --> 00:19:40,840 Speaker 1: than a lot of people I talk to, and I 363 00:19:40,840 --> 00:19:43,200 Speaker 1: think more knowledgeable as well. And I kind of wonder, 364 00:19:43,240 --> 00:19:46,639 Speaker 1: because there's this very strong, obviously influenced by decades of 365 00:19:46,640 --> 00:19:50,080 Speaker 1: cyberpunk, attitude that, like, we're in this age of mega 366 00:19:50,119 --> 00:19:54,040 Speaker 1: corporations whose power is, you know... there's nothing that can 367 00:19:54,080 --> 00:19:56,439 Speaker 1: stop Amazon from doing what Amazon wants to do, right? 368 00:19:56,480 --> 00:19:58,720 Speaker 1: Facebook is going to keep doing whatever they want to 369 00:19:58,720 --> 00:20:01,320 Speaker 1: do forever. You clearly don't believe that, and, 370 00:20:01,520 --> 00:20:05,600 Speaker 1: you know, you clearly know your stuff. I'm wondering why 371 00:20:05,720 --> 00:20:09,919 Speaker 1: you think that image is still so persistent, 372 00:20:10,000 --> 00:20:14,200 Speaker 1: that attitude in our heads that these 373 00:20:14,240 --> 00:20:17,320 Speaker 1: are kind of monolithic forces in our society that 374 00:20:17,680 --> 00:20:20,720 Speaker 1: just have to be endured. So I think it's a 375 00:20:20,720 --> 00:20:24,080 Speaker 1: belief in the great forces of history, right, and 376 00:20:24,160 --> 00:20:27,600 Speaker 1: the great man theory.
You know, that these 377 00:20:27,720 --> 00:20:34,960 Speaker 1: rich people are driving history. Yeah, 378 00:20:35,040 --> 00:20:37,879 Speaker 1: these powerful figures are driving history. They're in 379 00:20:37,960 --> 00:20:39,840 Speaker 1: charge, they're in the driver's seat. I mean, that's kind 380 00:20:39,880 --> 00:20:42,359 Speaker 1: of what's behind Trump derangement syndrome, right? The idea that 381 00:20:42,400 --> 00:20:47,800 Speaker 1: Trump is a uniquely powerful and talented demagogue, as opposed to 382 00:20:47,880 --> 00:20:51,159 Speaker 1: just, like, a demagogue-shaped puzzle piece that fit in 383 00:20:51,200 --> 00:20:54,280 Speaker 1: the demagogue-shaped hole that was left by the collapse 384 00:20:54,280 --> 00:20:57,400 Speaker 1: of the credibility of capitalism. A man 385 00:20:57,440 --> 00:20:59,639 Speaker 1: who is clearly too stupid to be a cause of 386 00:20:59,680 --> 00:21:02,479 Speaker 1: anything and will only ever be the effect of something. 387 00:21:03,160 --> 00:21:08,000 Speaker 1: And, you know, for me, the theory of 388 00:21:08,280 --> 00:21:12,880 Speaker 1: history and how it goes was really transformed by an 389 00:21:12,880 --> 00:21:16,159 Speaker 1: exercise that my friend Ada Palmer does. So, Ada is 390 00:21:16,200 --> 00:21:19,359 Speaker 1: a science fiction novelist. She's just published the fourth 391 00:21:19,400 --> 00:21:22,239 Speaker 1: book of her Terra Ignota series, her debut series. It's 392 00:21:22,280 --> 00:21:25,480 Speaker 1: an incredible series of books. But she's a real, 393 00:21:25,560 --> 00:21:27,879 Speaker 1: like, kind of multi-talented, multi-threat. So she's a 394 00:21:27,960 --> 00:21:31,919 Speaker 1: librettist and singer who has produced album-length operas based on 395 00:21:31,960 --> 00:21:36,960 Speaker 1: the Norse mythos.
She's also 396 00:21:37,000 --> 00:21:40,000 Speaker 1: a tenured professor of the history of Renaissance Florence at the 397 00:21:40,040 --> 00:21:46,359 Speaker 1: University of Chicago, where she studies heterodox information, pornography, homosexuality, witchcraft, 398 00:21:46,359 --> 00:21:49,479 Speaker 1: and so on during the Inquisitions. And every year with 399 00:21:49,560 --> 00:21:54,160 Speaker 1: her undergrads, she reenacts, through a four-week-long live 400 00:21:54,200 --> 00:21:57,000 Speaker 1: action role playing game, the election of the Medici pope. 401 00:21:57,600 --> 00:21:59,879 Speaker 1: And each of her students takes on the role of 402 00:22:00,040 --> 00:22:04,119 Speaker 1: a cardinal from a great family in the 403 00:22:04,160 --> 00:22:08,120 Speaker 1: actual election of... I forget what year it was, 404 00:22:08,480 --> 00:22:13,800 Speaker 1: fourteen ninety or something, I forget. But they each take 405 00:22:13,840 --> 00:22:17,639 Speaker 1: on this role, and they have a character sheet that 406 00:22:17,760 --> 00:22:20,800 Speaker 1: has motivations, like a dinner party murder mystery. And for 407 00:22:20,840 --> 00:22:25,760 Speaker 1: four weeks they make alliances, break alliances, stab each other 408 00:22:25,760 --> 00:22:29,480 Speaker 1: in the back, stage surprise reversals. And at the 409 00:22:29,600 --> 00:22:32,560 Speaker 1: end of the four weeks, there's this faux Gothic 410 00:22:33,280 --> 00:22:37,840 Speaker 1: cathedral on campus, and they dress up in costume.
Ada 411 00:22:37,960 --> 00:22:41,399 Speaker 1: has a Google alert for theater companies that are 412 00:22:41,440 --> 00:22:44,199 Speaker 1: getting rid of their costumes, so she clothes them in 413 00:22:44,240 --> 00:22:49,240 Speaker 1: the garb of the Medici cardinals, and they gather and 414 00:22:49,520 --> 00:22:51,840 Speaker 1: they go into a room, and then a puff of 415 00:22:51,840 --> 00:22:55,639 Speaker 1: smoke emerges and you get the new pope. And every 416 00:22:55,720 --> 00:22:59,720 Speaker 1: year, four of the final candidates... there are four 417 00:22:59,760 --> 00:23:02,240 Speaker 1: final candidates, rather, and two of them are always the 418 00:23:02,320 --> 00:23:05,480 Speaker 1: same, because the great forces of history bear down on 419 00:23:05,520 --> 00:23:07,920 Speaker 1: that moment to say those people will absolutely be in 420 00:23:07,960 --> 00:23:12,320 Speaker 1: the running for the papacy. And two of 421 00:23:12,359 --> 00:23:16,120 Speaker 1: them have never once been the same, because human action 422 00:23:17,119 --> 00:23:21,919 Speaker 1: still has space to alter the outcomes that are prefigured 423 00:23:21,960 --> 00:23:25,040 Speaker 1: by the great forces of history. And so for me, 424 00:23:26,320 --> 00:23:29,080 Speaker 1: the idea of being an optimist or a pessimist has 425 00:23:29,119 --> 00:23:32,960 Speaker 1: always felt very fatalistic. It's, either way, this idea 426 00:23:33,000 --> 00:23:35,000 Speaker 1: that the great forces of history have determined the outcome 427 00:23:35,040 --> 00:23:37,560 Speaker 1: and human action has no bearing on it. And I 428 00:23:37,600 --> 00:23:42,000 Speaker 1: think that rather than optimism or pessimism, we can be hopeful. 429 00:23:42,080 --> 00:23:44,719 Speaker 1: And that's the word you used before.
Hope is the 430 00:23:44,800 --> 00:23:47,040 Speaker 1: idea not that you can see a path from here 431 00:23:47,080 --> 00:23:49,560 Speaker 1: to the place you want to get to, but rather 432 00:23:49,600 --> 00:23:51,440 Speaker 1: that you haven't run out of things that you can 433 00:23:51,480 --> 00:23:54,600 Speaker 1: do to advance your goal, right? Because if you 434 00:23:54,640 --> 00:23:57,600 Speaker 1: can take a step to advance your goal, if you can 435 00:23:57,600 --> 00:24:01,720 Speaker 1: ascend the gradient towards the height that you're trying to reach, 436 00:24:02,240 --> 00:24:05,320 Speaker 1: then you will attain a new vantage point, and from 437 00:24:05,320 --> 00:24:07,760 Speaker 1: that vantage point, you may have revealed to you courses 438 00:24:07,760 --> 00:24:11,280 Speaker 1: of action that you didn't suspect before you took that step. 439 00:24:11,520 --> 00:24:13,760 Speaker 1: So as long as a step is available, there's always 440 00:24:13,760 --> 00:24:16,400 Speaker 1: another step lurking in the wings that you can't see 441 00:24:16,440 --> 00:24:19,119 Speaker 1: from where you are. And the reason I'm hopeful about 442 00:24:19,160 --> 00:24:22,280 Speaker 1: this is I can think of, like, fifty things that 443 00:24:22,320 --> 00:24:26,280 Speaker 1: could improve the monopoly picture that we're living in now, 444 00:24:26,680 --> 00:24:29,520 Speaker 1: and it's up from thirty things last year.
And so, 445 00:24:29,760 --> 00:24:31,520 Speaker 1: even though I don't know how we get from here 446 00:24:31,560 --> 00:24:33,800 Speaker 1: to a better future, and even though I absolutely see 447 00:24:33,800 --> 00:24:38,760 Speaker 1: the blockers you're talking about, a Trump landslide, losing Congress 448 00:24:38,800 --> 00:24:42,720 Speaker 1: because they let Joe Manchin and Kyrsten Sinema neuter the 449 00:24:42,720 --> 00:24:45,880 Speaker 1: Build Back Better bill, you know, all of those 450 00:24:45,920 --> 00:24:49,440 Speaker 1: things that can happen, I have hope. You know, which 451 00:24:49,440 --> 00:24:52,040 Speaker 1: is not the same as optimism, or a belief that 452 00:24:52,119 --> 00:24:55,359 Speaker 1: things will be great, or even, like, 453 00:24:55,720 --> 00:24:57,879 Speaker 1: a lack of a sense of foreboding. I have that 454 00:24:58,000 --> 00:25:01,840 Speaker 1: in spades. But I have hope that when the next phase 455 00:25:01,880 --> 00:25:05,240 Speaker 1: of the fight begins, we will have many 456 00:25:05,440 --> 00:25:09,160 Speaker 1: vulnerable spots we can strike at, and that we can 457 00:25:09,200 --> 00:25:13,320 Speaker 1: capitalize on whichever victories we attain to find more vulnerabilities 458 00:25:13,320 --> 00:25:16,520 Speaker 1: and move on. I think that's so important, and I 459 00:25:16,520 --> 00:25:18,679 Speaker 1: think it goes in line with, to bring up climate 460 00:25:18,760 --> 00:25:20,480 Speaker 1: change again, the idea that, like, one of the most 461 00:25:20,480 --> 00:25:22,960 Speaker 1: toxic things you can believe about climate change is 462 00:25:23,040 --> 00:25:25,920 Speaker 1: that there's nothing to do.
We're already past every point 463 00:25:25,920 --> 00:25:28,920 Speaker 1: of no return, and there's no positive action, 464 00:25:28,960 --> 00:25:30,639 Speaker 1: because it just leads you to doing the same thing 465 00:25:30,720 --> 00:25:34,240 Speaker 1: as the people who deny it. And, yeah, 466 00:25:34,280 --> 00:25:38,719 Speaker 1: I think it's very important to recognize that, like, 467 00:25:38,840 --> 00:25:40,520 Speaker 1: not only are there things you can do, but when 468 00:25:40,520 --> 00:25:42,640 Speaker 1: you do those things, when you start taking those steps, other 469 00:25:42,680 --> 00:25:47,000 Speaker 1: steps reveal themselves. Yeah. And you know what, if 470 00:25:47,000 --> 00:25:51,640 Speaker 1: you're feeling nihilistic about climate, I'm nearly through Saul 471 00:25:51,720 --> 00:25:55,479 Speaker 1: Griffith's book Electrify. Saul's an old friend of mine. 472 00:25:55,560 --> 00:25:59,280 Speaker 1: He's a MacArthur winner and electrical engineer, and 473 00:25:59,400 --> 00:26:01,399 Speaker 1: it's a popular engineering book. It's one of my 474 00:26:01,440 --> 00:26:04,760 Speaker 1: favorite genres. They're like popular science books, except instead of 475 00:26:04,800 --> 00:26:06,640 Speaker 1: telling you about how science works, they tell you about 476 00:26:06,640 --> 00:26:10,200 Speaker 1: how engineering works.
And he's basically like, here is why 477 00:26:10,280 --> 00:26:13,080 Speaker 1: all the estimates of how much renewables we need 478 00:26:13,160 --> 00:26:19,240 Speaker 1: are hugely overestimated. And it's basically that, like, keeping 479 00:26:19,400 --> 00:26:23,679 Speaker 1: fossil fuel power online requires a lot of fossil fuel, right? 480 00:26:23,720 --> 00:26:27,240 Speaker 1: So a big chunk of that estimate is just the 481 00:26:27,440 --> 00:26:30,440 Speaker 1: energy that we need to make the energy, and it's 482 00:26:30,440 --> 00:26:33,679 Speaker 1: not present in electrical models. Here's how we can manufacture it. 483 00:26:33,680 --> 00:26:36,359 Speaker 1: Here's how we can distribute it. Here is, basically, how, 484 00:26:37,000 --> 00:26:41,920 Speaker 1: if we can figure out the financing, Americans can 485 00:26:42,200 --> 00:26:45,040 Speaker 1: spend less money every year than they do now to 486 00:26:45,160 --> 00:26:48,119 Speaker 1: get more stuff that they love every year, how we 487 00:26:48,119 --> 00:26:52,560 Speaker 1: can do this without hair shirts. It's a spectacular book. 488 00:26:52,600 --> 00:26:55,520 Speaker 1: And you know, I don't agree with everything Saul says 489 00:26:55,560 --> 00:26:59,639 Speaker 1: all the time, but he is very careful about 490 00:27:00,080 --> 00:27:03,960 Speaker 1: his technical facts. There aren't technical errors in this. There 491 00:27:04,040 --> 00:27:06,480 Speaker 1: might be assumptions that we disagree with, but as a 492 00:27:06,520 --> 00:27:09,880 Speaker 1: technical matter, he's basically written a piece of design fiction 493 00:27:10,280 --> 00:27:14,040 Speaker 1: in which, over the next fifteen years, using clever finance 494 00:27:14,280 --> 00:27:18,800 Speaker 1: and solid engineering, we really actually do avert the 495 00:27:18,840 --> 00:27:23,240 Speaker 1: climate emergency.
And yeah, as always, kind of the main 496 00:27:23,800 --> 00:27:26,680 Speaker 1: barrier to doing the best version of the thing is 497 00:27:26,840 --> 00:27:29,680 Speaker 1: the political realities on the ground. You know, you have 498 00:27:29,760 --> 00:27:32,320 Speaker 1: to... but I think that's the value of 499 00:27:32,320 --> 00:27:35,520 Speaker 1: at least trying to make it clear that there are options. 500 00:27:45,600 --> 00:27:48,399 Speaker 1: I wanted to shift for a moment. I was 501 00:27:48,480 --> 00:27:51,640 Speaker 1: thinking recently about, I think, probably the earliest book 502 00:27:51,640 --> 00:27:55,440 Speaker 1: of yours that I've read, Pirate Cinema, which is heavily involved... 503 00:27:55,440 --> 00:27:58,320 Speaker 1: I think, you know, if you're one 504 00:27:58,320 --> 00:28:00,840 Speaker 1: of the folks like me who was on the Internet 505 00:28:00,840 --> 00:28:04,000 Speaker 1: back when, you know, file sharing sites, when that was 506 00:28:04,000 --> 00:28:06,360 Speaker 1: a huge topic of discussion, when the RIAA 507 00:28:06,560 --> 00:28:10,280 Speaker 1: was going after people, when, like, copyright was kind of 508 00:28:10,760 --> 00:28:14,240 Speaker 1: a much more prevalent part of the 509 00:28:14,280 --> 00:28:17,800 Speaker 1: online discourse. It deals a lot in that, and 510 00:28:17,840 --> 00:28:20,040 Speaker 1: these kind of... I think there's elements of it that 511 00:28:20,119 --> 00:28:23,960 Speaker 1: kind of prefigured what Disney has done, buying up every 512 00:28:24,000 --> 00:28:27,159 Speaker 1: imaginable fictional property in the world.
And that's kind of 513 00:28:27,160 --> 00:28:30,520 Speaker 1: the element of dystopia that book deals with: 514 00:28:31,240 --> 00:28:34,000 Speaker 1: you know, the attempts of these giant multinational 515 00:28:34,080 --> 00:28:38,040 Speaker 1: entertainment corporations to shut down the free tape trading of ideas, 516 00:28:38,080 --> 00:28:41,280 Speaker 1: remixing, and all that stuff. And then kind of thinking 517 00:28:41,320 --> 00:28:44,040 Speaker 1: about the difference between the focus of that and the 518 00:28:44,080 --> 00:28:46,880 Speaker 1: focus of books like Attack Surface, where you're really delving 519 00:28:46,880 --> 00:28:50,360 Speaker 1: more into, you know, the fictional versions of real-life 520 00:28:50,400 --> 00:28:54,040 Speaker 1: companies like TigerSwan that do, uh, 521 00:28:54,240 --> 00:28:57,840 Speaker 1: surveillance on protesters all around the world, and that 522 00:28:57,840 --> 00:29:00,680 Speaker 1: are kind of using tactics that were pioneered by 523 00:29:00,760 --> 00:29:05,480 Speaker 1: other contractors in, like, Iraq and Afghanistan years earlier. I 524 00:29:05,480 --> 00:29:07,480 Speaker 1: guess kind of the thing that I find interesting about that 525 00:29:07,520 --> 00:29:10,400 Speaker 1: is, I can remember when I was first on the Internet, 526 00:29:10,560 --> 00:29:14,280 Speaker 1: the big social kind of crusades online, with the people 527 00:29:14,920 --> 00:29:17,120 Speaker 1: that I paid attention to at least, were all 528 00:29:17,160 --> 00:29:19,920 Speaker 1: around copyright.
It was about not just, you know, the 529 00:29:19,960 --> 00:29:23,520 Speaker 1: attempts to stop people from remixing and sharing copyrighted work, 530 00:29:23,560 --> 00:29:28,239 Speaker 1: but about attempts to, like, buy up copyrights 531 00:29:28,400 --> 00:29:34,080 Speaker 1: into these ever larger agglomerations. 532 00:29:34,120 --> 00:29:37,000 Speaker 1: And that seems to have 533 00:29:37,080 --> 00:29:39,719 Speaker 1: hit like a terminal point with, you know, movies 534 00:29:39,840 --> 00:29:42,760 Speaker 1: like Ready Player One and kind of a lot of 535 00:29:42,760 --> 00:29:44,800 Speaker 1: the stuff we're seeing in Marvel, where everything is showing 536 00:29:44,880 --> 00:29:49,600 Speaker 1: up everywhere, Space Jam 2. I guess the part 537 00:29:49,640 --> 00:29:52,000 Speaker 1: of it that feels less dystopian these days is the attempts to 538 00:29:52,000 --> 00:29:54,960 Speaker 1: crack down on file sharing, which I don't think went 539 00:29:55,200 --> 00:29:57,800 Speaker 1: kind of in the worst-case-scenario direction. I'm interested, actually, 540 00:29:57,800 --> 00:30:00,320 Speaker 1: in your thoughts on that, because I can remember, 541 00:30:00,360 --> 00:30:02,880 Speaker 1: you know, when the RIAA would be threatening 542 00:30:02,920 --> 00:30:05,840 Speaker 1: people with years in jail and whatnot over sharing stuff 543 00:30:05,840 --> 00:30:08,480 Speaker 1: on Kazaa. We seem to be... I don't know. Is 544 00:30:08,480 --> 00:30:10,360 Speaker 1: it just that it gets less... like, I'm interested in 545 00:30:10,400 --> 00:30:11,680 Speaker 1: your thoughts on that. Is it just that 546 00:30:11,720 --> 00:30:15,000 Speaker 1: it's less publicized when they crack down on people, or 547 00:30:15,040 --> 00:30:17,160 Speaker 1: has kind of the nature of their response to that 548 00:30:17,200 --> 00:30:20,080 Speaker 1: really changed?
Well, I think that what's happened with the 549 00:30:20,800 --> 00:30:23,640 Speaker 1: kind of steady state of the copyright wars has been 550 00:30:23,640 --> 00:30:28,520 Speaker 1: the introduction of brittleness and fragility into our speech platforms 551 00:30:28,560 --> 00:30:32,760 Speaker 1: like Twitter and Facebook and YouTube, where it's 552 00:30:32,920 --> 00:30:36,560 Speaker 1: very easy to get material removed by making copyright 553 00:30:36,600 --> 00:30:39,520 Speaker 1: claims. And you know, we see that with the 554 00:30:39,560 --> 00:30:43,000 Speaker 1: sleazier side of the reputation management industry, where they use 555 00:30:43,080 --> 00:30:47,880 Speaker 1: bogus copyright claims to take down criticism. You know, 556 00:30:47,920 --> 00:30:51,000 Speaker 1: there was a group of leftists who were really celebrating 557 00:30:51,160 --> 00:30:54,040 Speaker 1: the idea that if Nazis were marching in 558 00:30:54,080 --> 00:30:57,040 Speaker 1: your town, you could stop them from uploading their videos 559 00:30:57,040 --> 00:30:59,800 Speaker 1: by playing copyrighted music in the background, and I was like, 560 00:31:00,320 --> 00:31:03,000 Speaker 1: you have no idea what a terrible fucking idea that is. 561 00:31:03,320 --> 00:31:05,960 Speaker 1: And you know, within a couple of years, cops in 562 00:31:06,000 --> 00:31:08,440 Speaker 1: Beverly Hills were doing it. Whenever people tried to film 563 00:31:08,440 --> 00:31:10,560 Speaker 1: the police there, they would just turn on some Taylor 564 00:31:10,600 --> 00:31:13,560 Speaker 1: Swift to try and stop the uploading. You know, the 565 00:31:13,920 --> 00:31:18,560 Speaker 1: thing about the copyright wars is that the real action 566 00:31:18,720 --> 00:31:24,320 Speaker 1: turned out to be in wage theft through monopolization.
So, 567 00:31:24,400 --> 00:31:29,960 Speaker 1: you know, the neutering and destruction of label-independent music 568 00:31:30,000 --> 00:31:34,320 Speaker 1: distribution platforms like Kazaa or Grokster or Napster, and the 569 00:31:34,360 --> 00:31:38,880 Speaker 1: Supreme Court decision, the Grokster decision, that supported that, meant 570 00:31:38,920 --> 00:31:41,640 Speaker 1: that the only way that you could launch a 571 00:31:41,680 --> 00:31:44,720 Speaker 1: service like that was in cooperation with the big labels, 572 00:31:45,320 --> 00:31:49,720 Speaker 1: and the, you know, most successful one is Spotify. Spotify 573 00:31:49,800 --> 00:31:53,440 Speaker 1: is actually partially owned by the labels, and the labels 574 00:31:53,520 --> 00:31:57,760 Speaker 1: used that ownership stake to negotiate a kind of formalized 575 00:31:57,800 --> 00:32:02,760 Speaker 1: wage theft, where they allowed for a lower per-stream 576 00:32:02,880 --> 00:32:06,040 Speaker 1: rate, because when they get royalties for a stream, part 577 00:32:06,040 --> 00:32:09,880 Speaker 1: of that money goes to their musicians. And that meant 578 00:32:09,920 --> 00:32:13,640 Speaker 1: that the firm, Spotify, retained more profits, which it returned 579 00:32:13,640 --> 00:32:16,640 Speaker 1: in the form of higher dividends, and dividends 580 00:32:16,640 --> 00:32:19,680 Speaker 1: go straight to the shareholders. There's 581 00:32:19,680 --> 00:32:22,640 Speaker 1: no claim that musicians can make on them.
And because 582 00:32:22,640 --> 00:32:26,480 Speaker 1: they set the benchmark rate, it meant that everyone, irrespective 583 00:32:26,520 --> 00:32:27,960 Speaker 1: of whether you were signed to one of the big 584 00:32:28,000 --> 00:32:33,080 Speaker 1: three labels, ended up getting the same per-stream rate 585 00:32:33,240 --> 00:32:36,920 Speaker 1: as Universal's artists, so they were able to structure 586 00:32:36,960 --> 00:32:40,120 Speaker 1: the whole market. In the meantime, on the industrial side, 587 00:32:40,640 --> 00:32:43,480 Speaker 1: copyright law, notably Section 1201 of the 588 00:32:43,480 --> 00:32:48,120 Speaker 1: Digital Millennium Copyright Act, which is a law that 589 00:32:48,240 --> 00:32:52,000 Speaker 1: makes it a felony to remove DRM, to bypass a 590 00:32:52,040 --> 00:32:55,640 Speaker 1: technical protection measure, has become the go-to 591 00:32:55,960 --> 00:33:01,920 Speaker 1: system for blocking repair and interoperability, and for preventing third 592 00:33:01,960 --> 00:33:06,720 Speaker 1: parties from creating services or add-ons 593 00:33:06,760 --> 00:33:11,960 Speaker 1: that accomplish positive ends like improved accessibility, improved security, 594 00:33:12,560 --> 00:33:16,080 Speaker 1: ad blocking and privacy and so on. They just say, well, 595 00:33:16,120 --> 00:33:18,880 Speaker 1: you know, we put a one-molecule-thick layer 596 00:33:18,880 --> 00:33:21,560 Speaker 1: of DRM around, say, YouTube, and when you make a 597 00:33:21,600 --> 00:33:26,200 Speaker 1: YouTube downloader for archival purposes or whatever, you 598 00:33:26,320 --> 00:33:30,680 Speaker 1: just...
You bypass our technical 599 00:33:30,720 --> 00:33:33,760 Speaker 1: protection measure, and so you're committing a felony and you 600 00:33:33,800 --> 00:33:35,840 Speaker 1: can go to prison for five years and pay 601 00:33:35,840 --> 00:33:38,880 Speaker 1: a five-hundred-thousand-dollar fine. And so you have this, 602 00:33:38,960 --> 00:33:43,440 Speaker 1: like, relentless monotonic expansion of DRM into, like, automobiles, tractors. 603 00:33:44,160 --> 00:33:48,360 Speaker 1: Medtronic uses it to block people from fixing ventilators. 604 00:33:48,520 --> 00:33:53,720 Speaker 1: So, you know, this assault on the ability 605 00:33:53,800 --> 00:33:58,440 Speaker 1: to reconfigure a technology that is ever more prevalent in 606 00:33:58,440 --> 00:34:01,640 Speaker 1: our lives, and that literally holds our lives 607 00:34:01,720 --> 00:34:04,760 Speaker 1: in its hands, right, its choices determine whether we live 608 00:34:04,840 --> 00:34:08,200 Speaker 1: or die, has been really consequential. And I know we 609 00:34:08,239 --> 00:34:10,520 Speaker 1: don't really think of it as a copyright problem. We 610 00:34:10,520 --> 00:34:12,120 Speaker 1: think of it as right to repair. We think of 611 00:34:12,239 --> 00:34:16,319 Speaker 1: it as security auditing or accessibility. But the rule that is 612 00:34:16,360 --> 00:34:19,680 Speaker 1: being used to block interoperability is a copyright law. 613 00:34:20,080 --> 00:34:22,799 Speaker 1: It's what printer companies use to stop you from buying 614 00:34:22,840 --> 00:34:26,120 Speaker 1: third-party ink. It's what Apple uses to stop 615 00:34:26,120 --> 00:34:28,480 Speaker 1: you from installing a third-party app store.
And you 616 00:34:28,520 --> 00:34:30,560 Speaker 1: know, the absence of a third-party app store is 617 00:34:30,600 --> 00:34:34,640 Speaker 1: why, when Apple removed all the working VPNs in China, 618 00:34:35,360 --> 00:34:38,120 Speaker 1: Chinese users couldn't just switch to another app store that 619 00:34:38,160 --> 00:34:41,319 Speaker 1: had working VPNs in it. And so, you know, 620 00:34:41,560 --> 00:34:45,000 Speaker 1: this endgame of the copyright wars is, I think, 621 00:34:45,000 --> 00:34:50,840 Speaker 1: a lot more dystopian than merely suing college kids. 622 00:34:51,080 --> 00:34:54,640 Speaker 1: It's actually really screwed us in ways that 623 00:34:54,680 --> 00:34:59,080 Speaker 1: are hard to fathom. Yeah, it's a fascinating example 624 00:34:59,080 --> 00:35:02,040 Speaker 1: of kind of dystopia creep, because, at least kind of 625 00:35:02,040 --> 00:35:05,200 Speaker 1: from my more ignorant position, when I was nineteen, 626 00:35:05,239 --> 00:35:07,160 Speaker 1: I was, like, worried that all of these people 627 00:35:07,200 --> 00:35:09,960 Speaker 1: remixing music and movies that I liked were going 628 00:35:10,000 --> 00:35:12,320 Speaker 1: to get cracked down on or have their stuff pulled. 629 00:35:13,040 --> 00:35:16,520 Speaker 1: And the kind of thing that I didn't... that I 630 00:35:16,560 --> 00:35:19,279 Speaker 1: don't think a lot of people saw coming until it hit...
631 00:35:19,360 --> 00:35:21,520 Speaker 1: I certainly didn't, was what you were just talking about: 632 00:35:21,560 --> 00:35:24,640 Speaker 1: the fact that kind of the logic of how these 633 00:35:24,640 --> 00:35:27,759 Speaker 1: entertainment companies were looking at, like, an album or, 634 00:35:27,800 --> 00:35:29,799 Speaker 1: you know, a movie, and cutting up pieces of 635 00:35:29,800 --> 00:35:32,759 Speaker 1: that, they've applied to, like, a tractor, you know. 636 00:35:32,880 --> 00:35:35,040 Speaker 1: And now you can't, like, repair your John Deere or 637 00:35:35,080 --> 00:35:38,400 Speaker 1: modify your John Deere so it works better. And then, 638 00:35:38,600 --> 00:35:40,680 Speaker 1: you know, you get situations like we just kind 639 00:35:40,719 --> 00:35:43,279 Speaker 1: of averted with the John Deere strike, where there was 640 00:35:43,320 --> 00:35:45,400 Speaker 1: a very real possibility that we wouldn't be able to 641 00:35:45,440 --> 00:35:47,360 Speaker 1: get a large chunk of a harvest because there wouldn't 642 00:35:47,400 --> 00:35:48,880 Speaker 1: be parts and you can't put your own in. And 643 00:35:48,920 --> 00:35:51,719 Speaker 1: to think that the thought process that led 644 00:35:51,800 --> 00:35:55,840 Speaker 1: us there started with, like, trying to protect Metallica. In 645 00:35:56,000 --> 00:35:58,600 Speaker 1: some ways, it's kind of funny. And this is why 646 00:35:58,600 --> 00:36:02,080 Speaker 1: the anti-monopoly critique is great, because it shows you 647 00:36:02,120 --> 00:36:05,280 Speaker 1: that there's cause for solidarity between John Deere tractor owners 648 00:36:05,320 --> 00:36:10,040 Speaker 1: and John Deere tractor makers, the workers who work there.
649 00:36:10,080 --> 00:36:13,239 Speaker 1: Because the same force that has allowed John Deere to 650 00:36:13,400 --> 00:36:17,719 Speaker 1: cram down its workforce for forty years is 651 00:36:17,760 --> 00:36:21,440 Speaker 1: the force that allows it to take away 652 00:36:21,480 --> 00:36:25,560 Speaker 1: the agency and economic liberties of farmers who own John 653 00:36:25,600 --> 00:36:29,120 Speaker 1: Deere tractors. And it's the political power 654 00:36:29,160 --> 00:36:31,719 Speaker 1: that comes with monopoly. And so, you know, if John 655 00:36:31,719 --> 00:36:35,640 Speaker 1: Deere were a smaller, weaker firm, it would be less 656 00:36:35,680 --> 00:36:39,719 Speaker 1: able to resist both the claims of its workforce and 657 00:36:39,800 --> 00:36:45,040 Speaker 1: the claims of its customers. Mhm, yeah, I 658 00:36:45,040 --> 00:36:47,120 Speaker 1: mean, that makes sense, and I 659 00:36:47,160 --> 00:36:49,960 Speaker 1: like that idea, because it's not just 660 00:36:50,040 --> 00:36:53,920 Speaker 1: kind of solidarity between John Deere purchasers and 661 00:36:54,320 --> 00:36:57,120 Speaker 1: the people who work in the factories. There's also 662 00:36:57,160 --> 00:37:01,279 Speaker 1: kind of solidarity between a wide... like, anyone concerned with 663 00:37:01,680 --> 00:37:06,440 Speaker 1: copyright. It's a much broader base of solidarity than 664 00:37:06,480 --> 00:37:08,760 Speaker 1: just people who are worried about, you know, what's happening 665 00:37:09,040 --> 00:37:11,520 Speaker 1: to fiction, or, like, what Disney is doing to 666 00:37:11,680 --> 00:37:14,480 Speaker 1: copyrights around Mickey Mouse or whatever.
667 00:37:14,680 --> 00:37:18,040 Speaker 1: You can draw in concerns from right to 668 00:37:18,040 --> 00:37:20,440 Speaker 1: repair to a bunch of other things, which potentially means 669 00:37:21,120 --> 00:37:24,720 Speaker 1: there's a greater body of people available for action 670 00:37:24,760 --> 00:37:28,080 Speaker 1: if you can make them see kind of, um, converging 671 00:37:28,120 --> 00:37:31,120 Speaker 1: interests there, which I think is an interesting idea. Well, 672 00:37:31,160 --> 00:37:33,640 Speaker 1: I think you're getting at something really important, and this, um, 673 00:37:33,960 --> 00:37:36,600 Speaker 1: this comes from James Boyle, who's a copyright scholar at 674 00:37:36,640 --> 00:37:39,800 Speaker 1: Duke University and was really involved in founding Creative 675 00:37:39,800 --> 00:37:43,240 Speaker 1: Commons and in those early copyright fights. And Jamie 676 00:37:43,360 --> 00:37:46,400 Speaker 1: makes an analogy to the coining of the term ecology. 677 00:37:47,080 --> 00:37:49,840 Speaker 1: He says that before the term ecology came along, you know, 678 00:37:49,880 --> 00:37:51,919 Speaker 1: some of us cared about owls and some of us cared about 679 00:37:51,920 --> 00:37:54,960 Speaker 1: the ozone layer, but it wasn't really clear that we 680 00:37:54,960 --> 00:37:57,279 Speaker 1: were on the same side. You know, it's not clear. 681 00:37:57,320 --> 00:37:59,480 Speaker 1: If you're a Martian looking through a telescope, you might be 682 00:37:59,520 --> 00:38:02,200 Speaker 1: hard pressed to explain why, you know, the destiny of 683 00:38:02,280 --> 00:38:06,520 Speaker 1: charismatic nocturnal birds and the gaseous composition of the 684 00:38:06,560 --> 00:38:10,279 Speaker 1: upper atmosphere were the same issue, right? And the term ecology 685 00:38:10,520 --> 00:38:13,640 Speaker 1: let all these people who cared about different things find 686 00:38:13,680 --> 00:38:16,520 Speaker 1: a single point to rally around. It turned a thousand 687 00:38:16,520 --> 00:38:19,600 Speaker 1: issues into one movement. And I think that in the 688 00:38:19,760 --> 00:38:22,279 Speaker 1: course of resisting corporate power, which is to say, 689 00:38:22,280 --> 00:38:26,040 Speaker 1: resisting monopoly, we have the potential to weld together people 690 00:38:26,080 --> 00:38:29,920 Speaker 1: from very diverse fields. You know, farmers and people who 691 00:38:29,920 --> 00:38:32,520 Speaker 1: make tractors, sure. But, you know, if you grew up 692 00:38:33,160 --> 00:38:37,120 Speaker 1: watching professional wrestling and now you're aghast that the wrestlers 693 00:38:37,160 --> 00:38:39,640 Speaker 1: that you loved are begging on GoFundMe for 694 00:38:39,800 --> 00:38:43,240 Speaker 1: pennies to die with dignity, you know, once someone explains 695 00:38:43,280 --> 00:38:45,880 Speaker 1: to you the reason that that's happening is that thirty wrestling 696 00:38:45,920 --> 00:38:48,840 Speaker 1: leagues became one wrestling league that was able to practice 697 00:38:48,840 --> 00:38:53,040 Speaker 1: worker misclassification, turn those performers into contractors, take away their 698 00:38:53,040 --> 00:38:56,000 Speaker 1: health insurance, and leave them to die, then suddenly you're 699 00:38:56,040 --> 00:38:57,560 Speaker 1: on the same side as the people who were worried 700 00:38:57,560 --> 00:39:00,279 Speaker 1: about big tech and big tractor, and the people who are 701 00:39:00,280 --> 00:39:02,360 Speaker 1: worried about the fact that there's only one manufacturer of 702 00:39:02,440 --> 00:39:06,799 Speaker 1: cheerleading uniforms, and two manufacturers of athletic shoes, and 703 00:39:07,400 --> 00:39:10,719 Speaker 1: two manufacturers of spirits, and two manufacturers of beer.
One 704 00:39:10,760 --> 00:39:14,400 Speaker 1: manufacturer of eyewear that also owns all the eyewear 705 00:39:14,600 --> 00:39:17,720 Speaker 1: stores and the eyewear insurer. You know that Duff Beer 706 00:39:17,760 --> 00:39:20,359 Speaker 1: thing from the early Simpsons, where there's, like, Duff Beer, 707 00:39:20,480 --> 00:39:28,360 Speaker 1: Duff Lite, Duff Dry, that same thing: Dolce and Gabbana, Oliver Peoples, Bausch and Lomb, Versace. Every 708 00:39:28,400 --> 00:39:31,760 Speaker 1: eyewear brand you've ever heard of is one company. Coach, 709 00:39:32,000 --> 00:39:36,399 Speaker 1: all of them. And they also own Sunglass Hut and, 710 00:39:37,000 --> 00:39:41,640 Speaker 1: uh, Target Optical and Sears Optical, and LensCrafters 711 00:39:41,880 --> 00:39:44,719 Speaker 1: and Specsavers and every other eyewear store you've ever heard of. 712 00:39:45,040 --> 00:39:47,880 Speaker 1: And they bought all the labs that make the lenses, 713 00:39:47,920 --> 00:39:49,680 Speaker 1: so more than half the lenses in the world come 714 00:39:49,719 --> 00:39:53,959 Speaker 1: from them. And they bought EyeMed, which 715 00:39:54,000 --> 00:39:56,360 Speaker 1: is the company that bought all the insurance companies that 716 00:39:56,480 --> 00:39:59,319 Speaker 1: insure eyewear, and so they're also the company that's 717 00:39:59,400 --> 00:40:03,960 Speaker 1: insuring glasses for your eyes. One company. And eyewear 718 00:40:04,040 --> 00:40:06,640 Speaker 1: costs a thousand percent more than it did a decade ago. 719 00:40:06,760 --> 00:40:10,200 Speaker 1: They stole our fucking eyes.
Right. So people who care 720 00:40:10,239 --> 00:40:13,160 Speaker 1: about that have common cause with people who care about 721 00:40:13,160 --> 00:40:16,120 Speaker 1: wrestlers, and people who care about beer, and big tech, 722 00:40:16,640 --> 00:40:19,919 Speaker 1: and the fact that there's four shipping companies and they 723 00:40:19,920 --> 00:40:22,040 Speaker 1: have no competitive pressure and so they just keep building 724 00:40:22,080 --> 00:40:25,080 Speaker 1: bigger ships that get stuck in the fucking Suez Canal. Right, 725 00:40:25,120 --> 00:40:30,560 Speaker 1: we're all on the same side. Yeah. And I 726 00:40:30,600 --> 00:40:34,400 Speaker 1: like the idea, I like hoping, that 727 00:40:34,400 --> 00:40:37,120 Speaker 1: that kind of inherent solidarity, if you can point it 728 00:40:37,120 --> 00:40:40,480 Speaker 1: out to people, is potentially an antidote to, or at 729 00:40:40,520 --> 00:40:42,520 Speaker 1: least a partial antidote to, the layer 730 00:40:42,560 --> 00:40:46,160 Speaker 1: of politicization that's fallen down over everything, um, that stops 731 00:40:46,160 --> 00:40:49,120 Speaker 1: people from actually considering matters, but instead considering, like, I 732 00:40:49,120 --> 00:40:51,520 Speaker 1: don't know, is this owning the libs? Right?
Like, if 733 00:40:51,520 --> 00:40:53,640 Speaker 1: you can get them to 734 00:40:53,719 --> 00:40:57,320 Speaker 1: see that, like, yeah, their favorite wrestler is, like, dying 735 00:40:57,400 --> 00:40:59,840 Speaker 1: because he couldn't afford insulin, and that that's 736 00:41:00,000 --> 00:41:02,319 Speaker 1: tied to the issue of, like, the reason his dad 737 00:41:02,360 --> 00:41:05,000 Speaker 1: can't get tractor parts this year or whatever, um, and 738 00:41:05,040 --> 00:41:07,960 Speaker 1: that that's tied to other issues that are maybe championed 739 00:41:07,960 --> 00:41:11,440 Speaker 1: by people he would reflexively dismiss. But, like, yeah, I 740 00:41:11,600 --> 00:41:15,160 Speaker 1: find that really inspiring. There's still a significant 741 00:41:15,520 --> 00:41:19,719 Speaker 1: challenge for people who are trying to 742 00:41:19,880 --> 00:41:22,399 Speaker 1: make those connections, for folks who are trying 743 00:41:22,440 --> 00:41:25,960 Speaker 1: to, like, inform them of that state of affairs. I mean, yeah, 744 00:41:26,000 --> 00:41:28,839 Speaker 1: that's true. And, you know, like, Steve Bannon will tell 745 00:41:28,880 --> 00:41:31,720 Speaker 1: you that the reason to do culture 746 00:41:31,760 --> 00:41:35,600 Speaker 1: war bullshit is because politics is downstream from culture, 747 00:41:35,640 --> 00:41:37,759 Speaker 1: and there's probably an element of truth to that. But 748 00:41:37,800 --> 00:41:39,920 Speaker 1: I also think the reason that people find culture war 749 00:41:40,000 --> 00:41:44,040 Speaker 1: bullshit so attractive is because they got nothing else. Yeah, 750 00:41:45,160 --> 00:41:46,799 Speaker 1: I think we talked about that a lot within 751 00:41:46,840 --> 00:41:50,360 Speaker 1: the context of conservative politics.
I grew up very conservative, 752 00:41:50,360 --> 00:41:53,080 Speaker 1: and I do remember how the tenor of things I 753 00:41:53,120 --> 00:41:56,600 Speaker 1: was hearing through the Bush years changed from advocacy of 754 00:41:56,640 --> 00:42:00,480 Speaker 1: policies to just all culture war, all the time, all 755 00:42:00,560 --> 00:42:02,759 Speaker 1: sticking it to the Dems all the time. And it was 756 00:42:02,800 --> 00:42:06,640 Speaker 1: the kind of, um... And that's not the only 757 00:42:06,719 --> 00:42:08,400 Speaker 1: place that's happened. You see it on the left too, 758 00:42:08,680 --> 00:42:12,040 Speaker 1: absolutely. Like, it's endemic now. It's a poison 759 00:42:12,080 --> 00:42:15,480 Speaker 1: in kind of the discourse. But I think that there's 760 00:42:15,640 --> 00:42:17,799 Speaker 1: I think there's a 761 00:42:17,800 --> 00:42:20,280 Speaker 1: lot to be discovered still for, like, how to break 762 00:42:20,480 --> 00:42:32,719 Speaker 1: people out of that. I'm kind of bullish, when we 763 00:42:32,760 --> 00:42:34,600 Speaker 1: talk about these issues like you were bringing up, with 764 00:42:34,640 --> 00:42:37,160 Speaker 1: sort of the monopolization of these industries you wouldn't expect 765 00:42:37,160 --> 00:42:41,240 Speaker 1: to be monopolized. I'm hopeful about the future that stuff 766 00:42:41,280 --> 00:42:43,359 Speaker 1: like three D printing presents for that. We have an 767 00:42:43,440 --> 00:42:47,320 Speaker 1: organization in Portland that does kind of three D printed glasses 768 00:42:47,360 --> 00:42:50,000 Speaker 1: frames and stuff and is helping people with that sort 769 00:42:50,000 --> 00:42:54,080 Speaker 1: of stuff. And I'm in conversations with, like, the Four 770 00:42:54,120 --> 00:42:58,840 Speaker 1: Thieves Vinegar Collective, I think it's called, um. Yeah.
Some 771 00:42:58,880 --> 00:43:00,799 Speaker 1: of the folks working on 772 00:43:00,800 --> 00:43:05,640 Speaker 1: pharmaceutical hacking, making at the moment, like, lower cost, uh, 773 00:43:05,920 --> 00:43:09,879 Speaker 1: kind of home scratch-brewed versions of, like, different AIDS medications, 774 00:43:09,920 --> 00:43:13,440 Speaker 1: and the holy grail is doing that with, um, insulin, 775 00:43:13,520 --> 00:43:17,680 Speaker 1: effectively, um. And I do 776 00:43:17,760 --> 00:43:20,000 Speaker 1: think one of the things that's exciting about that is, 777 00:43:20,040 --> 00:43:22,680 Speaker 1: because of the way in which collaboration 778 00:43:22,760 --> 00:43:24,520 Speaker 1: on three D printing works, and the way in which 779 00:43:24,560 --> 00:43:27,400 Speaker 1: actually spreading, like, the ability to do stuff works, I 780 00:43:27,400 --> 00:43:31,920 Speaker 1: think it synergizes nicely with the ability of people to 781 00:43:32,000 --> 00:43:34,439 Speaker 1: kind of reach other folks through writing or other forms 782 00:43:34,480 --> 00:43:36,399 Speaker 1: of content, because they can both spread through the same channels. 783 00:43:36,440 --> 00:43:38,520 Speaker 1: You can have a video or a story, and you 784 00:43:38,560 --> 00:43:41,239 Speaker 1: can have, like, kind of embedded guides on how to 785 00:43:41,280 --> 00:43:44,520 Speaker 1: do that. Um, I don't know that I've 786 00:43:44,560 --> 00:43:47,480 Speaker 1: run into a lot of your writing on kind 787 00:43:47,480 --> 00:43:49,520 Speaker 1: of the potential of three D printing in this space, 788 00:43:49,560 --> 00:43:52,719 Speaker 1: but I'm interested, like, what do you think?
Are 789 00:43:52,760 --> 00:43:54,880 Speaker 1: you looking at that as kind of an area of 790 00:43:54,920 --> 00:43:57,360 Speaker 1: hope, or do you see that still as kind of 791 00:43:57,360 --> 00:44:00,520 Speaker 1: too niche and labor focused to really actually take 792 00:44:00,560 --> 00:44:02,200 Speaker 1: off in the way that it would need to to 793 00:44:02,320 --> 00:44:04,960 Speaker 1: crack some of these nuts? This is where I do 794 00:44:05,080 --> 00:44:08,080 Speaker 1: my Woody Allen "you know nothing of my work" 795 00:44:08,440 --> 00:44:11,760 Speaker 1: shtick, because I had this novel, Makers, in two thousand nine. 796 00:44:14,480 --> 00:44:18,240 Speaker 1: It's why, uh, Bre Pettis went out and founded 797 00:44:18,239 --> 00:44:21,880 Speaker 1: MakerBot, uh, and it's, you know, credited with, like, 798 00:44:22,000 --> 00:44:26,080 Speaker 1: kickstarting the homebrew three D printing revolution, blah blah blah blah 799 00:44:26,080 --> 00:44:29,080 Speaker 1: blah. And, um, it was a very bullish novel 800 00:44:29,120 --> 00:44:32,680 Speaker 1: about three D printing. Um, you know, the reality 801 00:44:32,719 --> 00:44:34,680 Speaker 1: hasn't lived up to the hype yet. It may just 802 00:44:34,719 --> 00:44:36,799 Speaker 1: be that we're in the long trough of despair, as 803 00:44:36,880 --> 00:44:40,840 Speaker 1: the Gartner hype cycle model has it. Uh, but, you know, 804 00:44:40,880 --> 00:44:44,600 Speaker 1: I think the problem with, um, three D printing was 805 00:44:44,680 --> 00:44:47,560 Speaker 1: that the patents had been concentrated into the hands of 806 00:44:47,600 --> 00:44:51,280 Speaker 1: two large firms that had bought all their competitors, including 807 00:44:51,280 --> 00:44:55,200 Speaker 1: MakerBot. And, um, when those patents finally expired, the 808 00:44:55,239 --> 00:44:59,280 Speaker 1: big one was the laser sintering of powder patent.
809 00:45:00,000 --> 00:45:01,919 Speaker 1: It just wasn't a big bang. And I think it's 810 00:45:01,960 --> 00:45:05,239 Speaker 1: because the supply chain for it still had a lot 811 00:45:05,239 --> 00:45:09,840 Speaker 1: of proprietary elements, and so producing the powder and producing 812 00:45:09,880 --> 00:45:15,239 Speaker 1: the components that allowed for that powder printing remained a 813 00:45:15,320 --> 00:45:18,360 Speaker 1: very high bar, and so we just didn't see the 814 00:45:18,440 --> 00:45:20,959 Speaker 1: kind of new industry emerge that we would have hoped for. 815 00:45:21,360 --> 00:45:24,160 Speaker 1: And, you know, it's like seven years since those patents expired, 816 00:45:24,320 --> 00:45:26,719 Speaker 1: five years since those patents expired. Now we're seeing a 817 00:45:26,760 --> 00:45:29,359 Speaker 1: few more of those powder printers. You get a lot 818 00:45:29,400 --> 00:45:33,560 Speaker 1: more, like, UV-cured epoxy printers, because those came off 819 00:45:33,600 --> 00:45:37,680 Speaker 1: patent earlier and they have a less complicated supply chain. Um, 820 00:45:37,760 --> 00:45:40,720 Speaker 1: but still, I mean, mostly when we talk about printers, 821 00:45:40,719 --> 00:45:44,279 Speaker 1: we're talking about filament, and filament's just not a 822 00:45:44,360 --> 00:45:47,520 Speaker 1: great technology. It's been pushed in ways that you wouldn't 823 00:45:47,560 --> 00:45:50,520 Speaker 1: even believe, and people have figured out how to do 824 00:45:50,719 --> 00:45:56,239 Speaker 1: absolutely incredible things with it. But it's not 825 00:45:56,400 --> 00:45:59,200 Speaker 1: something that you would make aerospace components from, you know. 826 00:45:59,400 --> 00:46:03,640 Speaker 1: It's something that you make, um, novelty Dungeons and 827 00:46:03,719 --> 00:46:07,520 Speaker 1: Dragons dice out of. Yeah, which is an important industry to disrupt.
828 00:46:07,560 --> 00:46:10,680 Speaker 1: Don't get me wrong. But I'm with you. I can 829 00:46:10,719 --> 00:46:12,799 Speaker 1: remember paying thirty bucks for a set of dice as 830 00:46:12,800 --> 00:46:16,680 Speaker 1: a kid and thinking, somebody's gotta fix this scam. I 831 00:46:16,680 --> 00:46:19,440 Speaker 1: can produce something for Christmas, Robert. Thank you, Garrison. And, 832 00:46:19,480 --> 00:46:21,640 Speaker 1: you know, now I own a... I bought, at a 833 00:46:21,640 --> 00:46:23,400 Speaker 1: comic con a couple of years ago, a 834 00:46:23,400 --> 00:46:27,080 Speaker 1: tiny little D twenty made out of meteoric ore. I 835 00:46:27,080 --> 00:46:29,920 Speaker 1: have a sky metal D twenty. Oh, now, yeah, 836 00:46:30,080 --> 00:46:34,040 Speaker 1: that's classy. Um, I'm curious, we've got a little 837 00:46:34,040 --> 00:46:35,799 Speaker 1: bit of time left and I wanted to ask about 838 00:46:35,840 --> 00:46:38,400 Speaker 1: your novel Attack Surface. I know it was released 839 00:46:39,160 --> 00:46:43,560 Speaker 1: last October, if I'm not mistaken, um. And obviously a 840 00:46:43,600 --> 00:46:45,640 Speaker 1: lot of that deals with, again, these kind of, like, 841 00:46:47,120 --> 00:46:49,600 Speaker 1: corporations that have been contractors for the DOD, 842 00:46:49,719 --> 00:46:53,440 Speaker 1: doing, like, fucked up surveillance shit in Iraq and Afghanistan, 843 00:46:53,800 --> 00:46:56,880 Speaker 1: bringing that technology to crack down on, like, US, uh, 844 00:46:57,000 --> 00:47:00,560 Speaker 1: sort of dissident left wing political movements. It comes out the 845 00:47:00,640 --> 00:47:05,080 Speaker 1: year that we have a nationwide kind of uprising.
Um, 846 00:47:05,120 --> 00:47:07,719 Speaker 1: that a lot of fucked up surveillance shit that had 847 00:47:07,719 --> 00:47:10,400 Speaker 1: been kind of demoed stateside around, like, Standing 848 00:47:10,480 --> 00:47:12,879 Speaker 1: Rock and whatnot, really comes into its own. 849 00:47:13,320 --> 00:47:17,879 Speaker 1: How much of that was written before shit went down? 850 00:47:18,000 --> 00:47:20,680 Speaker 1: And I'm assuming, like, I don't know exactly 851 00:47:20,680 --> 00:47:23,040 Speaker 1: how your process works, but I'm wondering, like, I assume 852 00:47:23,080 --> 00:47:25,960 Speaker 1: you started the project before everything went the way it 853 00:47:26,000 --> 00:47:28,840 Speaker 1: did last summer. How much did kind of what happened 854 00:47:28,880 --> 00:47:33,319 Speaker 1: last summer affect the way you imagined that technology and 855 00:47:33,360 --> 00:47:36,800 Speaker 1: those tactics functioning in that book? Yeah, the timeline 856 00:47:36,840 --> 00:47:40,960 Speaker 1: goes the other direction.
I wrote that book before the 857 00:47:41,560 --> 00:47:45,560 Speaker 1: summer uprising, um, long, long, long before that. And 858 00:47:45,600 --> 00:47:49,000 Speaker 1: I wrote it about things like, um, the surveillance technology 859 00:47:49,040 --> 00:47:52,880 Speaker 1: we saw in Belarus and Kiev, and also at Occupy 860 00:47:53,000 --> 00:47:58,040 Speaker 1: and Standing Rock, and at other Black Lives Matter demonstrations 861 00:47:58,040 --> 00:48:02,000 Speaker 1: and uprisings in America as well. And, you know, 862 00:48:02,160 --> 00:48:08,880 Speaker 1: also the monotonic expansion of surveillance leaks, right, where, you know, 863 00:48:08,960 --> 00:48:11,960 Speaker 1: first we learned about IMSI catchers, and then we learned 864 00:48:11,960 --> 00:48:15,040 Speaker 1: about dirt boxes, which are IMSI catchers on airplanes. And, 865 00:48:15,440 --> 00:48:17,960 Speaker 1: you know, like, all of that stuff leaked 866 00:48:18,000 --> 00:48:21,279 Speaker 1: like crazy, because, you know, these surveillance giants are 867 00:48:21,360 --> 00:48:25,400 Speaker 1: not good at what they do, right? Which isn't a 868 00:48:25,400 --> 00:48:28,759 Speaker 1: reason we should be hopeful. A company that's bad at 869 00:48:28,760 --> 00:48:32,080 Speaker 1: what it does is in some ways even worse, 870 00:48:32,120 --> 00:48:35,680 Speaker 1: because one of the ways that their incompetence expresses itself 871 00:48:36,239 --> 00:48:38,080 Speaker 1: is that they often gather a bunch of data on 872 00:48:38,120 --> 00:48:42,600 Speaker 1: innocent people and then leak it, right? Not maliciously, just 873 00:48:42,719 --> 00:48:48,680 Speaker 1: through incompetence. Um.
And so, you know, this expansion 874 00:48:48,680 --> 00:48:51,000 Speaker 1: of surveillance has, like, been on my mind for a 875 00:48:51,040 --> 00:48:53,480 Speaker 1: long time, and I've been writing about it, well, at 876 00:48:53,520 --> 00:48:55,759 Speaker 1: least since Little Brother, right? So two thousand and six 877 00:48:55,800 --> 00:48:58,520 Speaker 1: I wrote that novel, and I've had my finger in 878 00:48:58,560 --> 00:49:00,840 Speaker 1: that... Yeah, so I've had my finger in that for 879 00:49:00,880 --> 00:49:04,120 Speaker 1: all that time, and working with the EFF, 880 00:49:04,160 --> 00:49:09,320 Speaker 1: it's impossible to miss. Sure. Was there a degree to which, um, 881 00:49:09,400 --> 00:49:11,280 Speaker 1: I don't know, I guess, were you surprised by anything 882 00:49:11,280 --> 00:49:12,920 Speaker 1: that happened last summer? Or did it just kind of 883 00:49:12,960 --> 00:49:16,800 Speaker 1: comprehensively feel like, this is everything slotting into place, that 884 00:49:16,840 --> 00:49:18,880 Speaker 1: I knew it was heading in this direction? Because, yeah, I mean, 885 00:49:18,880 --> 00:49:22,680 Speaker 1: you're right, I did... like, there was, like, everything was 886 00:49:22,760 --> 00:49:28,160 Speaker 1: kind of presaged, um, years before. Yeah, I'm wondering if 887 00:49:28,320 --> 00:49:31,080 Speaker 1: there was anything that kind of surprised you, um, 888 00:49:31,160 --> 00:49:33,080 Speaker 1: or was it all just sort of what 889 00:49:33,120 --> 00:49:36,600 Speaker 1: you'd been braced for? Yeah, I don't feel like there 890 00:49:36,600 --> 00:49:39,920 Speaker 1: were any kind of surveillance surprises. I mean, the 891 00:49:40,120 --> 00:49:44,759 Speaker 1: use of reverse warrants, I think, we all kind 892 00:49:44,800 --> 00:49:47,680 Speaker 1: of assumed was going on. There had been hints of 893 00:49:47,719 --> 00:49:51,600 Speaker 1: it in Google's warrant canaries beforehand.
But, you know, those 894 00:49:51,680 --> 00:49:54,759 Speaker 1: geofence warrants, which, again, if you're, like, sitting there going, oh, 895 00:49:54,840 --> 00:49:59,440 Speaker 1: geofence warrants are awesome because they're catching the one six rioters: like, dude, 896 00:49:59,520 --> 00:50:02,920 Speaker 1: you are going to be so disappointed. Holy shit, 897 00:50:03,560 --> 00:50:07,080 Speaker 1: that's not where they're going to keep using this. Yeah. Um, 898 00:50:07,520 --> 00:50:10,040 Speaker 1: so, you know, learning more about those reverse warrants I 899 00:50:10,080 --> 00:50:13,600 Speaker 1: think was interesting. Um, but I don't feel like, 900 00:50:14,000 --> 00:50:15,640 Speaker 1: I don't, well, off the top of my head, I 901 00:50:15,680 --> 00:50:18,960 Speaker 1: can't say that there was any new technical stuff that emerged, 902 00:50:19,040 --> 00:50:21,640 Speaker 1: you know. I, um, Kickstarted the audiobook 903 00:50:21,719 --> 00:50:25,799 Speaker 1: for Attack Surface, uh, and I offered, as, like, the 904 00:50:25,840 --> 00:50:28,560 Speaker 1: top tier, you could commission short stories in the Little 905 00:50:28,560 --> 00:50:31,319 Speaker 1: Brother universe, and there were three of those, and I 906 00:50:31,360 --> 00:50:33,480 Speaker 1: just finished the first of them, and it's about, um, 907 00:50:33,920 --> 00:50:38,640 Speaker 1: future pipeline protests. And, uh, you know, I spent a 908 00:50:38,680 --> 00:50:42,000 Speaker 1: lot of time in my research looking at the surveillance 909 00:50:42,040 --> 00:50:44,520 Speaker 1: that was done on the pipeline protests, and a lot 910 00:50:44,520 --> 00:50:47,680 Speaker 1: of it was provocateurs and undercovers who were just terrible 911 00:50:47,719 --> 00:50:52,080 Speaker 1: at their jobs, right? Like The Intercept's long publication of, 912 00:50:52,080 --> 00:50:56,160 Speaker 1: uh, you know, long documents about how those operators worked.
913 00:50:56,280 --> 00:50:59,160 Speaker 1: Who just, like, showed up in military haircuts and combat 914 00:50:59,200 --> 00:51:02,120 Speaker 1: boots and were like, hey, I'm from Portland and 915 00:51:02,160 --> 00:51:04,720 Speaker 1: I'm here because we're gonna fuck up some bad guys. 916 00:51:04,800 --> 00:51:07,480 Speaker 1: Let's go do it. Let's go do violence and save 917 00:51:07,520 --> 00:51:10,480 Speaker 1: Indian country. And, like, everyone was like... you know, does 918 00:51:10,520 --> 00:51:14,839 Speaker 1: anyone want to buy drugs? And the actual protesters 919 00:51:14,840 --> 00:51:19,319 Speaker 1: were like, you're a provocateur, like, go away. You know, 920 00:51:19,400 --> 00:51:21,520 Speaker 1: like, they could tell. I mean, I guess, you know, 921 00:51:21,560 --> 00:51:23,360 Speaker 1: they were a lot more effective in the UK at 922 00:51:23,640 --> 00:51:27,919 Speaker 1: infiltrating the climate movement. You know, they impregnated several protesters, 923 00:51:28,000 --> 00:51:30,960 Speaker 1: you know, and had long term relationships with them 924 00:51:30,960 --> 00:51:34,720 Speaker 1: and raised kids with them. So there are those horror stories. 925 00:51:35,239 --> 00:51:37,879 Speaker 1: Yeah. Here, we just didn't 926 00:51:37,920 --> 00:51:41,919 Speaker 1: see that incredible efficacy.
Yeah. And I do think that 927 00:51:41,920 --> 00:51:44,520 Speaker 1: that's, I think, kind of the message I took out 928 00:51:44,520 --> 00:51:46,920 Speaker 1: of it, because I started reporting on, 929 00:51:46,920 --> 00:51:49,440 Speaker 1: like, dirt boxes back during Standing Rock, just having them, 930 00:51:49,440 --> 00:51:51,759 Speaker 1: like, explained to me by people who were on 931 00:51:51,800 --> 00:51:54,279 Speaker 1: the ground when I showed up, that, like, yeah, there's 932 00:51:54,280 --> 00:51:56,879 Speaker 1: this... your, like, phones don't work the same out here, 933 00:51:56,920 --> 00:51:58,520 Speaker 1: and, like, we're trying to figure out what's going on, 934 00:51:58,560 --> 00:52:00,640 Speaker 1: but, like, everything is... it's not just that 935 00:52:00,680 --> 00:52:03,120 Speaker 1: we're out in the sticks or anything. And I think 936 00:52:03,160 --> 00:52:05,600 Speaker 1: the only surprise, the big surprise for me last year, 937 00:52:05,640 --> 00:52:10,160 Speaker 1: was how little the technology accomplished for 938 00:52:10,239 --> 00:52:12,719 Speaker 1: them and how much it just wound 939 00:52:12,719 --> 00:52:15,000 Speaker 1: back down to violence. That was kind of 940 00:52:15,040 --> 00:52:17,680 Speaker 1: the... for all of the toys they had, the tools 941 00:52:17,719 --> 00:52:20,080 Speaker 1: that actually made the most difference were gassing and beating 942 00:52:20,080 --> 00:52:24,319 Speaker 1: people, and violence, and, like, old fashioned informants. That was 943 00:52:24,440 --> 00:52:26,879 Speaker 1: the stuff, and just having a dude there. Yeah, 944 00:52:27,280 --> 00:52:29,920 Speaker 1: that's what they really relied on.
And the fact that 945 00:52:30,080 --> 00:52:32,960 Speaker 1: you, Cory, weren't super surprised by anything last year, 946 00:52:33,000 --> 00:52:35,719 Speaker 1: I think, kind of just more shows kind of the 947 00:52:35,760 --> 00:52:38,520 Speaker 1: strength of your work, in terms of how you're very 948 00:52:38,560 --> 00:52:42,040 Speaker 1: good at seeing the trends that are already happening but 949 00:52:42,160 --> 00:52:45,640 Speaker 1: taking them to their next logical place. Um, and it's 950 00:52:45,640 --> 00:52:48,320 Speaker 1: a really great way to kind of get a sense 951 00:52:48,360 --> 00:52:52,319 Speaker 1: of what will something maybe look 952 00:52:52,400 --> 00:52:54,759 Speaker 1: like in the next decade or so, because it's 953 00:52:54,760 --> 00:52:57,839 Speaker 1: all based on already existing stuff, just in different kind 954 00:52:57,840 --> 00:52:59,680 Speaker 1: of original ways. So that's why I think it's 955 00:52:59,719 --> 00:53:03,200 Speaker 1: so useful to look at your books as an activist, 956 00:53:03,480 --> 00:53:07,160 Speaker 1: specifically around, like, surveillance and stuff, because it's 957 00:53:07,440 --> 00:53:10,960 Speaker 1: really good for kind of keeping 958 00:53:10,960 --> 00:53:14,640 Speaker 1: your head, yeah, and keeping an 959 00:53:14,680 --> 00:53:16,879 Speaker 1: eye on what's keeping an eye on you, um, and 960 00:53:16,920 --> 00:53:19,399 Speaker 1: all that kind of stuff.
This was a really lovely 961 00:53:19,440 --> 00:53:22,279 Speaker 1: conversation. It was a lovely last thing to do in my 962 00:53:22,400 --> 00:53:25,920 Speaker 1: home office, because I leave tomorrow and won't be 963 00:53:25,960 --> 00:53:28,640 Speaker 1: back until next year, and then I'm actually gonna 964 00:53:28,640 --> 00:53:30,960 Speaker 1: be offline for a month after a joint replacement. So 965 00:53:31,520 --> 00:53:33,480 Speaker 1: it was really lovely to meet you all 966 00:53:33,520 --> 00:53:35,600 Speaker 1: and to chat with you. Thanks so much for chatting 967 00:53:35,600 --> 00:53:42,120 Speaker 1: with us today, Cory. My pleasure. It Could Happen Here 968 00:53:42,200 --> 00:53:44,760 Speaker 1: is a production of Cool Zone Media. For more podcasts 969 00:53:44,840 --> 00:53:47,120 Speaker 1: from Cool Zone Media, visit our website, cool zone 970 00:53:47,160 --> 00:53:49,000 Speaker 1: media dot com, or check us out on the 971 00:53:49,080 --> 00:53:52,400 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. 972 00:53:52,960 --> 00:53:55,080 Speaker 1: You can find sources for It Could Happen Here, updated 973 00:53:55,120 --> 00:53:58,600 Speaker 1: monthly, at cool zone media dot com slash sources. Thanks 974 00:53:58,640 --> 00:53:59,160 Speaker 1: for listening.