1 00:00:09,800 --> 00:00:11,920 Speaker 1: When was the last time you heard somebody say something 2 00:00:11,960 --> 00:00:15,080 Speaker 1: optimistic about the Internet? I don't know about you, but for me, 3 00:00:15,520 --> 00:00:18,919 Speaker 1: it's been a while, which sucks, because there was a time, 4 00:00:19,000 --> 00:00:21,240 Speaker 1: and I'm talking about the nineties and the two thousands 5 00:00:21,239 --> 00:00:23,680 Speaker 1: and even up into the early twenty tens, when a 6 00:00:23,680 --> 00:00:26,439 Speaker 1: lot of the conversation about the Internet and technology in 7 00:00:26,520 --> 00:00:28,640 Speaker 1: general was really idealistic. 8 00:00:29,160 --> 00:00:33,360 Speaker 2: Basically, what happened to the idea that the Internet was 9 00:00:33,400 --> 00:00:36,919 Speaker 2: this magical network? Everybody would be able to start their 10 00:00:36,960 --> 00:00:40,519 Speaker 2: own little thing and have pockets of wealth emerge all 11 00:00:40,560 --> 00:00:43,680 Speaker 2: over the country, not to mention make every country into 12 00:00:43,760 --> 00:00:47,280 Speaker 2: a democracy, not to mention make every creative person a star. 13 00:00:48,400 --> 00:00:51,080 Speaker 1: Tim Wu is a legal scholar who writes about monopolies, 14 00:00:51,240 --> 00:00:53,600 Speaker 1: and he is very critical of some of the big 15 00:00:53,640 --> 00:00:56,800 Speaker 1: tech companies like Google and Amazon and Meta. But he 16 00:00:56,880 --> 00:01:00,600 Speaker 1: didn't always feel that way. Back in those early Internet days, 17 00:01:00,840 --> 00:01:03,200 Speaker 1: a lot of us really believed that tech was going 18 00:01:03,240 --> 00:01:04,880 Speaker 1: to change the world for the better. 19 00:01:05,400 --> 00:01:07,240 Speaker 2: Actually, at the very late part of that, I was 20 00:01:07,280 --> 00:01:10,360 Speaker 2: working in the Silicon Valley, so I was drinking the Kool-
21 00:01:10,160 --> 00:01:11,880 Speaker 3: Aid from the firehose, like, straight on. 22 00:01:12,480 --> 00:01:16,520 Speaker 2: And before that, I was working in the US government 23 00:01:16,560 --> 00:01:19,839 Speaker 2: at the Supreme Court as a law clerk. Actually, my office 24 00:01:19,880 --> 00:01:23,560 Speaker 2: mate was Ketanji Brown Jackson, who is now the 25 00:01:23,000 --> 00:01:24,080 Speaker 3: Supreme Court Justice. 26 00:01:24,120 --> 00:01:28,240 Speaker 2: So we were friends, and you know, I got to 27 00:01:28,280 --> 00:01:31,160 Speaker 2: say we were very optimistic, but you know, it didn't 28 00:01:31,200 --> 00:01:33,440 Speaker 2: quite all work out the way we had thought. 29 00:01:34,600 --> 00:01:38,200 Speaker 1: Tim worked in the Biden administration advising on antitrust policies, 30 00:01:38,400 --> 00:01:41,039 Speaker 1: and he's now a law professor at Columbia. His most 31 00:01:41,120 --> 00:01:43,800 Speaker 1: recent project is a quest to understand how the Internet 32 00:01:43,840 --> 00:01:46,759 Speaker 1: went from this utopian vision to something that a lot 33 00:01:46,760 --> 00:01:49,120 Speaker 1: of people blame for everything that's wrong in the world, 34 00:01:49,440 --> 00:01:51,840 Speaker 1: and he wrote a book about it called The Age 35 00:01:51,880 --> 00:01:54,720 Speaker 1: of Extraction.
So today's conversation is going to take us 36 00:01:54,720 --> 00:01:58,320 Speaker 1: from Bible verses to rap music, to some Israeli map 37 00:01:58,400 --> 00:02:01,440 Speaker 1: software that you've probably used without even realizing it, to 38 00:02:01,560 --> 00:02:03,960 Speaker 1: that feeling that you get when you smoke way too 39 00:02:04,040 --> 00:02:06,440 Speaker 1: much weed and you can't get up off the couch, 40 00:02:06,960 --> 00:02:08,960 Speaker 1: and hopefully by the end of it, you'll have a 41 00:02:08,960 --> 00:02:12,120 Speaker 1: better idea of what happened to that nineties Internet optimism, 42 00:02:12,160 --> 00:02:14,880 Speaker 1: why it didn't pan out, and maybe some things that 43 00:02:14,919 --> 00:02:26,359 Speaker 1: we could do about it right now. From Kaleidoscope 44 00:02:26,400 --> 00:02:33,800 Speaker 1: and iHeart Podcasts, this is kill switch. I'm Dexter Thomas. 45 00:02:35,120 --> 00:03:17,960 Speaker 1: All right. So, I mean, I think you're talking 46 00:03:18,000 --> 00:03:21,680 Speaker 1: about this kind of stretch from the late nineties into the 47 00:03:21,680 --> 00:03:25,679 Speaker 1: early two thousands. I mean, I remember, you know, I'm 48 00:03:25,680 --> 00:03:29,000 Speaker 1: a kid growing up in San Bernardino, California. If you're not 49 00:03:29,040 --> 00:03:31,160 Speaker 1: familiar with it, it's about an hour east of LA. 50 00:03:31,800 --> 00:03:34,880 Speaker 1: Nobody really goes there, right, and I'm feeling like, shoot, 51 00:03:35,520 --> 00:03:39,600 Speaker 1: I don't have to leave San Bernardino. Everything I want, I 52 00:03:39,640 --> 00:03:41,520 Speaker 1: could find it on the Internet. I could learn to 53 00:03:41,600 --> 00:03:44,200 Speaker 1: do anything. I can connect with anybody. I remember the 54 00:03:44,240 --> 00:03:46,200 Speaker 1: first time I was talking with somebody on AIM and 55 00:03:46,280 --> 00:03:49,680 Speaker 1: it was incredible.
I can just talk to somebody on 56 00:03:49,720 --> 00:03:51,800 Speaker 1: the other side of the planet for free, you know, 57 00:03:51,840 --> 00:03:53,720 Speaker 1: as long as I pay the dial-up bill, right? 58 00:03:54,200 --> 00:03:57,880 Speaker 1: And it seemed like something new was coming out every 59 00:03:58,000 --> 00:04:02,400 Speaker 1: year where I'm going to be able to just do 60 00:04:02,520 --> 00:04:05,680 Speaker 1: things that I never even thought were possible. And there 61 00:04:05,760 --> 00:04:09,280 Speaker 1: was definitely a time where everything felt optimistic, but the 62 00:04:09,320 --> 00:04:10,920 Speaker 1: optimism seemed realistic. 63 00:04:11,520 --> 00:04:12,160 Speaker 3: Yeah, it did. 64 00:04:12,360 --> 00:04:15,920 Speaker 2: And you know, even going further back, I grew up 65 00:04:16,000 --> 00:04:18,040 Speaker 2: in a very kind of hippie environment. 66 00:04:18,360 --> 00:04:19,640 Speaker 3: I actually was living in Canada. 67 00:04:20,000 --> 00:04:22,600 Speaker 2: I went to a school where all the grades were 68 00:04:22,640 --> 00:04:25,839 Speaker 2: kind of combined into one. And you know, that was 69 00:04:25,880 --> 00:04:28,880 Speaker 2: its own kind of utopian movement back in the sixties 70 00:04:28,880 --> 00:04:31,599 Speaker 2: and seventies. And I think a lot of us thought 71 00:04:32,800 --> 00:04:37,800 Speaker 2: that the Internet was kind of the instantiation of a 72 00:04:37,839 --> 00:04:40,400 Speaker 2: lot of the ideas from that era. You know, you 73 00:04:40,440 --> 00:04:42,479 Speaker 2: were going to let people connect with each other, and 74 00:04:42,560 --> 00:04:45,160 Speaker 2: as soon as people connected, you know, of course there 75 00:04:45,200 --> 00:04:48,880 Speaker 2: would be peace. Right, all right, all violence and problems 76 00:04:48,880 --> 00:04:49,840 Speaker 2: are misunderstanding. 77 00:04:49,880 --> 00:04:51,360 Speaker 1: All we got to do is just talk to each other.
78 00:04:51,400 --> 00:04:53,080 Speaker 1: We'll be good. Yeah, we'll be good. 79 00:04:53,240 --> 00:04:55,400 Speaker 2: I remember a friend of mine said, like, the Internet 80 00:04:55,480 --> 00:04:58,800 Speaker 2: is like Esperanto. Do you know Esperanto, the language? 81 00:04:59,160 --> 00:04:59,480 Speaker 3: Yes? 82 00:05:00,120 --> 00:05:02,800 Speaker 1: Oh my gosh, yes, I made the mistake of spending 83 00:05:02,839 --> 00:05:06,320 Speaker 1: about a couple weeks trying to learn Esperanto. Yes, yes, 84 00:05:06,760 --> 00:05:11,920 Speaker 1: but yes. So Esperanto, basically, was a language where the idea was: listen, 85 00:05:12,160 --> 00:05:15,840 Speaker 1: we need to leave aside all the national languages. We 86 00:05:15,880 --> 00:05:19,839 Speaker 1: create a new language, and that will allow for equality, 87 00:05:19,839 --> 00:05:22,360 Speaker 1: because we'll all literally speak the same language, we'll all 88 00:05:22,400 --> 00:05:24,680 Speaker 1: actually be on the same page. That will lead us 89 00:05:24,680 --> 00:05:28,040 Speaker 1: into this really promising future. And some people really dedicated 90 00:05:28,080 --> 00:05:28,839 Speaker 1: themselves to it. 91 00:05:28,720 --> 00:05:30,440 Speaker 3: Entire lives, entire lives. 92 00:05:30,480 --> 00:05:35,440 Speaker 2: And my friend, you know, saw the Internet as Esperanto, 93 00:05:36,000 --> 00:05:39,560 Speaker 2: the successful version, you know, like it had actually eliminated, 94 00:05:39,600 --> 00:05:41,400 Speaker 3: they believed, all the
95 00:05:41,000 --> 00:05:44,440 Speaker 2: differences between computers and ultimately between us, and you know, 96 00:05:44,560 --> 00:05:47,800 Speaker 2: that was the original idea basically of cyberspace, which was 97 00:05:47,880 --> 00:05:51,440 Speaker 2: like this better world. That was the, like, "on the Internet, 98 00:05:51,560 --> 00:05:53,800 Speaker 2: no one knows you're a dog" era, where you're going 99 00:05:53,839 --> 00:05:57,159 Speaker 2: to pretend to be, like, somebody else, and that would 100 00:05:57,160 --> 00:05:59,960 Speaker 2: be liberating. Maybe I'm just, like, whoever I am, like, a 101 00:06:00,040 --> 00:06:04,080 Speaker 2: half-Asian dude, but in cyberspace, I don't know, like, 102 00:06:04,160 --> 00:06:06,360 Speaker 2: a blue space alien or something. 103 00:06:06,120 --> 00:06:10,640 Speaker 1: Right, and that's a beautiful thing. Yeah, right. Okay. Then 104 00:06:10,760 --> 00:06:14,080 Speaker 1: what happened? So as Tim was starting research for his book, 105 00:06:14,279 --> 00:06:16,960 Speaker 1: he looked at Google, not as in he was searching 106 00:06:17,040 --> 00:06:19,520 Speaker 1: up things on the Google search engine. He started looking 107 00:06:19,560 --> 00:06:22,560 Speaker 1: at Google LLC as a source, or at least as 108 00:06:22,560 --> 00:06:25,960 Speaker 1: a symbol of how tech companies went from seemingly visionary 109 00:06:26,200 --> 00:06:27,960 Speaker 1: to obviously exploitative. 110 00:06:29,000 --> 00:06:32,120 Speaker 2: There was Google. It started at a university, at Stanford. 111 00:06:32,320 --> 00:06:35,640 Speaker 2: It was originally housed on some random computers. Then it 112 00:06:35,680 --> 00:06:37,320 Speaker 2: was in a garage, so it had all these kind 113 00:06:37,320 --> 00:06:42,599 Speaker 2: of idealistic settings. Their early talk was very idealistic. We 114 00:06:42,720 --> 00:06:44,440 Speaker 2: just want to organize the world's information.
115 00:06:45,360 --> 00:06:45,840 Speaker 3: There was this 116 00:06:45,880 --> 00:06:49,320 Speaker 2: idea they had that our main goal is to take 117 00:06:49,400 --> 00:06:51,600 Speaker 2: you where you need to go. We're not going to 118 00:06:51,680 --> 00:06:54,080 Speaker 2: hold on to you. We just want to show you 119 00:06:54,160 --> 00:06:57,080 Speaker 2: the way. Their slogan was "don't be evil." 120 00:06:57,600 --> 00:06:57,920 Speaker 3: Yeah. 121 00:06:58,080 --> 00:07:01,720 Speaker 1: I mean, I remember being a genuine fan of Google. 122 00:07:02,760 --> 00:07:05,560 Speaker 1: I mean, I was like an unpaid employee, man. Gmail 123 00:07:05,680 --> 00:07:08,840 Speaker 1: came out. I remember having my Gmail invites. You remember 124 00:07:08,839 --> 00:07:11,560 Speaker 1: that? They didn't give it to everybody, 125 00:07:11,680 --> 00:07:13,800 Speaker 1: so you had a limited amount of invites. Man, I 126 00:07:13,920 --> 00:07:16,960 Speaker 1: was passing those things out. Every new thing that Google 127 00:07:17,040 --> 00:07:20,720 Speaker 1: came out with, it felt like, I don't want to 128 00:07:20,720 --> 00:07:23,560 Speaker 1: say a utopian future, but it felt like everything was 129 00:07:23,640 --> 00:07:28,080 Speaker 1: going to be interesting and amazing and convenient and helpful. 130 00:07:28,560 --> 00:07:30,080 Speaker 1: I mean, how can you argue against that? 131 00:07:30,680 --> 00:07:33,600 Speaker 3: Yeah? No, I agree, and I shared that feeling. 132 00:07:34,160 --> 00:07:38,360 Speaker 2: I thought there was something very well intentioned about Google, 133 00:07:38,480 --> 00:07:41,240 Speaker 2: the early company. I mean, some of your listeners might 134 00:07:41,240 --> 00:07:42,960 Speaker 2: be like, what are they talking about? 135 00:07:43,040 --> 00:07:46,040 Speaker 1: Who are these people? Is there something wrong with them? 136 00:07:47,280 --> 00:07:50,360 Speaker 3: Right?
But no, I think they're very well intentioned. 137 00:07:50,520 --> 00:07:53,440 Speaker 2: A lot of my friends who were, like, in another 138 00:07:53,480 --> 00:07:55,720 Speaker 2: era might have gone to work at nonprofits, went to 139 00:07:55,720 --> 00:07:57,360 Speaker 2: go work at Google because 140 00:07:57,120 --> 00:07:58,320 Speaker 1: they believed in the mission. 141 00:07:58,360 --> 00:08:01,120 Speaker 2: They believed in it. And the early Google, you know, 142 00:08:01,440 --> 00:08:03,840 Speaker 2: didn't have a business model, and then they kind of 143 00:08:03,880 --> 00:08:07,800 Speaker 2: grudgingly took on advertising, but said all this stuff about how 144 00:08:07,800 --> 00:08:10,560 Speaker 2: we're just gonna be a totally different kind of advertising. It's 145 00:08:10,560 --> 00:08:12,480 Speaker 2: gonna be actually stuff you really 146 00:08:12,400 --> 00:08:13,880 Speaker 3: want, because you're looking for it anyway. 147 00:08:14,520 --> 00:08:17,360 Speaker 2: So it's not gonna be annoying, it's not gonna be cluttery, 148 00:08:17,680 --> 00:08:19,680 Speaker 2: it's not gonna mess with things, you know. 149 00:08:19,720 --> 00:08:20,560 Speaker 3: The founders had 150 00:08:20,400 --> 00:08:24,480 Speaker 2: already written a paper where they said that advertising wrecks search, 151 00:08:25,120 --> 00:08:27,080 Speaker 2: so they were already on record as saying that, like, 152 00:08:27,800 --> 00:08:31,760 Speaker 2: paid search, where whoever pays most gets the top link, 153 00:08:32,120 --> 00:08:32,880 Speaker 2: is a terrible thing. 154 00:08:33,520 --> 00:08:36,440 Speaker 1: Actually, let's get specific about that paper. It came out 155 00:08:36,480 --> 00:08:39,280 Speaker 1: in nineteen ninety eight and it's titled The Anatomy of 156 00:08:39,320 --> 00:08:43,800 Speaker 1: a Large-Scale Hypertextual Web Search Engine.
And in that paper, 157 00:08:43,840 --> 00:08:46,880 Speaker 1: Sergey Brin and Larry Page, the founders of Google, wrote the 158 00:08:46,920 --> 00:08:51,200 Speaker 1: following, quote, the goals of the advertising business model do 159 00:08:51,320 --> 00:08:55,920 Speaker 1: not always correspond to providing quality search to users. And 160 00:08:56,040 --> 00:08:59,439 Speaker 1: later on they say that, quote, we expect that advertising 161 00:08:59,480 --> 00:09:03,280 Speaker 1: funded search engines will be inherently biased towards the 162 00:09:03,320 --> 00:09:08,360 Speaker 1: advertisers and away from the needs of the consumers. End quote. So, 163 00:09:08,520 --> 00:09:11,520 Speaker 1: just to summarize, almost thirty years ago, the founders of 164 00:09:11,559 --> 00:09:15,680 Speaker 1: Google said in public that ads would bias search engines 165 00:09:15,880 --> 00:09:17,000 Speaker 1: and make them worse. 166 00:09:18,000 --> 00:09:18,800 Speaker 3: So what happened? 167 00:09:19,240 --> 00:09:22,640 Speaker 2: So they started making money, and then they made this 168 00:09:22,760 --> 00:09:25,840 Speaker 2: decision in the early two thousands that they were going 169 00:09:25,920 --> 00:09:26,520 Speaker 2: to go public. 170 00:09:27,000 --> 00:09:29,360 Speaker 3: But they had this idea: 171 00:09:29,360 --> 00:09:32,199 Speaker 2: we're going to go public, but we're not going to 172 00:09:32,280 --> 00:09:33,120 Speaker 2: do it like normal. 173 00:09:33,240 --> 00:09:34,400 Speaker 3: We don't want to become evil. 174 00:09:34,840 --> 00:09:37,559 Speaker 2: So they wrote this big, long letter to their shareholders 175 00:09:37,800 --> 00:09:40,199 Speaker 2: and it starts with, like, we're 176 00:09:40,000 --> 00:09:41,920 Speaker 3: not a normal company. We don't want to be a 177 00:09:41,920 --> 00:09:42,679 Speaker 3: normal company.
178 00:09:43,080 --> 00:09:45,120 Speaker 2: We're going to do all these things that you're 179 00:09:45,160 --> 00:09:48,920 Speaker 2: going to think are terrible, like crazy projects. We're going 180 00:09:49,000 --> 00:09:52,120 Speaker 2: to care about our users, not our advertisers. Don't be 181 00:09:52,200 --> 00:09:53,640 Speaker 2: evil is our motto, on and on and on. 182 00:09:54,240 --> 00:09:57,120 Speaker 1: In that two thousand and four letter, the founders wrote, quote, 183 00:09:57,320 --> 00:10:00,760 Speaker 1: Google is not a conventional company. We do not intend 184 00:10:00,800 --> 00:10:01,440 Speaker 1: to become one. 185 00:10:02,280 --> 00:10:07,480 Speaker 2: But they still decided to become a regular for 186 00:10:07,559 --> 00:10:12,360 Speaker 2: profit Delaware corporation, which means that they're registered in Delaware, 187 00:10:12,559 --> 00:10:17,520 Speaker 2: which has basically the loosest requirements for a corporation and 188 00:10:18,000 --> 00:10:20,240 Speaker 2: is the standard in corporate America. 189 00:10:20,840 --> 00:10:24,960 Speaker 1: Okay, yeah, standard. That is what conventional companies do. But 190 00:10:25,040 --> 00:10:29,120 Speaker 1: remember, Google said they weren't conventional, and as Tim points out, 191 00:10:29,280 --> 00:10:30,960 Speaker 1: it didn't have to go that way. 192 00:10:31,440 --> 00:10:34,360 Speaker 2: They could have maybe done things differently at that point, 193 00:10:34,440 --> 00:10:37,640 Speaker 2: set up something either as a nonprofit, if they really 194 00:10:37,640 --> 00:10:40,840 Speaker 2: believed this stuff, or they could have been the kind of 195 00:10:40,880 --> 00:10:45,240 Speaker 2: corporation that is not necessarily for profit, what's called a 196 00:10:45,240 --> 00:10:48,959 Speaker 2: B corp, like Patagonia.
But they chose the normal Delaware 197 00:10:49,480 --> 00:10:53,800 Speaker 2: corporate form that every evil corporation in the world has, 198 00:10:54,240 --> 00:10:58,600 Speaker 2: and then, you know, over the next ten or fifteen years, 199 00:10:59,440 --> 00:11:02,679 Speaker 2: they proceeded to violate basically every promise in that letter, 200 00:11:03,640 --> 00:11:07,680 Speaker 2: and, right, the reason was, I mean, it's so predictable. 201 00:11:08,440 --> 00:11:15,040 Speaker 2: They had pressure to continually increase revenue, and ultimately that 202 00:11:15,200 --> 00:11:18,319 Speaker 2: meant they had, in fact, to treat advertisers better than 203 00:11:18,360 --> 00:11:20,320 Speaker 2: their users. That means they had to have more and 204 00:11:20,360 --> 00:11:23,200 Speaker 2: more and more ads. It means they had to cut 205 00:11:23,280 --> 00:11:27,320 Speaker 2: projects that didn't make sense. I mean, scripture says no 206 00:11:27,400 --> 00:11:31,439 Speaker 2: man can obey two masters, and they believed that they 207 00:11:31,440 --> 00:11:34,640 Speaker 2: were different. I think the original sin of Silicon Valley, 208 00:11:34,640 --> 00:11:38,160 Speaker 2: maybe of America, is that we have this belief that 209 00:11:38,280 --> 00:11:40,560 Speaker 2: we can have our cake and eat it too, serve 210 00:11:40,600 --> 00:11:44,000 Speaker 2: the interests of their users and consumers, and also their 211 00:11:44,160 --> 00:11:46,760 Speaker 2: advertisers and also their investors. 212 00:11:47,559 --> 00:11:50,080 Speaker 1: I believe that you're the first person on this show 213 00:11:50,120 --> 00:11:54,000 Speaker 1: to quote the Bible to me.
But that's pretty appropriate, 214 00:11:54,360 --> 00:11:57,959 Speaker 1: because in some ways, no, for real, it does feel 215 00:11:58,000 --> 00:12:01,480 Speaker 1: like we actually have strayed from that light. It felt 216 00:12:01,520 --> 00:12:07,120 Speaker 1: really, you know, people were truly evangelical about the promises 217 00:12:07,760 --> 00:12:11,040 Speaker 1: of Silicon Valley, the promises of technology, but even specifically 218 00:12:11,040 --> 00:12:14,400 Speaker 1: about companies. I really remember, you know, myself being one 219 00:12:14,400 --> 00:12:18,400 Speaker 1: of them, feeling really, really optimistic about what Google was 220 00:12:18,480 --> 00:12:21,360 Speaker 1: doing and what Google could do in the future. For 221 00:12:21,400 --> 00:12:23,960 Speaker 1: most of us, though, I feel like that kind of 222 00:12:23,960 --> 00:12:26,520 Speaker 1: falling out of the light was a gradual process. 223 00:12:27,200 --> 00:12:30,679 Speaker 1: Most people probably couldn't tell you a time when they 224 00:12:30,720 --> 00:12:34,760 Speaker 1: stopped feeling good about Google. But you have a very 225 00:12:34,800 --> 00:12:37,480 Speaker 1: specific date that you point to in your book that 226 00:12:37,559 --> 00:12:40,920 Speaker 1: you feel like was a specific turning point, a day. 227 00:12:41,720 --> 00:12:42,920 Speaker 3: Yes, it's the second. 228 00:12:42,960 --> 00:12:45,840 Speaker 2: The first is the decision to become a for-profit company. 229 00:12:46,280 --> 00:12:51,959 Speaker 2: The second was June eleventh, twenty thirteen. That's the fall of 230 00:12:51,920 --> 00:12:53,120 Speaker 3: the early Internet. 231 00:12:54,720 --> 00:12:57,640 Speaker 1: This is where things get interesting for me. The date 232 00:12:57,679 --> 00:13:00,960 Speaker 1: when Tim says the utopian promise of the early Internet 233 00:13:01,400 --> 00:13:13,280 Speaker 1: died forever. We'll get into why after the break.
Okay, 234 00:13:13,360 --> 00:13:16,760 Speaker 1: So some background on that date of June eleventh, twenty thirteen. 235 00:13:17,040 --> 00:13:18,760 Speaker 1: So first we actually have to go back to two 236 00:13:18,760 --> 00:13:21,360 Speaker 1: thousand and six, to a project made by an Israeli 237 00:13:21,400 --> 00:13:26,000 Speaker 1: programmer called FreeMap Israel. What made this project interesting was 238 00:13:26,080 --> 00:13:28,120 Speaker 1: not only that it was free and it helped you 239 00:13:28,160 --> 00:13:30,040 Speaker 1: with figuring out directions on where you wanted to go, 240 00:13:30,520 --> 00:13:34,280 Speaker 1: but that everything was crowdsourced, even the traffic information, and 241 00:13:34,440 --> 00:13:37,440 Speaker 1: this became its defining feature, and they had the tagline 242 00:13:37,720 --> 00:13:42,520 Speaker 1: "outsmarting traffic, together." Pretty soon FreeMap Israel had expanded 243 00:13:42,520 --> 00:13:44,840 Speaker 1: globally and had changed their name to something that you 244 00:13:44,960 --> 00:13:50,640 Speaker 1: might recognize: Waze. So Waze got popular pretty quickly because 245 00:13:50,679 --> 00:13:54,280 Speaker 1: of that crowdsourced information. It was genuinely useful. It would 246 00:13:54,280 --> 00:13:56,720 Speaker 1: warn you about an accident nearby or if police were 247 00:13:56,800 --> 00:13:59,280 Speaker 1: up ahead, so you could change your travel route or 248 00:13:59,320 --> 00:14:01,199 Speaker 1: your travel speed accordingly. 249 00:14:03,120 --> 00:14:07,400 Speaker 2: So it was emerging as a challenger to Google Maps, 250 00:14:07,440 --> 00:14:10,040 Speaker 2: and here you had these two, kind of, I guess, 251 00:14:10,280 --> 00:14:13,840 Speaker 2: models of the future. So what happened in twenty thirteen, 252 00:14:14,040 --> 00:14:16,000 Speaker 2: in June: the contest was 253 00:14:16,000 --> 00:14:16,760 Speaker 3: about to start.
254 00:14:17,240 --> 00:14:20,400 Speaker 2: It pitted Google, which had a more traditional map program, 255 00:14:20,640 --> 00:14:23,760 Speaker 2: against Waze, which was based on user contributions, and the 256 00:14:23,800 --> 00:14:27,760 Speaker 2: competition was on for the future. And all of a sudden, 257 00:14:28,160 --> 00:14:29,560 Speaker 2: Google just bought Waze. 258 00:14:30,440 --> 00:14:32,080 Speaker 1: At this point in time, on one side, you had 259 00:14:32,080 --> 00:14:35,240 Speaker 1: Google Maps, which said they had a billion users, and 260 00:14:35,280 --> 00:14:37,880 Speaker 1: on the other you had Waze, which was reporting around 261 00:14:37,920 --> 00:14:41,160 Speaker 1: fifty million users. But they were growing, and really quickly. 262 00:14:41,520 --> 00:14:44,120 Speaker 1: So Google opened up their wallet, pulled out reportedly about 263 00:14:44,120 --> 00:14:47,400 Speaker 1: a billion dollars, and just bought Waze. They let the 264 00:14:47,440 --> 00:14:50,400 Speaker 1: app itself continue to operate, but some of that really 265 00:14:50,480 --> 00:14:54,400 Speaker 1: valuable data that Waze generated, like that crowdsourced traffic information, 266 00:14:54,960 --> 00:14:58,080 Speaker 1: all of that was incorporated into the mothership 267 00:14:58,240 --> 00:14:59,120 Speaker 1: of Google Maps. 268 00:15:00,080 --> 00:15:03,840 Speaker 2: They used, you know, frankly, a method pioneered by John Rockefeller, 269 00:15:03,880 --> 00:15:06,760 Speaker 2: which is, if you have problems with a competitor, buy them. 270 00:15:07,320 --> 00:15:10,560 Speaker 2: I remember being pretty shocked that that happened.
Now we 271 00:15:10,840 --> 00:15:14,440 Speaker 2: go to the law, which is my particular area of expertise, 272 00:15:14,960 --> 00:15:19,640 Speaker 2: and there is a law called the Clayton Act, you know, 273 00:15:19,680 --> 00:15:22,680 Speaker 2: the anti-monopoly law, which says you're not supposed to 274 00:15:22,720 --> 00:15:26,760 Speaker 2: be allowed to merge companies to create a monopoly. And 275 00:15:26,840 --> 00:15:29,120 Speaker 2: I think you do not need to be a sophisticated 276 00:15:29,120 --> 00:15:33,360 Speaker 2: economist to say: there's two mapping programs, one buys the other, 277 00:15:33,760 --> 00:15:34,640 Speaker 2: that's a monopoly. 278 00:15:35,320 --> 00:15:37,440 Speaker 1: Yeah, pretty cut and dry. Cut and dry. 279 00:15:38,000 --> 00:15:41,120 Speaker 2: So it went to the Federal Trade Commission, which is 280 00:15:41,160 --> 00:15:44,280 Speaker 2: one of the two agencies that tries to 281 00:15:44,280 --> 00:15:51,080 Speaker 2: block illegal mergers, and the Federal Trade Commission said, fine, no problem. 282 00:15:51,880 --> 00:15:54,920 Speaker 1: So this is pretty strange. The FTC initially said that 283 00:15:54,920 --> 00:15:57,240 Speaker 1: they were going to look into this acquisition, but a 284 00:15:57,280 --> 00:16:00,400 Speaker 1: couple months later they stopped the investigation and they let 285 00:16:00,400 --> 00:16:01,560 Speaker 1: the purchase go ahead. 286 00:16:02,160 --> 00:16:03,960 Speaker 2: I remember, I had been working there, right, and I 287 00:16:04,000 --> 00:16:06,280 Speaker 2: was like, how did they get to that conclusion? As 288 00:16:06,280 --> 00:16:09,080 Speaker 2: far as I can tell, this looks pretty much like 289 00:16:09,120 --> 00:16:12,200 Speaker 2: a merger to monopoly that reduced competition.
So I 290 00:16:12,280 --> 00:16:14,280 Speaker 2: never found out the answer to that question 291 00:16:14,320 --> 00:16:17,920 Speaker 2: for many years, until one day I was at an 292 00:16:17,920 --> 00:16:18,920 Speaker 2: antitrust party. 293 00:16:19,920 --> 00:16:20,720 Speaker 3: I was hanging out. 294 00:16:22,800 --> 00:16:25,840 Speaker 4: Wait, back up, the antitrust party? Like, 295 00:16:26,040 --> 00:16:28,160 Speaker 4: what does the party invite look like 296 00:16:28,200 --> 00:16:30,400 Speaker 4: for the antitrust party? We just, like, pull up, 297 00:16:30,400 --> 00:16:33,280 Speaker 4: we gonna talk about antitrust? Like, if you like monopolies, 298 00:16:33,320 --> 00:16:34,320 Speaker 4: don't come? Like, what? 299 00:16:35,120 --> 00:16:39,120 Speaker 2: Basically, yeah, okay, it's a thing. 300 00:16:39,320 --> 00:16:41,600 Speaker 1: It's a thing. Invite me next time I'm down. This 301 00:16:41,640 --> 00:16:42,560 Speaker 1: sounds great, you 302 00:16:42,440 --> 00:16:45,520 Speaker 3: know. It's like, it's a pretty cool crowd. They're not bad. 303 00:16:46,240 --> 00:16:49,960 Speaker 2: Matt Stoller's there, Lina Khan will be there. Like, it's 304 00:16:49,960 --> 00:16:52,200 Speaker 2: a bunch of people. So I was at this party 305 00:16:52,240 --> 00:16:56,240 Speaker 2: and I was having drinks, and I realized 306 00:16:56,280 --> 00:16:58,040 Speaker 2: the person I was drinking with had worked on that case. 307 00:16:58,800 --> 00:17:01,720 Speaker 2: So after a while I was like, so, like, what, 308 00:17:03,440 --> 00:17:05,920 Speaker 2: you know, what went down? How did that happen? And 309 00:17:05,960 --> 00:17:08,800 Speaker 2: she said, well, you know, I know it looks kind of bad, 310 00:17:08,800 --> 00:17:11,640 Speaker 2: but the boss said we should let the merger go through.
311 00:17:12,280 --> 00:17:16,560 Speaker 2: And he said, here was his theory: that Google is 312 00:17:16,600 --> 00:17:19,680 Speaker 2: what you use when you want to figure out where 313 00:17:19,720 --> 00:17:23,240 Speaker 2: you are, and Waze is what you use when you 314 00:17:23,320 --> 00:17:24,560 Speaker 2: want to figure out where you're going. 315 00:17:26,200 --> 00:17:28,120 Speaker 3: Huh, I'll say it again. 316 00:17:28,160 --> 00:17:30,119 Speaker 2: Google is where you go when you want to figure 317 00:17:30,119 --> 00:17:32,760 Speaker 2: out where you are, and Waze is what you use 318 00:17:32,760 --> 00:17:34,040 Speaker 2: if you want to figure out where you're going. 319 00:17:34,080 --> 00:17:35,240 Speaker 3: So they're not really competing. 320 00:17:36,200 --> 00:17:41,560 Speaker 1: You have just described a map, like the two core 321 00:17:41,680 --> 00:17:42,679 Speaker 1: functions of a map. 322 00:17:44,200 --> 00:17:48,320 Speaker 2: Yeah. So they were not forced to release that reasoning 323 00:17:48,359 --> 00:17:51,560 Speaker 2: to the public, but that's what happened. And the reason 324 00:17:51,600 --> 00:17:55,760 Speaker 2: I pinpoint that moment, I think that point, the early 325 00:17:56,000 --> 00:17:58,920 Speaker 2: twenty tens, is when the Internet really started to turn, 326 00:17:59,600 --> 00:18:03,159 Speaker 2: and it is because the first generation of companies, people 327 00:18:03,200 --> 00:18:08,200 Speaker 2: like Google, started being threatened by new guys, not just Waze: Instagram, 328 00:18:08,520 --> 00:18:11,280 Speaker 2: a whole bunch of other companies. And as opposed to 329 00:18:11,400 --> 00:18:15,160 Speaker 2: fighting them, as opposed to competing, they just started buying them. 330 00:18:15,600 --> 00:18:18,119 Speaker 1: We reached out to the FTC for comment on Tim's 331 00:18:18,119 --> 00:18:21,240 Speaker 1: account of this story.
As of this recording, we haven't 332 00:18:21,280 --> 00:18:24,479 Speaker 1: heard back. So for a while, this acquisition thing was 333 00:18:24,520 --> 00:18:27,919 Speaker 1: a trend: big tech companies would just buy their competition. 334 00:18:28,480 --> 00:18:30,840 Speaker 1: Instagram was starting to become a popular place for people 335 00:18:30,840 --> 00:18:34,600 Speaker 1: to connect online, so Facebook bought them. They also bought WhatsApp. 336 00:18:34,960 --> 00:18:37,719 Speaker 1: Maybe you remember when Zappos was getting really popular as 337 00:18:37,760 --> 00:18:40,720 Speaker 1: a place to buy shoes online, and then Amazon bought 338 00:18:40,800 --> 00:18:43,880 Speaker 1: them to add to their online shopping empire. They also 339 00:18:43,880 --> 00:18:47,720 Speaker 1: bought Whole Foods. Microsoft picked up LinkedIn and GitHub. The 340 00:18:47,800 --> 00:18:50,880 Speaker 1: next year after getting Waze, Google bought an AI company 341 00:18:51,000 --> 00:18:54,480 Speaker 1: called DeepMind. I could keep going here. Tim worked on 342 00:18:54,480 --> 00:18:56,520 Speaker 1: a study a while back that showed that, between two 343 00:18:56,560 --> 00:19:00,000 Speaker 1: thousand and seven and twenty eighteen, Google and Facebook, just between 344 00:19:00,200 --> 00:19:03,280 Speaker 1: the two of them, collectively acquired over three hundred and 345 00:19:03,440 --> 00:19:07,760 Speaker 1: fifty companies. The federal government let all those acquisitions go through, 346 00:19:08,280 --> 00:19:11,920 Speaker 1: and that study closes with this line of commentary: As 347 00:19:11,960 --> 00:19:15,080 Speaker 1: with a basketball referee who never calls a foul, the 348 00:19:15,200 --> 00:19:19,080 Speaker 1: question is whether the players have really been faultless, or 349 00:19:19,119 --> 00:19:24,320 Speaker 1: whether the referee is missing something.
What might things look 350 00:19:24,359 --> 00:19:26,159 Speaker 1: like if Google had not bought Waze? 351 00:19:26,920 --> 00:19:30,240 Speaker 2: You know, I think that what was lost in that 352 00:19:30,440 --> 00:19:34,959 Speaker 2: acquisition was the chance of a full fledged rival, you know, 353 00:19:35,040 --> 00:19:39,919 Speaker 2: to Google, and another ecosystem, two different, fully functioning ecosystems 354 00:19:40,359 --> 00:19:43,920 Speaker 2: that offered you a real alternative, because so much turns 355 00:19:43,920 --> 00:19:46,760 Speaker 2: on maps as sort of a foundation and a good 356 00:19:46,760 --> 00:19:50,199 Speaker 2: search engine, and if you imagine it growing, if you 357 00:19:50,240 --> 00:19:53,840 Speaker 2: imagine it developing itself. It also was, as we said, 358 00:19:53,880 --> 00:19:57,200 Speaker 2: this kind of different user-based business model, and who 359 00:19:57,200 --> 00:19:58,280 Speaker 2: knows where that would have gone. 360 00:19:58,680 --> 00:19:59,919 Speaker 1: I mean, within the Google world? 361 00:20:00,720 --> 00:20:05,240 Speaker 2: They were supposedly separate, but they were under ultimately the 362 00:20:05,240 --> 00:20:09,080 Speaker 2: command of Google, and ultimately the founders all left and 363 00:20:09,560 --> 00:20:11,919 Speaker 2: expressed extreme disappointment they had ever done this. 364 00:20:12,520 --> 00:20:15,080 Speaker 1: In twenty twenty one, the CEO of Waze from before 365 00:20:15,119 --> 00:20:18,000 Speaker 1: the acquisition left the company. He wrote a blog post 366 00:20:18,000 --> 00:20:20,600 Speaker 1: about why he was leaving, and he also reflected back 367 00:20:20,640 --> 00:20:22,920 Speaker 1: on when he sold to Google back in twenty thirteen, 368 00:20:23,400 --> 00:20:26,439 Speaker 1: and you can tell he regrets it.
It almost reads 369 00:20:26,520 --> 00:20:29,240 Speaker 1: as less of a resignation letter than an apology letter 370 00:20:29,240 --> 00:20:32,320 Speaker 1: to society in general. And there's one line that hits 371 00:20:32,359 --> 00:20:36,320 Speaker 1: pretty hard here, quote, looking back, we could have probably 372 00:20:36,359 --> 00:20:41,400 Speaker 1: grown faster and much more efficiently had we stayed independent, And. 373 00:20:41,400 --> 00:20:43,080 Speaker 3: I think we lost the chance. 374 00:20:44,080 --> 00:20:48,200 Speaker 2: I mean not only through Waze, but a million little acquisitions, 375 00:20:48,840 --> 00:20:53,520 Speaker 2: a million different acqui-hires, of having a truly more decentralized 376 00:20:53,520 --> 00:20:57,000 Speaker 2: economy, one which spread a lot more 377 00:20:57,040 --> 00:20:59,680 Speaker 2: wealth to a lot more people. Instead, it has concentrated 378 00:20:59,680 --> 00:21:01,240 Speaker 2: wealth in a very small number of people. 379 00:21:02,320 --> 00:21:05,840 Speaker 1: Acquisitions are just one method of maintaining a monopoly. In 380 00:21:05,880 --> 00:21:08,240 Speaker 1: his book, Tim lays out a bunch of other tactics 381 00:21:08,280 --> 00:21:10,960 Speaker 1: that are used by tech giants, including Amazon. 382 00:21:11,560 --> 00:21:17,040 Speaker 2: So Amazon Marketplace, I guess fifteen years ago, really was 383 00:21:17,760 --> 00:21:20,120 Speaker 2: in some ways carrying out the early dream of the Internet. 384 00:21:20,359 --> 00:21:23,240 Speaker 2: It was making a lot of people rich. At the time, 385 00:21:23,280 --> 00:21:26,919 Speaker 2: they charged twenty percent, only twenty percent, in fees, and 386 00:21:26,960 --> 00:21:29,760 Speaker 2: then they would ship your products to people, and a 387 00:21:29,800 --> 00:21:32,320 Speaker 2: lot of people started making a lot of money.
I 388 00:21:32,359 --> 00:21:35,840 Speaker 2: have in my book stories about, like, this Indiana barber 389 00:21:35,920 --> 00:21:39,960 Speaker 2: who started selling pomade as a side business and suddenly 390 00:21:40,000 --> 00:21:43,120 Speaker 2: is like making millions of dollars. But then Amazon, once 391 00:21:43,160 --> 00:21:46,800 Speaker 2: it kind of had everybody, it had all the sellers, 392 00:21:47,200 --> 00:21:50,320 Speaker 2: it had the buyers, you know, locked in with Prime 393 00:21:50,440 --> 00:21:53,120 Speaker 2: or whatever, then it just started turning all the knobs. 394 00:21:54,440 --> 00:21:57,760 Speaker 2: They started adding what they called advertising fees for sellers. 395 00:21:57,840 --> 00:22:00,239 Speaker 2: So you know when you search on Amazon, if 396 00:22:00,240 --> 00:22:02,879 Speaker 2: you use Amazon, and you're like, hey, I'm looking 397 00:22:02,920 --> 00:22:06,520 Speaker 2: for slippers, and you get a bunch of sponsored results. Yeah, 398 00:22:06,640 --> 00:22:10,520 Speaker 2: those turned into a huge cash cow for Amazon, because 399 00:22:10,560 --> 00:22:13,200 Speaker 2: the sellers bid against each other to get those spots, 400 00:22:13,880 --> 00:22:16,400 Speaker 2: and they don't feel they can sell without them, and 401 00:22:16,480 --> 00:22:19,320 Speaker 2: they started making more and more money. When I released 402 00:22:19,320 --> 00:22:23,480 Speaker 2: this book in twenty twenty four, they had made something 403 00:22:23,520 --> 00:22:28,600 Speaker 2: like fifty six billion from those sponsored links alone. So 404 00:22:28,720 --> 00:22:33,080 Speaker 2: that's more than double the revenue of every single newspaper 405 00:22:33,080 --> 00:22:36,720 Speaker 2: on the entire planet Earth. Ugh, and at almost no cost. 406 00:22:36,720 --> 00:22:39,520 Speaker 2: It's actually more lucrative than Amazon Web Services.
I looked 407 00:22:39,560 --> 00:22:41,960 Speaker 2: into it this year, twenty twenty five, and it looks 408 00:22:42,000 --> 00:22:45,080 Speaker 2: like it's gonna be over seventy billion dollars of pure 409 00:22:45,119 --> 00:22:46,000 Speaker 2: profit, and. 410 00:22:45,960 --> 00:22:48,800 Speaker 1: It's only going up. Wow, and it's providing nothing for 411 00:22:48,880 --> 00:22:50,879 Speaker 1: me as somebody who just wants to buy slippers. 412 00:22:51,160 --> 00:22:53,680 Speaker 2: It's the other way around: it's making it worse, it's making 413 00:22:53,720 --> 00:22:57,320 Speaker 2: it harder to find stuff. So you're paying, we are 414 00:22:57,359 --> 00:23:02,920 Speaker 2: collectively paying, seventy billion dollars to degrade our experience. It's 415 00:23:03,000 --> 00:23:05,000 Speaker 2: like completely valueless extraction. 416 00:23:06,920 --> 00:23:09,920 Speaker 1: So we've got companies actively taking money from you, taking 417 00:23:09,960 --> 00:23:13,639 Speaker 1: it from small businesses, and making your life worse. Whose 418 00:23:13,760 --> 00:23:17,000 Speaker 1: idea was this? Well, it turns out we could probably 419 00:23:17,040 --> 00:23:19,840 Speaker 1: point to a couple people. We get into who after 420 00:23:19,880 --> 00:23:28,960 Speaker 1: the break. Back in the early two thousands, there were 421 00:23:29,040 --> 00:23:31,240 Speaker 1: a lot of books aimed at casual readers who were 422 00:23:31,320 --> 00:23:34,760 Speaker 1: curious about how technology might reshape the economy. They had 423 00:23:34,800 --> 00:23:37,680 Speaker 1: titles like Small Is the New Big and The Rise 424 00:23:37,720 --> 00:23:40,720 Speaker 1: of the Creative Class. These books were really optimistic and 425 00:23:40,760 --> 00:23:44,000 Speaker 1: they reassured people that tech innovation would lead to more 426 00:23:44,040 --> 00:23:48,040 Speaker 1: opportunities for the little guy.
But then came another wave 427 00:23:48,080 --> 00:23:50,800 Speaker 1: of books, and these were aimed directly at influencing the 428 00:23:50,880 --> 00:23:53,640 Speaker 1: people who were in Silicon Valley who were starting these 429 00:23:53,680 --> 00:23:57,439 Speaker 1: tech companies, books with titles like Blitzscaling, and then 430 00:23:57,440 --> 00:24:00,359 Speaker 1: there's Zero to One, a book written by Peter Thiel, 431 00:24:00,400 --> 00:24:04,720 Speaker 1: who famously said, quote, competition is for losers, and he 432 00:24:04,920 --> 00:24:07,800 Speaker 1: argued that monopolies are good for society. 433 00:24:08,720 --> 00:24:11,159 Speaker 2: Part of this was just sort of business instinct, frankly, 434 00:24:11,160 --> 00:24:15,040 Speaker 2: a very old one, not dissimilar to the old robber 435 00:24:15,119 --> 00:24:18,000 Speaker 2: barons of the nineteenth century, who were like, you need 436 00:24:18,040 --> 00:24:21,639 Speaker 2: to build a giant empire and control everything. But actually, 437 00:24:21,720 --> 00:24:24,399 Speaker 2: somewhat like them, they kind of clothed that. They were like, well, 438 00:24:25,000 --> 00:24:27,080 Speaker 2: when you are a monopoly and when you dominate your 439 00:24:27,160 --> 00:24:30,560 Speaker 2: entire industry, you have enough money to treat your employees better. 440 00:24:31,200 --> 00:24:33,879 Speaker 1: One of the books that you bring up is Blitzscaling, 441 00:24:34,920 --> 00:24:40,760 Speaker 1: and it's specifically referencing Blitzkrieg, which is kind of incredible, 442 00:24:40,800 --> 00:24:43,600 Speaker 1: that we've just got a book which is basically recommending, 443 00:24:44,160 --> 00:24:49,240 Speaker 1: follow the war plans of World War two Germany. 444 00:24:50,240 --> 00:24:53,000 Speaker 2: I think Reid Hoffman said in one of his interviews, he's like, well, 445 00:24:53,200 --> 00:24:54,639 Speaker 2: I mean, that's the whole idea.
You don't carry too 446 00:24:54,720 --> 00:24:56,600 Speaker 2: much stuff, you don't get too busy, and move fast. 447 00:24:56,960 --> 00:25:00,240 Speaker 2: So that was my business strategy. And if you 448 00:25:00,280 --> 00:25:04,119 Speaker 2: read Peter Thiel, it has, you know, a certain level 449 00:25:04,160 --> 00:25:07,199 Speaker 2: of practical wisdom and it's well written, but if you 450 00:25:07,840 --> 00:25:10,720 Speaker 2: have an academic background, you can see very clearly that 451 00:25:10,720 --> 00:25:14,600 Speaker 2: it's channeling a kind of Nietzschean theory of an Übermensch. 452 00:25:15,240 --> 00:25:18,320 Speaker 2: The monopolist, in his view, is kind of a superior 453 00:25:18,440 --> 00:25:22,120 Speaker 2: race of people, and there are clearly, in his view, 454 00:25:22,200 --> 00:25:25,679 Speaker 2: like men who are destined to lead and rule, and 455 00:25:25,720 --> 00:25:28,000 Speaker 2: then like lesser creatures who are destined to serve. 456 00:25:28,600 --> 00:25:31,639 Speaker 1: So let me posit the counterpoint to that, which is 457 00:25:31,640 --> 00:25:34,560 Speaker 1: to say that, look, okay, you're talking about this whole 458 00:25:34,600 --> 00:25:35,720 Speaker 1: big economics game. 459 00:25:35,800 --> 00:25:38,000 Speaker 2: I'm not interested in that. I just want the app 460 00:25:38,040 --> 00:25:40,680 Speaker 2: to work, right. That's what I think is kind of 461 00:25:40,720 --> 00:25:45,480 Speaker 2: clever and insidious about it. The business model feeds on, basically, 462 00:25:45,520 --> 00:25:48,400 Speaker 2: as I say in the book, a very profound bet 463 00:25:48,400 --> 00:25:52,760 Speaker 2: on human laziness, you know. Like, ultimately, what most 464 00:25:52,800 --> 00:25:55,480 Speaker 2: of us want from technology is just that it works.
As 465 00:25:55,520 --> 00:25:57,359 Speaker 2: you said, you know, we don't want to have forty 466 00:25:57,400 --> 00:25:58,560 Speaker 2: options most of the time. 467 00:25:59,000 --> 00:26:00,760 Speaker 1: And I know you bring in couchlock. I'd 468 00:26:00,800 --> 00:26:02,480 Speaker 1: love for you to hit me with that, because that's 469 00:26:02,480 --> 00:26:03,560 Speaker 1: one of my favorite parts of the book. 470 00:26:04,640 --> 00:26:07,000 Speaker 3: So the power of couchlock. 471 00:26:07,520 --> 00:26:10,800 Speaker 2: That's a term from the, I guess we'd call it the 472 00:26:10,840 --> 00:26:17,920 Speaker 2: marijuana community, which refers to the fact that when you get couchlocked, 473 00:26:17,920 --> 00:26:20,520 Speaker 2: you're like unable to move. Like even if a nuclear weapon 474 00:26:20,560 --> 00:26:22,919 Speaker 2: is coming at your house, you're just like, oh, I 475 00:26:22,920 --> 00:26:24,160 Speaker 2: guess that's just gonna happen. 476 00:26:24,400 --> 00:26:26,239 Speaker 1: You just get real high and you're just on 477 00:26:26,280 --> 00:26:28,239 Speaker 1: the couch. It's like, yo, we gotta go. It's like, bro, 478 00:26:28,400 --> 00:26:31,000 Speaker 1: I'm sorry, I can't move, man. I feel like 479 00:26:31,000 --> 00:26:33,240 Speaker 1: I weigh a million pounds. I'm not leaving. 480 00:26:33,760 --> 00:26:36,840 Speaker 2: And in business terms, it's kind of like, let's say 481 00:26:36,840 --> 00:26:39,439 Speaker 2: you're buying something on Amazon. You're like, hey, there's this 482 00:26:39,520 --> 00:26:42,119 Speaker 2: better deal over here. It's twenty dollars cheaper, but you 483 00:26:42,119 --> 00:26:44,440 Speaker 2: have to sign up for something. You're like, ugh, I'm good, 484 00:26:44,920 --> 00:26:46,760 Speaker 2: I can't do it. You know, like you just think 485 00:26:46,800 --> 00:26:51,840 Speaker 2: the small amounts of irritation just are unfathomable.
So I 486 00:26:51,880 --> 00:26:55,280 Speaker 2: think, frankly, couchlock rules the web at this point, and 487 00:26:55,440 --> 00:26:59,160 Speaker 2: it tends to create monopoly, and at some point, I think, look, 488 00:26:59,200 --> 00:27:01,680 Speaker 2: I don't want to say people shouldn't be lazy. I 489 00:27:01,720 --> 00:27:03,920 Speaker 2: don't want to pretend people aren't. I just think if that's 490 00:27:03,920 --> 00:27:06,399 Speaker 2: what it's going to be, then we need a better deal. 491 00:27:06,680 --> 00:27:08,680 Speaker 2: Like if there's just going to be a couple of monopolies, 492 00:27:09,240 --> 00:27:11,040 Speaker 2: we need a better deal, and we need to have 493 00:27:11,160 --> 00:27:12,440 Speaker 2: them spread the wealth a bit better. 494 00:27:13,160 --> 00:27:15,679 Speaker 1: I love this. We've gone from quoting the Bible to 495 00:27:15,680 --> 00:27:19,719 Speaker 1: talking about weed. This is a great conversation. So, okay, 496 00:27:19,880 --> 00:27:23,520 Speaker 1: let's say nothing changes. Let's say things keep going on 497 00:27:23,720 --> 00:27:26,439 Speaker 1: the way that they're going on. Where are we headed 498 00:27:26,800 --> 00:27:27,280 Speaker 1: right now? 499 00:27:27,400 --> 00:27:27,640 Speaker 3: Yeah? 500 00:27:27,720 --> 00:27:30,280 Speaker 2: I mean, I discuss in the book what I call 501 00:27:30,359 --> 00:27:34,400 Speaker 2: the real road to serfdom, which is the way 502 00:27:34,400 --> 00:27:37,840 Speaker 2: in which a monopolized economy tends to lead to the 503 00:27:37,920 --> 00:27:40,400 Speaker 2: rise of an autocratic leader. And frankly, I think we're 504 00:27:40,400 --> 00:27:43,840 Speaker 2: pretty far down that road, I got to say. And 505 00:27:43,880 --> 00:27:46,439 Speaker 2: I think it happens this way.
You allow too much 506 00:27:46,480 --> 00:27:49,800 Speaker 2: of the economy to become monopolized, it takes too much 507 00:27:49,880 --> 00:27:54,919 Speaker 2: money from people. They become cynical, angry about democracy. You 508 00:27:55,000 --> 00:27:58,639 Speaker 2: then have a possibility for democracy to fix it. And 509 00:27:58,680 --> 00:28:01,560 Speaker 2: that's why I'm advocating we need to do something. If not, 510 00:28:01,600 --> 00:28:05,320 Speaker 2: people get more and more angry and increasingly say, all right, 511 00:28:05,320 --> 00:28:06,520 Speaker 2: I don't believe in democracy. 512 00:28:06,520 --> 00:28:07,439 Speaker 3: It can't do anything. 513 00:28:07,880 --> 00:28:11,760 Speaker 2: I need some strong leader who's gonna put me first 514 00:28:11,800 --> 00:28:13,800 Speaker 2: and directly deliver the. 515 00:28:13,800 --> 00:28:14,399 Speaker 3: Money to me. 516 00:28:15,200 --> 00:28:17,800 Speaker 2: And I think that is the way you see the 517 00:28:17,880 --> 00:28:21,480 Speaker 2: rise of the populist dictator around the world. I'm 518 00:28:21,480 --> 00:28:23,359 Speaker 2: not just talking about the United States, though there's obvious 519 00:28:23,440 --> 00:28:26,440 Speaker 2: parallels to the United States. Around the world, there's been dictators 520 00:28:26,480 --> 00:28:29,040 Speaker 2: who have come to power in our era on the 521 00:28:29,080 --> 00:28:31,800 Speaker 2: back of economic dissatisfaction. 522 00:28:32,720 --> 00:28:35,440 Speaker 1: I hope that there are some members of Congress who 523 00:28:35,800 --> 00:28:38,480 Speaker 1: have read your book and who are listening to this podcast.
524 00:28:39,160 --> 00:28:41,320 Speaker 1: I suspect, however, that the vast majority of the people 525 00:28:41,320 --> 00:28:43,560 Speaker 1: who are listening to this, who are watching this, are 526 00:28:44,160 --> 00:28:47,680 Speaker 1: not members of Congress, are not directly able to push 527 00:28:47,720 --> 00:28:50,400 Speaker 1: those levers of government. What is an individual able to do? 528 00:28:51,040 --> 00:28:54,440 Speaker 2: I think you have to, in terms of citizen voting, 529 00:28:55,000 --> 00:28:58,680 Speaker 2: be serious about people who are truly serious about the threat 530 00:28:58,680 --> 00:29:01,440 Speaker 2: of monopoly power, and don't just, like, say a few 531 00:29:01,480 --> 00:29:04,080 Speaker 2: things and then quietly vote. I mean, I worked in 532 00:29:04,120 --> 00:29:06,160 Speaker 2: the White House, and I worked on trying to get 533 00:29:06,240 --> 00:29:08,680 Speaker 2: bills passed. And there's a lot of people who take 534 00:29:08,720 --> 00:29:11,880 Speaker 2: too much money from tech platforms and when push comes 535 00:29:11,880 --> 00:29:14,680 Speaker 2: to shove will never do anything to limit their business model. 536 00:29:15,000 --> 00:29:16,800 Speaker 2: You have to be really careful who you vote for 537 00:29:17,200 --> 00:29:18,720 Speaker 2: and their stances on monopoly. 538 00:29:19,400 --> 00:29:21,840 Speaker 1: Okay, I know we're getting back into politics here. But 539 00:29:22,000 --> 00:29:25,960 Speaker 1: ultimately this is where stuff ends up, because scolding individual 540 00:29:26,000 --> 00:29:29,400 Speaker 1: people for continuing to use Apple products or Google products, 541 00:29:29,640 --> 00:29:32,240 Speaker 1: or trying to tell somebody to stop shopping on Amazon, 542 00:29:32,640 --> 00:29:34,160 Speaker 1: it's not going to get us anywhere.
543 00:29:35,040 --> 00:29:38,160 Speaker 2: Look, we have this illusion that we individuals can stand 544 00:29:38,160 --> 00:29:40,520 Speaker 2: up to the companies that are so much more powerful than 545 00:29:40,560 --> 00:29:44,720 Speaker 2: we are. And I think we have learned and we 546 00:29:44,880 --> 00:29:48,360 Speaker 2: know that in every society, every civilization, there are going 547 00:29:48,440 --> 00:29:52,320 Speaker 2: to be platforms that are essential. And I was in 548 00:29:52,400 --> 00:29:55,920 Speaker 2: Rome with my kids earlier this year, and you go to the 549 00:29:56,200 --> 00:29:59,200 Speaker 2: ancient center of Rome, and there is the forum, which 550 00:29:59,240 --> 00:30:02,719 Speaker 2: is like that place where they have markets, they sell stuff, 551 00:30:03,040 --> 00:30:05,080 Speaker 2: they also have speeches, they have the courts. Everything is 552 00:30:05,080 --> 00:30:07,720 Speaker 2: happening there. It's always been there, and so it's no 553 00:30:08,280 --> 00:30:11,600 Speaker 2: answer to say, well, if you don't like where everything's happening, 554 00:30:11,640 --> 00:30:12,320 Speaker 2: go somewhere else. 555 00:30:12,520 --> 00:30:13,360 Speaker 3: That's not really an answer. 556 00:30:13,440 --> 00:30:16,280 Speaker 2: There's always going to be essential platforms, and in our 557 00:30:16,320 --> 00:30:20,200 Speaker 2: times they are companies like Amazon, and X is its 558 00:30:20,240 --> 00:30:23,640 Speaker 2: own essential platform of speech. Unless you're going to become, 559 00:30:24,320 --> 00:30:26,640 Speaker 2: you know, a hermit who lives in a cave, you cannot 560 00:30:26,880 --> 00:30:30,400 Speaker 2: ignore that we have these essential platforms. But the platforms 561 00:30:30,720 --> 00:30:33,960 Speaker 2: have a problem of main character syndrome. They think they 562 00:30:34,000 --> 00:30:37,160 Speaker 2: are it, you know, they think they're the story.
But 563 00:30:37,200 --> 00:30:40,360 Speaker 2: they are supposed to be the sort of servants of 564 00:30:40,360 --> 00:30:42,840 Speaker 2: the rest of the economy. You know, we've dealt with 565 00:30:42,880 --> 00:30:43,960 Speaker 2: this problem before, like 566 00:30:43,840 --> 00:30:44,640 Speaker 3: with the trains. 567 00:30:44,800 --> 00:30:47,240 Speaker 2: This was a big problem in the nineteenth century. We've 568 00:30:47,280 --> 00:30:50,560 Speaker 2: like forgotten our own history because we're calling stuff tech. 569 00:30:51,080 --> 00:30:53,400 Speaker 2: So I think the government should control and limit what 570 00:30:53,440 --> 00:30:56,480 Speaker 2: they're able to do and also how much they charge. 571 00:30:56,760 --> 00:30:59,280 Speaker 2: At some level, they need to be treated more like utilities. 572 00:30:59,400 --> 00:31:01,800 Speaker 2: I mean, think about the electric network and how we 573 00:31:02,040 --> 00:31:04,560 Speaker 2: limit how much they can charge you. If we didn't, 574 00:31:04,800 --> 00:31:07,160 Speaker 2: what would the electric company do? They'd say, all right, 575 00:31:07,200 --> 00:31:09,240 Speaker 2: want to give me one thousand dollars a month or 576 00:31:09,360 --> 00:31:11,440 Speaker 2: ten thousand dollars a month? You'd say no way, and 577 00:31:11,440 --> 00:31:13,560 Speaker 2: they'd say, okay, how does it feel to have no electricity? 578 00:31:14,040 --> 00:31:16,520 Speaker 2: And I think we need to think of these platforms 579 00:31:16,560 --> 00:31:21,040 Speaker 2: more like electricity, which is the platform for the rest 580 00:31:21,040 --> 00:31:22,760 Speaker 2: of us. I also think they need to be under 581 00:31:23,040 --> 00:31:25,680 Speaker 2: constant antitrust attention. But since this is not an 582 00:31:25,680 --> 00:31:27,920 Speaker 2: antitrust party, I won't get into that too much.
583 00:31:29,920 --> 00:31:32,280 Speaker 1: So we've talked about that turning point that was in 584 00:31:32,360 --> 00:31:36,560 Speaker 1: twenty thirteen. We're constantly now talking about, all right, we're 585 00:31:36,640 --> 00:31:40,600 Speaker 1: probably at some kind of inflection point for AI. Yeah, 586 00:31:40,760 --> 00:31:42,720 Speaker 1: where do you see that playing out? 587 00:31:43,160 --> 00:31:46,720 Speaker 2: Let me say something positive about AI before I turn 588 00:31:46,800 --> 00:31:52,600 Speaker 2: to the darker possibility. Okay, the sort of positive vision of AI 589 00:31:53,680 --> 00:31:56,720 Speaker 2: is that it actually is a great challenge to the 590 00:31:56,920 --> 00:31:57,800 Speaker 2: tech platforms. 591 00:31:57,880 --> 00:31:58,280 Speaker 3: Maybe. 592 00:31:58,760 --> 00:32:02,000 Speaker 2: I mean, I was using OpenAI earlier today for 593 00:32:02,080 --> 00:32:04,440 Speaker 2: various things, and as far as I could tell, I 594 00:32:04,440 --> 00:32:07,320 Speaker 2: didn't see a single ad or give any money to Google. 595 00:32:07,320 --> 00:32:08,520 Speaker 2: I used it for a lot of things I would 596 00:32:08,520 --> 00:32:11,200 Speaker 2: have used Google for, maybe, in the old days. And 597 00:32:11,640 --> 00:32:14,560 Speaker 2: I also, I think, was looking for some products and 598 00:32:14,720 --> 00:32:17,400 Speaker 2: didn't have to go through, you know, Amazon's 599 00:32:16,800 --> 00:32:20,560 Speaker 3: insane sponsored links. So there is a possibility. 600 00:32:20,880 --> 00:32:23,040 Speaker 2: And one of the things I'm a big believer 601 00:32:23,120 --> 00:32:26,680 Speaker 2: in, in technology, is you need a constant cycle of challengers. 602 00:32:27,200 --> 00:32:30,640 Speaker 2: So AI could be a challenge to the platforms and 603 00:32:30,680 --> 00:32:32,880 Speaker 2: could shake things up. And that's a very positive view.
604 00:32:33,280 --> 00:32:36,720 Speaker 2: The negative view is that it would reinforce the power 605 00:32:36,760 --> 00:32:41,840 Speaker 2: of the platforms, make them almost entirely invulnerable to competition, 606 00:32:42,440 --> 00:32:45,040 Speaker 2: give them more of a government-like status, and make 607 00:32:45,120 --> 00:32:47,720 Speaker 2: us even more couchlocked than before, where not only 608 00:32:47,760 --> 00:32:50,760 Speaker 2: can we not think about getting up to get a 609 00:32:50,800 --> 00:32:53,480 Speaker 2: different company, we can't even like write our own emails, 610 00:32:54,560 --> 00:32:57,600 Speaker 2: you know, where you just feel so totally dependent that 611 00:32:57,720 --> 00:33:00,280 Speaker 2: it's like the suit of armor. You become like this 612 00:33:01,160 --> 00:33:03,640 Speaker 2: enervated kind of creature and you have to climb into 613 00:33:03,640 --> 00:33:06,600 Speaker 2: your suit of armor to do anything. That's the scary future, 614 00:33:06,800 --> 00:33:09,200 Speaker 2: where we're so utterly dependent, we can't do anything without it. 615 00:33:10,240 --> 00:33:13,040 Speaker 1: You know, are you a hip hop fan? 616 00:33:13,720 --> 00:33:14,240 Speaker 3: Somewhat? 617 00:33:14,720 --> 00:33:20,440 Speaker 1: There's a track on DJ Shadow's Endtroducing album, and it's 618 00:33:20,480 --> 00:33:24,200 Speaker 1: called Why Hip Hop Sucks in Ninety Six. It's a very 619 00:33:24,200 --> 00:33:27,160 Speaker 1: short track, and all it is, is a little bit 620 00:33:27,200 --> 00:33:30,000 Speaker 1: of background music plays, and then a guy's voice says, 621 00:33:30,400 --> 00:33:39,600 Speaker 1: it's the money, money, money, money. Track ends. That's it.
622 00:33:39,600 --> 00:33:42,080 Speaker 1: It sounds to me like that's almost a soundtrack to 623 00:33:42,160 --> 00:33:44,120 Speaker 1: some of your book, which is to say that we 624 00:33:44,240 --> 00:33:46,520 Speaker 1: came in with all this optimism thinking, you know, our 625 00:33:46,560 --> 00:33:49,880 Speaker 1: ideas and our beliefs and all this other stuff is 626 00:33:49,880 --> 00:33:51,800 Speaker 1: going to really push us forward. And then it seems 627 00:33:51,840 --> 00:33:53,840 Speaker 1: like a lot of what screwed this up, honestly, is 628 00:33:53,920 --> 00:33:56,760 Speaker 1: just people fell victim to the promise of the money. 629 00:33:56,880 --> 00:34:02,360 Speaker 3: Cash rules everything around me. Dollar dollar bill, y'all. 630 00:34:02,400 --> 00:34:06,600 Speaker 1: There we go, there we go. But that's what it is. 631 00:34:08,800 --> 00:34:12,919 Speaker 2: Here's what I think is my ultimate prescription: if 632 00:34:12,960 --> 00:34:16,319 Speaker 2: we really believe something, like we did in the 633 00:34:16,320 --> 00:34:21,040 Speaker 2: early two thousands, you have to create structures to control the 634 00:34:21,080 --> 00:34:24,399 Speaker 2: power of money to corrode it. Structure beats out good 635 00:34:24,440 --> 00:34:28,319 Speaker 2: intentions, because everything is going to get corrupted and turned 636 00:34:28,360 --> 00:34:32,200 Speaker 2: to shit by that creeping need for more and more 637 00:34:32,239 --> 00:34:34,839 Speaker 2: little pieces of money. It's sort of like the way, 638 00:34:34,840 --> 00:34:37,239 Speaker 2: if you want a nonprofit, you can't be a nonprofit 639 00:34:37,320 --> 00:34:40,720 Speaker 2: like the Red Cross and also have little profit stuff 640 00:34:40,719 --> 00:34:43,160 Speaker 2: on the side. You can't say, basically, we're about saving 641 00:34:43,200 --> 00:34:46,160 Speaker 2: people in disasters.
Oh, and also we advertise and hold 642 00:34:46,160 --> 00:34:47,840 Speaker 2: parties on the side, or something, or, I don't know, 643 00:34:47,880 --> 00:34:50,120 Speaker 2: you know, whatever else, or we sell merch. 644 00:34:50,239 --> 00:34:52,680 Speaker 2: We sell merch, because the merch part is going to grow. 645 00:34:53,160 --> 00:34:54,520 Speaker 3: You have to be strict about this 646 00:34:54,600 --> 00:34:56,200 Speaker 1: stuff. Can't serve two masters. 647 00:34:56,520 --> 00:34:59,960 Speaker 2: That's right, or else it creeps. You know, to give credit, 648 00:35:00,120 --> 00:35:04,879 Speaker 2: Wikipedia took a different path. Wikipedia in the early two 649 00:35:04,920 --> 00:35:08,560 Speaker 2: thousands had roughly the same traffic as Google, if not more. 650 00:35:09,800 --> 00:35:13,840 Speaker 2: They had a very easy path to riches, and Jimmy Wales, 651 00:35:13,840 --> 00:35:15,880 Speaker 2: who's the head of it, you know, it's almost like 652 00:35:16,000 --> 00:35:17,800 Speaker 2: every morning he woke up and he had a button 653 00:35:18,440 --> 00:35:21,600 Speaker 2: marked, like, billionaire, that he could have pushed, and he didn't. 654 00:35:21,760 --> 00:35:23,960 Speaker 2: And he's like, if we ever have advertising on Wikipedia, 655 00:35:23,960 --> 00:35:26,640 Speaker 2: it's going to turn to shit. And like, Wikipedia is 656 00:35:26,680 --> 00:35:29,879 Speaker 2: not perfect, but it is a nonprofit. It makes plenty 657 00:35:29,920 --> 00:35:32,960 Speaker 2: of money by donations, and it hasn't crept into 658 00:35:33,920 --> 00:35:36,560 Speaker 2: this thing. So I say it again: structure beats 659 00:35:36,600 --> 00:35:39,000 Speaker 2: good intentions.
If you believe in something, you've got to 660 00:35:39,040 --> 00:35:41,759 Speaker 2: start at the beginning and structure it right, or else 661 00:35:41,800 --> 00:35:44,919 Speaker 2: it's going to fall prey to the creep of cash 662 00:35:44,960 --> 00:35:45,880 Speaker 2: rules everything around me. 663 00:35:46,239 --> 00:35:47,320 Speaker 3: C.R.E.A.M., get the money. 664 00:35:52,000 --> 00:35:54,600 Speaker 1: Thank you so much for listening to kill Switch. If 665 00:35:54,640 --> 00:35:56,279 Speaker 1: you want to hit us up, you can email us 666 00:35:56,320 --> 00:36:00,600 Speaker 1: at kill Switch at Kaleidoscope dot NYC, or on Instagram 667 00:36:00,680 --> 00:36:03,279 Speaker 1: we're at kill Switch Pod. And if you've got a 668 00:36:03,320 --> 00:36:06,280 Speaker 1: second, please do leave us a review, give us a rating. 669 00:36:06,360 --> 00:36:08,880 Speaker 1: It helps other people find the show, which helps us 670 00:36:08,960 --> 00:36:11,359 Speaker 1: keep doing our thing. And once you've done that, did 671 00:36:11,400 --> 00:36:13,960 Speaker 1: you know that kill Switch is on YouTube? You can 672 00:36:14,000 --> 00:36:17,040 Speaker 1: search for us there at kill Switch underscore pod, or 673 00:36:17,120 --> 00:36:19,480 Speaker 1: the link for that and everything else is in the 674 00:36:19,520 --> 00:36:23,240 Speaker 1: show notes. kill Switch is hosted by me, Dexter Thomas. 675 00:36:23,480 --> 00:36:27,160 Speaker 1: It's produced by Shena Ozaki, Darluk Potts, and Julian Nutter. 676 00:36:27,480 --> 00:36:30,160 Speaker 1: Our theme song is by me and Kyle Murdoch, and 677 00:36:30,239 --> 00:36:33,600 Speaker 1: Kyle also mixed the show. From Kaleidoscope, our executive producers 678 00:36:33,640 --> 00:36:37,920 Speaker 1: are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, 679 00:36:37,920 --> 00:36:41,799 Speaker 1: our executive producers are Katrina Norvell and Nikki Ettore.
That's 680 00:36:41,840 --> 00:36:50,760 Speaker 1: it from us. Catch you on the next one.