Speaker 1: Cool Zone Media.

Speaker 2: Hey everybody, Robert Gosh-darn Evans for you here, and you know, for the end of the year, to celebrate and stuff, we've got our normal Behind the Bastards content coming to you. Do not worry, that's all going to continue as normal. But we also wanted to highlight some other shows in our network, most of which are new and launched this year. We've got some compilation best-of episodes that we think the Bastards audience is going to love, and we're delivering them to you now in a special format with fewer ads. So today you're going to hear some episodes of Better Offline, Ed Zitron's excellent critical tech industry podcast, which has taken the tech world by storm. And I'm excited for you to learn about the man who killed Google Search, about Sam Altman, the CEO of OpenAI, and why he's dangerous for society, and what Ed Zitron calls the rot economy.

Speaker 3: Hello, and welcome to Better Offline. I'm your host, Ed Zitron. And in the next two episodes, I'm going to tell you the names of some of the people responsible for destroying the Internet. And I'm going to start on February fifth, twenty nineteen, when Ben Gomes, Google's former head of Search, well, he had a problem. Jerry Dischler, then the VP and GM of Ads at Google, and Shiv Venkataraman, then the VP of Engineering, Search and Ads on Google Properties, had called something called a code yellow for search revenue due to, and I quote from emails that came out as part of Google's antitrust hearing, "steady weakness in the daily numbers" and a likeliness that it would end the quarter significantly behind in metrics. That's kind of unclear for those unfamiliar with Google's internal, kind of Scientology-esque jargon, which means most people. Let me explain: a code yellow isn't a terrible need to piss or some sort of crisis of moderate severity.
The yellow, according to Steven Levy's tell-all book about Google, refers to, and I promise this is not a joke, the color of a tank top that a former VP of Engineering called Wayne Rosing used to wear during his time at the company. It's essentially the equivalent of DEFCON 1 and activates, as Levy explained, a war-room-like situation where workers are pulled from their desks and into a conference room where they tackle the problem as a top priority. Any other projects or concerns are sidelined. And independently, I've heard there are other colors, like purple. I'm not going to get into that, though, it's quite boring and irrelevant to this situation.

In emails released as part of the Department of Justice's antitrust case against Google, as I previously mentioned, Dischler laid out several contributing factors: search query growth was significantly behind forecast, the timing of revenue launches was significantly behind, and he had this vague worry that several advertiser-specific and sector weaknesses existed in search.

Now I want to cover something, because I've messed up and I really want to be clear about this. I've previously and erroneously referred to the code yellow as something that Gomes raised as a means of calling attention to the proximity of Google's ad side getting a little too close to Search. I'm afraid the truth is extremely depressing and so much grimmer. The code yellow was actually the rumble of the goddamn rot economy, with Google's revenue arm sounding the alarm that its golden goose wasn't laying enough eggs. Gomes, a Googler of nineteen years that basically built the foundation of modern search engines, should go down as one of the few people in tech that actually fought for an actual principle, and he was destroyed by a guy called Prabhakar Raghavan, a computer scientist class traitor that sided with the management consultancy sect. More confusingly, one of their problems was that there was insufficient growth in queries, as in the amount of things that people were asking Google.
It's a bit like if Ford decided that things were going poorly because their drivers weren't putting enough goddamn miles on their trucks. This whole story has personally upset me, and I think you're going to hear that in this, but going through these emails is just very depressing. Anyway.

A few days beforehand, on February first, twenty nineteen, Kristen Gil, then Google's VP Business Finance Officer, had emailed Shashi Thakur, then Google's VP of Engineering, Search and Discover, saying that the ads team had been considering a code yellow to close the search gap it was seeing, vaguely referring to how critical that growth was to an unnamed company plan. To be clear, this email was in response to Thakur stating that there was nothing the search team could do to operate at the fidelity of growth that the ads department had demanded. Shashi forwarded the email to Gomes, asking if there was any way to discuss this with Sundar Pichai, Google's CEO, and declared that there was no way he would sign up for a high-fidelity business metric for daily active users on search. Thakur also said something that I've been thinking about constantly since I read these emails: that there was a good reason that Google's founders separated search from ads. I want you to remember that line for later.

A day later, on February second, twenty nineteen, Thakur and Gomes shared their anxieties with Nick Fox, a Vice President of Search and Google Assistant, entering a multiple-day-long debate about Google's sudden lust for growth. This thread is a dark window into the world of growth-focused tech, where Thakur listed the multiple points of disconnection between ads and search, discussing how the search team wasn't able to finely optimize engagement on Google without hacking it, a term that means effectively tricking users into spending more time on a site, and that doing so would lead them to, and I quote, "abandon work on efficient journeys."
In one email, Fox adds that there was a pretty big disconnect between what finance and ads want and what Search was doing. Every part of this story pisses me off so much. When Gomes pushed back on the multiple requests for growth, Fox added that all three of them were responsible for Search, and that Search was, and again I quote, "the revenue engine of the company," and that bartering with the ads and finance teams was now potentially the new reality of their jobs. On February sixth, twenty nineteen, Gomes said that he believed that Search was getting too close to the money, and ended his email by saying that he was concerned that growth was all that Google was thinking about.

On March twenty-second, twenty nineteen, Google VP of Product Management Darshan Kantak would declare the end of the code yellow. The thread mostly consisted of congratulatory emails, until Gomes made the mistake of responding, congratulating everyone, saying that the plans architected as part of the code yellow would do well throughout the year. Enter Prabhakar Raghavan, then Google's head of Ads and the true mastermind behind the code yellow, who would respond curtly, saying that the current revenue targets were addressed by "heroic RPM engineering" and that "core query softness continued without mitigation," a very clunky way of saying that despite these changes, query growth was not happening at the rate he needed it to.

A day later, Gomes emailed Fox and Thakur an email he intended to send to Raghavan. He led by saying that he was annoyed both personally and on behalf of the search team.
In this very long email, he explained in arduous detail how one might increase engagement with Google Search, but specifically added that they could increase queries quite easily in the short term, but only in user-negative ways, like turning off spell correction or ranking improvements, or placing refinements, effectively labels, all over the page, adding that it was possible that there are trade-offs here between the different kinds of user negativity caused by engagement hacking, and that he was deeply, deeply uncomfortable with this. He also added that this was the reason he didn't believe that queries, as in the amount of things that people were searching on Google, were a good metric to measure search, and that the best defense against the weaknesses of queries was to create compelling user experiences that make users want to come back. Crazy idea there, what if the product was good? Not good enough for Prabhakar.

So, a little bit of history about Google here. They regularly, throughout the year, do core updates to Search. These are updates that change the algorithm. Let's say, okay, we're going to suppress this kind of thing, we can elevate this kind of thing. And they are actually the reason that search changes. It's why certain sites suddenly disappear or reappear, it's why some sites get a ton of traffic and some don't get any, and so on and so forth. But they do a lot of them. The one that's really interesting, and a little bastardy, and I went and looked through pretty much the last decade of these, the one that stood out to me was the March twenty nineteen core update to Search, which happened about a week before the end of the code yellow, meaning that it's very likely that this was a result of Prabhakar's bullshit. This was expected to be one of the largest updates to search in a very long time, and I'm quoting Search Engine Journal there.
Yet when it launched, many found that the update mostly rolled back changes, and traffic was increasing to sites that had been suppressed by previous updates, like Google Search's Penguin update from twenty twelve that specifically targeted spammy search results. There were others that were seeing traffic as well, from an update that happened on the first of August twenty eighteen, a few months after Gomes became head of Search. While I'm guessing here, I really don't know, I do not work for Google, I do not have friends there, I think the timing of the March twenty nineteen core update, along with the traffic increases to previously suppressed sites that one hundred percent were spammy SEO nonsense, I think these suggest that Google's response to the code yellow was to roll back changes that were made to maintain the quality of search.

A few months later, in May twenty nineteen, Google would roll out a redesign of how ads were shown on Google Search, specifically on mobile, replacing the bright green ad label and URL color on ads with a tiny, little, bolded black note that said "Ad" in the smallest font you could possibly put there, with the link looking otherwise identical to a regular search link. I guess that's how they managed to start hitting their numbers, huh. And then in January twenty twenty, Google would bring this change to desktop, and The Verge's Jon Porter would suggest that it made Google's ads look just like search results now. Awesome.

Five months later, a little over a year after the code yellow situation, Google would make Prabhakar Raghavan the head of Google Search, with Jerry Dischler taking his place as the head of Ads. After nearly twenty years of building Google Search, Gomes would be relegated to the SVP of Education at Google.
Gomes, who was a critical part of the original team that made Google Search work, who has been credited with establishing the culture of the world's largest and most important search engine, was chased out by a growth-hungry managerial type, several of them, actually, led by Prabhakar Raghavan, a management consultant wearing an engineer costume.

As a side note, by the way, I use the term management consultant there as a pejorative. While he exhibits all the same bean-counting, morally unguided behaviors of a management consultant, from what I can tell, Raghavan has never actually worked in that particular sector of the economy. But you know who has? Sundar Pichai, the CEO of Google, who previously worked at McKinsey, arguably the most morally abhorrent company that's ever existed, having played roles both in the two thousand and eight financial crisis, where it encouraged banks to load up on debt and flawed mortgage-backed securities, and the ongoing opioid crisis, where it effectively advised Purdue Pharma on how to growth-hack sales of OxyContin, an extremely addictive painkiller. McKinsey has paid nearly one billion dollars over several settlements due to its work with Purdue. But I'm getting sidetracked. One last point, though: McKinsey is actively anti-labor. When a company brings in a McKinsey consultant, they're often there to advise on how to cut costs, which inevitably means layoffs and outsourcing. McKinsey is to the middle class what flesh-eating bacteria is to skin.

But back to the emails, which are a stark example of the monstrous, disgusting rot economy, the growth-at-all-costs mindset that's dominating the tech ecosystem. And if you take one thing away from this episode, I want it to be the name Prabhakar Raghavan, and an understanding that there are people responsible for the current state of the Internet. These emails, which I really encourage you to look up, and if you go to wheresyoured.at you'll be able to see a newsletter that has links to them.
Well, these emails tell a dramatic story about how Google's finance and advertising teams, led by Raghavan, with the blessing of CEO Sundar Pichai, the McKinsey guy, actively worked to make Google worse to make the company more money. This is exactly what I mean when I talk about the rot economy, an illogical, product-destroying mindset that turns products you love into torturous, frustrating quasi-tools that require you to fight the company to get the thing you want.

Ben Gomes was instrumental in making Search work, both as a product and a business. He joined the company in nineteen ninety nine, a time long before Google established dominance in the field, and the same year when Larry Page and Sergey Brin tried to sell the company to Excite for one million dollars, only to walk away after Vinod Khosla, an Excite investor and co-founder of Sun Microsystems who's now a VC who tried to stop people going to a beach in Half Moon Bay, well, he tried to lowball them with a seven hundred and fifty thousand dollar offer, also known as a one hundred square foot apartment in San Francisco.

In an interview with Fast Company's Harry McCracken from twenty eighteen, Gomes framed Google's challenge as taking the PageRank algorithm from one machine to a whole bunch of machines, and they weren't very good machines at the time. Despite his impact and tenure, Gomes had only been made head of Search in the middle of twenty eighteen, after John Giannandrea moved to Apple to work on its machine learning and AI strategy. Gomes had been described as Google Search's soul, beloved for his ability to communicate across Google's many, quite decentralized departments.
Every single article I've read about Gomes and his tenure at Google spoke of a man deeply ingrained in the foundation of one of the most important technologies ever made, a man who had dedicated decades to maintaining a product with a, and I quote Gomes here, "guiding light of serving the user" and using technology to do that. And when finally given the keys to the kingdom, the ability to elevate Google Search even further, he was ratfucked by a series of rotten careerists trying to please Wall Street, led by Prabhakar Raghavan.

Do you want to know what Prabhakar Raghavan's old job was? Prabhakar Raghavan, the new head of Google Search, the guy that ran Google Search, that runs Google Search right now, that is running Google Search into the goddamn ground. Do you want to know what his job was, his job before Google? He was the head of search for goddamn Yahoo from two thousand and five through two thousand and twelve, when he joined the company. When Prabhakar Raghavan took over Yahoo Search, they held a thirty point four percent market share, not far from Google's own thirty six point nine percent, and miles ahead of the fifteen point seven percent that Microsoft's MSN Search had. By May twenty twelve, Yahoo was down to just thirteen point four percent, had shrunk for the previous nine consecutive months, and was being beaten by even the newly released Bing. That same year, Yahoo had the largest layoffs in its corporate history, shedding two thousand employees, or fourteen percent of its overall workforce. The man who deposed Ben Gomes, someone who worked on Google Search from its very beginnings, was so shit at his job that in two thousand and nine, Yahoo effectively threw in the towel on its own search tech, instead choosing to license Bing's engine in a ten-year deal.
If we take a long view of things, this likely precipitated the overall decline of the company, which went from being worth one hundred and twenty five billion dollars at the peak of the dot-com boom to being sold to Verizon for four point eight billion dollars in twenty seventeen, which is roughly a three thousand square foot apartment in San Francisco. With Search no longer a priority and making less money for the company, Yahoo decided to pivot into Web two point oh and original content, making some bets that paid off, but far, far too many that did not. It spent one point one billion dollars on Tumblr in twenty thirteen, only for Verizon to sell it for just three million dollars in twenty nineteen. It bought Zimbra in two thousand and seven, ostensibly to compete with the new Google Apps productivity suite, only to sell it for a reported fraction of the original purchase price to VMware a few years later. That's not his fault, but nevertheless, Yahoo was a company without a mission, a purpose, or an objective. Nobody, and I'll speculate even those leading the company, really knew what it was or what it did.

Anyway, just a big shout-out right now to Kara Swisher, who referred to Prabhakar as well respected when he moved from Yahoo to Google. You absolutely nailed it, Kara. Bang-up job.

In an interview with ZDNet's Dan Farber from two thousand and five, Raghavan spoke of his intent to align the commercial incentives of a billion content providers with social good intent while at Yahoo, and his eagerness to inspire the audience to give more data. What? Anyway, before that, it's actually hard to find out exactly what Raghavan did, though according to ZDNet, he spent fourteen years doing search and data mining research at IBM.
In April twenty eleven, The Guardian ran an interview with Raghavan that called him Yahoo's secret weapon, describing his plan to use rigorous scientific research and practice to inform Yahoo's business, from email to advertising, and how, under then-CEO Carol Bartz, the focus had shifted to the direct development of new products. It speaks of Raghavan's scientific approach and his steady, process-based logic to innovation "that is very different to the common perception that ideas and development are more about luck and spontaneity," a sentence that I'm only reading to you because I really need you to hear how stupid it sounds and how specious some of the tech press used to be. Frankly, this entire article is ridiculous, so utterly vacuous that I'm actually astonished. I don't want to name the reporter, I feel bad. What about Raghavan's career made this feel right? How has nobody connected these thoughts before? I have a day job. I run a PR firm. I am a blogger with a podcast, and I'm the one who said, yeah, okay, Dracula is now the CEO of the blood bank. Nobody saw this. Nobody saw this at the time. I just feel a bit crazy. I feel a bit crazy.

But to be clear, this was something written several years after Yahoo had licensed its search technology to Microsoft in a financial deal that the next CEO, Marissa Mayer, who replaced Bartz, was still angry about for years. Raghavan's reign as what ZDNet referred to as the search master was one so successful that it ended up being replaced by a search engine that not a single person in the world enjoys saying out loud. The Guardian article ran exactly one year before dramatic layoffs at Yahoo that involved firing entire divisions' worth of people, and four months before Carol Bartz would be fired by telephone by then-chairman Roy Bostock.
Her replacement, Scott Thompson, who previously served as president of PayPal, would last a whole five months in the role before he was replaced by former Google executive Marissa Mayer, in part because it emerged he lied on his resume about having a computer science degree. Hey, Prabhakar, did you not notice that? Anyway, whatever. Bartz joined Yahoo in two thousand and nine, so about four years into Prabhakar's reign of terror, I guess, and she joined in the aftermath of its previous CEO, Jerry Yang, refusing to sell the company to Microsoft for forty five billion dollars. In her first year, she laid off hundreds of people and struck a deal, that I've mentioned before, to power Yahoo Search using Microsoft's Bing search engine tech, with Microsoft paying Yahoo eighty eight percent of the revenue it gained from searches, a deal that made Yahoo a couple hundred million dollars for handing over the keys and the tech to its most high-traffic platform. As I previously stated, when Prabhakar Raghavan, Yahoo's secret weapon, was doing his work, Yahoo's search was so valuable that it was replaced by Bing. Its sole value, in fact. I mean, maybe I'm being a little unfair. There's a way of looking at this where you could say that Yahoo's entire value at the end of his career was driven by nostalgia and association with the days before he worked there.

Anyway, thanks to the state of modern search, it's actually very, very difficult to find much about Raghavan's history. It took me hours of digging through Google, and at one point Bing, embarrassingly, to find three or four articles that went into any depth about him. But from what I've gleaned, his expertise lies primarily in failing upwards, ascending through the ranks of technology on the momentum from the explosions he's caused.
In a Wired interview from twenty twenty one, glad-hander Steven Levy said Raghavan isn't the CEO of Google, he just runs the place, and described his addition to the company as a move from research to management. While Levy calls him a world-class computer scientist who has authored definitive texts in the field, which is true, he also describes Raghavan as choosing a management track, which definitely tracks with everything I've found out about him. Raghavan proudly declares that Google's third-party adtech plays a critical role in keeping journalism alive, a really shitty answer to a question that was also made at a time when he was aggressively incentivizing search-engine-optimized content and a year after he'd deposed someone who actually gave a shit about search.

Under Raghavan, Google has become less reliable, and it's dominated by search engine optimization and just outright spam. And I've said this before, but look, we complain about the state of Twitter under Elon Musk, and justifiably. He's a vile, antisemitic, racist bigot. We all know this, it's fully true, we can say it a million times. However, I'd argue that Raghavan, and by extension Sundar Pichai, deserve one hundred times more criticism. They've done unfathomable damage to society. You really can't fix the damage they've been doing and the damage they'll continue to do, especially as we go into an election. Raghavan and his cronies worked to oust Ben Gomes, a man who dedicated a good portion of his life to making the world's information more accessible, in the process burning the Library of Alexandria to the goddamn ground so that Sundar Pichai could make more than two hundred million dollars a year. And Raghavan, a manager hired by Sundar Pichai, a former McKinsey man, the king of managers, is an example of everything wrong with the tech industry. Despite his history as a true computer scientist with actual academic credentials,
Raghavan chose to bulldoze actual workers, people who did things and people that care about technology, and replace them with horrifying toadies that would make Google more profitable and less useful.

Since Prabhakar took the reins of Google Search in twenty twenty, Google Search has dramatically declined, with these core search updates I mentioned, allegedly made to improve the quality of results, having the adverse effect, increasing the prevalence of spammy, shitty, search-optimized content. It's frustrating. The anger you hear in my voice, the emotion, is because I've read all of these antitrust emails. I have gone through this guy's history, and I've read all the things about Ben Gomes too. Every article about Ben Gomes where they interviewed him is this guy just having these dreamy thoughts about the future of information and the complexity of delivering it at high speed. Every interview with Raghavan is some vague bullshit about how important data is. It's so goddamn offensive to me.

And all of this stuff happening is just one example of what I think are probably hundreds of things happening across startups, or that have happened across startups in the last ten or fifteen years, and big tech too. And it's because the people running the tech industry are no longer those who built it. Larry Page and Sergey Brin left Google in December twenty nineteen, the same year, by the way, as the code yellow thing, and while they remained as controlling shareholders, they clearly don't give a shit about what Google means anymore. Prabhakar Raghavan is a manager, and his career, from what I can tell, is mostly made up of: did some stuff at IBM, failed to make Yahoo anything of note, and fucked up Google so badly that every news outlet has run a story about how bad it is. This, this is the result of taking technology out of the hands of real builders and handing it to managers, at a time when management is synonymous with staying as far away from actual work as possible.
When you're a do-nothing looking to profit as much as possible, who doesn't use tech, who doesn't care about tech, and you only care about growth, well, you're not a user. You're a parasite. And it's these parasites that have dominated and are now draining the tech industry of its value. They're driving it into a goddamn ditch.

Raghavan's story is unique insofar as the damage he's managed to inflict, or, if we're being exceptionally charitable, failed to avoid in the case of Yahoo, on two industry-defining companies, and the fact that he did it without being a CEO or founder is remarkable. Yet he's far from the only example of a manager failing upwards.

I'm going to editorialize a bit here. I want you to think about your job history. I want you to think about the managers you've had. I've written a lot about management, and specifically to do with remote work and the whole thing around guys who don't do work, who are barely in the office, telling you you need to be in the office. This problem is everywhere. Managers are everywhere, and managers aren't doing work. I'm sure someone will email me now and say, well, I'm a manager and I do work all the time. Yeah, make sure you do. That's why you're emailing me telling me how good you are at your job. People who actually do work don't feel defensive about it. People who do things and are part of the actual profit center, they don't need a podcast to tell them they're good at their job.

What I think the problem is in modern American corporate society is that management is no longer synonymous with actually managing people. It's not about getting the people what they need. It's not about organizing things and making things efficient and good. It's not about execution. It's about handing work off to other people and getting paid handsomely.
And if you disagree, it's ez at betteroffline dot com. I will read your email, maybe I'll even respond. But the thing is, management has become a poison in America. Managers have become poisonous because managers are not actually held to any kind of standard. No, only the workers who do the work are. What happened to Ben Gomes is one of the most disgusting, disgraceful things to happen in the tech industry. It's an absolute joke. Ben Gomes was a goddamn hero. And I really need you to read the newsletter and read these emails. I need you to see how many times him and Thakur, great guy as well, were saying, hey, growth is bad for search. The thing that Ben Gomes was being asked to do was increase queries on Google, the literal amount that people search. There are many ways of looking at that and thinking, oh shit, that's not what you want. Surely you don't want no queries, you don't want people not using it at all. But queries going upwards linearly suggests that, if you're not magically getting user growth, at the very least people are not getting what they want on the first try. Which, by the way, kind of feels like how Google is nowadays, when you go to Google and the first result and the second result and the fifth result and the tenth result just don't get you what you need, because it's all that SEO crap.

Now, this is all theorizing, but what I think Prabhakar Raghavan did was, I think he took off all the fucking guidelines on Google Search. I think he rolled back changes specifically to make search worse, to increase queries, to give Google more chances to show you adverts. I am guessing, I don't have a source telling me this, but the pattern around the core search updates, the fact that Google Search started getting worse toward the middle and end of twenty nineteen and unquestionably dipped in twenty twenty, well, that's when Prabhakar took over. That's when the big man took the reins.
That's when Dracula got his job at the blood bank. And this is the thing: there's very little that you and I can actually do about this. But what we can do is say names like Prabhakar Raghavan a great deal of times, so that people like this can be known, so that the actions of these scurrilous assholes can be seen and heard and pointed at and spat upon. I'm not suggesting spitting on anyone, no violent acts. No, just be pissy on the internet like the rest of us. Now I'm ranting. I realize I'm ranting, but this subject really, really got to me. But it's not the only one. In the next episode, I'm going to conclude this sordid three-part fiasco with a few more examples, and how many of these managers, these bean counters, devoid of imagination or ability or anything of note save for that utter slug-like ability to protect oneself. I want to talk about how these people manage to obfuscate their true intentions by pretending to be engineers, by pretending to be technologists, and pretending to be innovators. I want to tell you all about how Adam Mosseri destroyed Instagram, and I want to tell you how little Sam Altman has achieved other than making himself and his friends rich. See you next time.

Thank you for listening to Better Offline. The editor and composer of the Better Offline theme song is Matt Osowski. You can check out more of his music and audio projects at mattosowski.com, M A T T O S O W S K I dot com. You can email me at ez at betteroffline dot com, or check out betteroffline dot com to find my newsletter and more links to this podcast. Thank you so much for listening.

Speaker 5: Better Offline is a production of Cool Zone Media. For more from Cool Zone Media, visit our website, coolzonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Speaker 3: Hello and welcome to Better Offline. I'm your host, Ed Zitron.
Also, as I discussed in the last episode, Sam Altman has spent more than a decade accumulating power and wealth in Silicon Valley without ever having to actually build anything, using a network of tech industry all-stars like LinkedIn co-founder and investor Reid Hoffman and Airbnb CEO Brian Chesky to insulate himself from responsibility and accountability. Yet things are beginning to fall apart, as years of half-baked ideas and terrible, terrible product decisions have kind of made society sour on the tech industry. And the last month has been particularly difficult for Sam, starting with the chaos caused by OpenAI blatantly mimicking Scarlett Johansson's voice for the new version of ChatGPT, followed by the resignation of researchers who claimed that OpenAI prioritized, and I quote, shiny products over AI safety, after the dissolution of OpenAI's safety team. I know, it's just, it's almost cliche.

Shortly thereafter, former OpenAI board member Helen Toner revealed that Sam Altman was fired from OpenAI because of a regular pattern of deception, one where Altman would give inaccurate info about the company's safety processes on multiple occasions, and his deceit was so severe that OpenAI's board only found out about the launch of ChatGPT, which, by the way, is OpenAI's first product that really made money, arguably the biggest product in tech. Do you want to know how they found out about it? Well, they found out when they were browsing Twitter. They found out then, not from the CEO of OpenAI, the company which they were the board of. Very weird.

Toner also noted that Altman was an aggressive political player with the board, correctly, by the way, worrying that, and I quote again, if Sam Altman had any inkling that the board might do something that went against him, he'd pull out all the stops, do everything in his power to undermine the board and to prevent them from even getting to the point of being able to fire him.
598 00:33:57,800 --> 00:34:00,480 Speaker 3: As a reminder, by the way, the board succeeded in 599 00:34:00,560 --> 00:34:03,600 Speaker 3: firing Sam Altman in November last year, but not for long, 600 00:34:03,920 --> 00:34:06,960 Speaker 3: with Altman returning as CEO a few days later, kicking 601 00:34:07,000 --> 00:34:09,759 Speaker 3: Helen Toner off of the board along with Ilya Sutskever, 602 00:34:09,920 --> 00:34:12,800 Speaker 3: a technical co-founder that Altman manipulated long enough to 603 00:34:12,840 --> 00:34:15,360 Speaker 3: build ChatGPT and ousted him the moment that 604 00:34:15,400 --> 00:34:18,680 Speaker 3: he chose to complain. Sutskever, by the way, has resigned now. 605 00:34:19,080 --> 00:34:21,879 Speaker 3: He's also one of the biggest technical minds there, 606 00:34:21,920 --> 00:34:26,040 Speaker 3: so how is OpenAI going to continue? Anyway, last week, 607 00:34:26,080 --> 00:34:29,279 Speaker 3: a group of insiders at various AI companies published an 608 00:34:29,320 --> 00:34:32,040 Speaker 3: open letter asking their overlords, the heads of 609 00:34:32,080 --> 00:34:35,640 Speaker 3: these companies, for the right to warn about advanced artificial 610 00:34:35,640 --> 00:34:40,720 Speaker 3: intelligence, a genuinely impressive monument to the bullshit 611 00:34:40,760 --> 00:34:43,600 Speaker 3: machine that Sam Altman has created. While there are genuine 612 00:34:43,600 --> 00:34:46,919 Speaker 3: safety concerns with AI, there really are, there are many 613 00:34:46,920 --> 00:34:50,200 Speaker 3: of them to consider. These people are desperately afraid of 614 00:34:50,239 --> 00:34:52,960 Speaker 3: the computer coming alive and killing them when they should 615 00:34:52,960 --> 00:34:56,839 Speaker 3: fear the non-technical asshole manipulator getting rich making egregious 616 00:34:56,880 --> 00:35:01,400 Speaker 3: promises about what AI can do. AI researchers have to 617 00:35:01,440 --> 00:35:04,560 Speaker 3: live up to Sam Altman's promises. Sam Altman doesn't. This 618 00:35:04,719 --> 00:35:07,840 Speaker 3: is not your friend. The problem is not the boogeyman 619 00:35:07,920 --> 00:35:10,960 Speaker 3: computer coming alive. That's not happening, man. What's happening is 620 00:35:11,000 --> 00:35:14,040 Speaker 3: this guy is leading your industry to ruin, and the 621 00:35:14,080 --> 00:35:16,840 Speaker 3: bigger concern that they should have should be about what 622 00:35:17,000 --> 00:35:20,680 Speaker 3: Leopold Aschenbrenner, a former safety researcher at OpenAI, had to 623 00:35:20,719 --> 00:35:23,920 Speaker 3: say on the Dwarkesh Patel podcast, where he claimed that 624 00:35:23,960 --> 00:35:27,680 Speaker 3: security processes at OpenAI were, and I quote, egregiously 625 00:35:27,760 --> 00:35:30,800 Speaker 3: insufficient, and that the company's priorities were focused 626 00:35:30,800 --> 00:35:35,200 Speaker 3: on growth over stability and security. These people are afraid 627 00:35:35,360 --> 00:35:38,319 Speaker 3: of OpenAI potentially creating a computer that can think 628 00:35:38,360 --> 00:35:41,440 Speaker 3: for itself that will come and kill them at a 629 00:35:41,480 --> 00:35:44,319 Speaker 3: time when they should be far more concerned about this 630 00:35:44,440 --> 00:35:48,480 Speaker 3: manipulative con artist that's running OpenAI. Sam Altman is 631 00:35:48,600 --> 00:35:53,360 Speaker 3: dangerous to artificial intelligence.
Not because he's building artificial general intelligence, 632 00:35:53,400 --> 00:35:55,920 Speaker 3: which is a kind of AI that meets or surpasses 633 00:35:56,000 --> 00:35:59,959 Speaker 3: human cognitive capabilities, by the way, kind of like Data 634 00:36:00,080 --> 00:36:03,359 Speaker 3: from Star Trek. They're afraid of that happening when they 635 00:36:03,400 --> 00:36:07,160 Speaker 3: should be afraid of Altman's focus. What does Sam Altman 636 00:36:07,320 --> 00:36:11,040 Speaker 3: care about? Because the only thing I can find reading 637 00:36:11,080 --> 00:36:14,360 Speaker 3: about what Sam Altman cares about is Sam bloody Altman. 638 00:36:14,880 --> 00:36:19,240 Speaker 3: And right now the progress attached to Sam Altman actually 639 00:36:19,280 --> 00:36:23,319 Speaker 3: isn't looking that great. OpenAI's growth is stalling, with 640 00:36:23,480 --> 00:36:26,560 Speaker 3: Alex Kantrowitz reporting that user growth has effectively come to 641 00:36:26,600 --> 00:36:29,160 Speaker 3: a halt, based on a recent release claiming that ChatGPT 642 00:36:29,200 --> 00:36:33,120 Speaker 3: had one hundred million users a couple of weeks ago, 643 00:36:33,280 --> 00:36:35,400 Speaker 3: which is, by the way, the exact same number that 644 00:36:35,400 --> 00:36:39,160 Speaker 3: the company claimed ChatGPT had in November twenty twenty three. 645 00:36:40,160 --> 00:36:43,480 Speaker 3: ChatGPT is also a goddamn expensive product to operate, 646 00:36:43,960 --> 00:36:46,719 Speaker 3: with the company burning through capital at this insane rate. 647 00:36:47,040 --> 00:36:49,919 Speaker 3: It's definitely more than seven hundred thousand dollars a day. 648 00:36:51,520 --> 00:36:53,200 Speaker 3: It's got to be in the millions, if not. What, 649 00:36:53,320 --> 00:36:57,160 Speaker 3: it's insane. And while OpenAI is aggressively monetizing ChatGPT, 650 00:36:57,280 --> 00:37:00,560 Speaker 3: both to customers and to businesses, it's still fiercely 651 00:37:00,680 --> 00:37:06,000 Speaker 3: far from crossing the break-even Rubicon. They keep leaking, 652 00:37:06,080 --> 00:37:08,560 Speaker 3: and they'll claim, oh, I didn't put that out there. 653 00:37:08,600 --> 00:37:10,920 Speaker 3: They keep telling people, oh, it's making billions of revenue, 654 00:37:11,000 --> 00:37:15,040 Speaker 3: but they never say profit. And eventually someone's going to 655 00:37:15,080 --> 00:37:17,640 Speaker 3: turn to them and say, hey, man, you can't just 656 00:37:17,719 --> 00:37:21,280 Speaker 3: do this for free or for negative. At some point, 657 00:37:21,800 --> 00:37:26,960 Speaker 3: Satya Nadella is going to call Sam Altman and say, Sammy, Sammy, 658 00:37:27,000 --> 00:37:28,080 Speaker 3: it's time, Sammy. 659 00:37:28,120 --> 00:37:29,360 Speaker 4: It's got to be a real business. 660 00:37:29,640 --> 00:37:32,680 Speaker 3: I assume he calls him that because the supernatural. But 661 00:37:32,760 --> 00:37:36,160 Speaker 3: as things get desperate, Sam Altman is going to use the 662 00:37:36,200 --> 00:37:40,680 Speaker 3: only approach he really has: sheer force of will. He's 663 00:37:40,680 --> 00:37:42,959 Speaker 3: going to push OpenAI to grow and sell into 664 00:37:42,960 --> 00:37:46,440 Speaker 3: as many industries as possible. And he's a specious hype man. 665 00:37:46,480 --> 00:37:49,120 Speaker 3: He's gonna be selling to other specious hype men, and the 666 00:37:49,200 --> 00:37:51,279 Speaker 3: Jim Cramers of the world are going to eat it up.
667 00:37:51,600 --> 00:37:54,680 Speaker 3: And they're all, all of them, the Marc Benioffs, the 668 00:37:54,719 --> 00:37:59,320 Speaker 3: Satya Nadellas, the Sundar Pichais, they're all desperate to connect 669 00:37:59,320 --> 00:38:02,759 Speaker 3: themselves with the future and with generative AI, and those that 670 00:38:02,840 --> 00:38:06,960 Speaker 3: he's selling to, the companies brokering deals, yes, even Apple, 671 00:38:08,160 --> 00:38:13,000 Speaker 3: they're desperate to connect their companies to another company which 672 00:38:13,080 --> 00:38:17,880 Speaker 3: is building a bubble, a bubble inflated by Sam Altman. 673 00:38:18,960 --> 00:38:19,960 Speaker 4: And I'd argue that this. 674 00:38:20,040 --> 00:38:23,560 Speaker 3: Is exceedingly dangerous for Silicon Valley and for the tech 675 00:38:23,600 --> 00:38:27,040 Speaker 3: industry writ large, as executives that have become disconnected from 676 00:38:27,040 --> 00:38:30,320 Speaker 3: the process of creating software and hardware follow yet another 677 00:38:30,480 --> 00:38:35,920 Speaker 3: non-technical founder hawking unprofitable, unsustainable, and hallucination-prone software. 678 00:38:36,200 --> 00:38:40,880 Speaker 3: It's just very frustrating. If there was a very technical 679 00:38:40,960 --> 00:38:44,239 Speaker 3: mind at these companies, they might walk away. And I'm 680 00:38:44,280 --> 00:38:47,080 Speaker 3: not going to give Tim Cook much credit, but looking 681 00:38:47,120 --> 00:38:50,279 Speaker 3: into it, I can't find any evidence that Apple is 682 00:38:50,320 --> 00:38:52,759 Speaker 3: buying a bunch of GPUs, the things that you use 683 00:38:52,840 --> 00:38:57,000 Speaker 3: to power these generative AI models. I found some researchers 684 00:38:57,000 --> 00:38:59,799 Speaker 3: and analysts suggesting that they would buy a lot. But 685 00:39:00,080 --> 00:39:02,880 Speaker 3: now OpenAI is doing a deal with Apple to 686 00:39:03,000 --> 00:39:07,600 Speaker 3: power the next iOS, and it's interesting. It is interesting 687 00:39:07,640 --> 00:39:10,840 Speaker 3: that Apple isn't doing this themselves. Apple, a company with 688 00:39:11,280 --> 00:39:13,400 Speaker 3: hundreds of billions of dollars in the bank, I believe, 689 00:39:14,200 --> 00:39:17,640 Speaker 3: that pretty much prints money. That alone makes me think 690 00:39:17,640 --> 00:39:19,440 Speaker 3: it's a bubble. Now, I might look like an asshole 691 00:39:19,440 --> 00:39:21,600 Speaker 3: if it comes out they have. But also, why are 692 00:39:21,680 --> 00:39:25,520 Speaker 3: they subcontracting this to OpenAI when they could build 693 00:39:25,520 --> 00:39:31,279 Speaker 3: it themselves, as Apple has always done. Very strange. It's 694 00:39:31,360 --> 00:39:35,760 Speaker 3: all so peculiar. But I wanted to get a little 695 00:39:35,760 --> 00:39:38,520 Speaker 3: bit deeper into the Sam Altman story. And as I 696 00:39:38,560 --> 00:39:41,680 Speaker 3: discussed last episode, Ellen Huet of Bloomberg, she's been doing 697 00:39:41,680 --> 00:39:44,880 Speaker 3: this excellent reporting on the man, and joins me today 698 00:39:44,920 --> 00:39:47,759 Speaker 3: to talk about the subject of a recent podcast, Sam 699 00:39:47,800 --> 00:39:58,880 Speaker 3: Altman's rise to power. So tell me a little bit 700 00:39:58,880 --> 00:39:59,880 Speaker 3: about the show you're working on.
701 00:40:01,120 --> 00:40:04,480 Speaker 1: The show is the new season of Foundering, which is 702 00:40:04,560 --> 00:40:08,440 Speaker 1: a serialized podcast from Bloomberg Technology. So this is season five, 703 00:40:08,920 --> 00:40:11,160 Speaker 1: and in every season we've told one story of a 704 00:40:11,280 --> 00:40:14,000 Speaker 1: high stakes drama in Silicon Valley. I was also the 705 00:40:14,000 --> 00:40:16,000 Speaker 1: host of season one, which came out several years ago and 706 00:40:16,120 --> 00:40:19,120 Speaker 1: was about WeWork, and we've done other companies since then, 707 00:40:19,200 --> 00:40:22,480 Speaker 1: and season five is about OpenAI and Sam Altman, 708 00:40:22,560 --> 00:40:27,120 Speaker 1: and I think we really tried to, you know, cover 709 00:40:27,160 --> 00:40:30,680 Speaker 1: the arc of the company's creation and where it is now. 710 00:40:31,200 --> 00:40:33,279 Speaker 1: But in doing so, we really tried to do a 711 00:40:33,360 --> 00:40:37,000 Speaker 1: character study of Sam Altman. Like, he's a very important 712 00:40:37,000 --> 00:40:39,520 Speaker 1: person in the tech industry right now, with a lot 713 00:40:39,560 --> 00:40:42,560 Speaker 1: of power, and we really wanted to ask ourselves a 714 00:40:42,640 --> 00:40:46,800 Speaker 1: question and to help listeners ask themselves a question. Should 715 00:40:46,800 --> 00:40:50,120 Speaker 1: we trust him? Should we trust this person who is 716 00:40:50,239 --> 00:40:53,840 Speaker 1: currently in a position of a lot of influence and 717 00:40:53,880 --> 00:40:57,480 Speaker 1: about whom there have been very serious, you know, allegations 718 00:40:57,480 --> 00:41:00,520 Speaker 1: and questions raised about, you know, to put it in 719 00:41:00,560 --> 00:41:06,680 Speaker 1: the words of the OpenAI board, his not consistently candid behavior. 720 00:41:06,800 --> 00:41:10,800 Speaker 1: And I think it's, you know, my hope is that 721 00:41:11,239 --> 00:41:15,040 Speaker 1: we give listeners a chance to hear kind of the 722 00:41:15,080 --> 00:41:19,480 Speaker 1: whole story. And it's, like, broader, you know, when there's 723 00:41:19,520 --> 00:41:21,560 Speaker 1: news that's happening, it can happen so quickly it's hard 724 00:41:21,600 --> 00:41:23,640 Speaker 1: to take a step back. And I think what the 725 00:41:23,640 --> 00:41:27,240 Speaker 1: show really does is it collects a lot of information 726 00:41:27,320 --> 00:41:29,520 Speaker 1: in one place, and we also have lots of new 727 00:41:29,560 --> 00:41:32,920 Speaker 1: information that you won't hear anywhere else, and interviews with 728 00:41:32,920 --> 00:41:35,640 Speaker 1: people who, you know, have worked with Sam, who knew 729 00:41:35,680 --> 00:41:38,399 Speaker 1: him when he was younger. We have an interview with 730 00:41:38,480 --> 00:41:42,560 Speaker 1: Sam's sister Annie, from whom he is estranged, and there's 731 00:41:42,560 --> 00:41:44,799 Speaker 1: a lot of material in there, I think, that tries 732 00:41:44,840 --> 00:41:47,200 Speaker 1: to get closer to this answer of, like, what should 733 00:41:47,200 --> 00:41:50,520 Speaker 1: we make of this person?
How should we think about 734 00:41:50,719 --> 00:41:54,120 Speaker 1: checks and balances of power when we have these companies 735 00:41:54,160 --> 00:41:57,560 Speaker 1: that are, by all appearances, gathering a lot of power, 736 00:41:57,600 --> 00:41:58,880 Speaker 1: and therefore the people who are running them have 737 00:41:58,880 --> 00:42:01,280 Speaker 1: a lot of power as well. So we have, it's 738 00:42:01,320 --> 00:42:06,360 Speaker 1: a five-episode arc, five-episode season, and the first 739 00:42:06,800 --> 00:42:10,200 Speaker 1: three episodes are out now to the general public, and 740 00:42:10,239 --> 00:42:13,040 Speaker 1: the last two will come out on subsequent Thursdays. And 741 00:42:13,080 --> 00:42:15,239 Speaker 1: if you would like to binge the whole season right away, 742 00:42:16,320 --> 00:42:19,960 Speaker 1: the episodes are available early to Bloomberg dot com subscribers. 743 00:42:22,040 --> 00:42:27,120 Speaker 3: So you've just started this series about Sam Altman and 744 00:42:27,160 --> 00:42:30,520 Speaker 3: his upbringing and so on, the growth of OpenAI and 745 00:42:30,560 --> 00:42:33,400 Speaker 3: Loopt and everything. Who are the people that have helped 746 00:42:33,440 --> 00:42:34,760 Speaker 3: him get where he is today? 747 00:42:34,800 --> 00:42:35,080 Speaker 4: Though? 748 00:42:36,239 --> 00:42:39,160 Speaker 1: So the making of Sam Altman is a really interesting 749 00:42:39,200 --> 00:42:42,000 Speaker 1: part of the overall story of Sam Altman. You know, 750 00:42:42,040 --> 00:42:46,160 Speaker 1: many people know him as the CEO of OpenAI 751 00:42:46,320 --> 00:42:49,400 Speaker 1: because that's the role he's been in when he has 752 00:42:49,960 --> 00:42:53,160 Speaker 1: risen to prominence, you know, beyond Silicon Valley. Like I 753 00:42:53,200 --> 00:42:55,400 Speaker 1: think for many years he was well known in Silicon Valley, 754 00:42:55,400 --> 00:42:58,040 Speaker 1: but this is, like, now he's kind of a household name, 755 00:42:58,760 --> 00:43:02,120 Speaker 1: and so it's important to understand where Sam came from. 756 00:43:02,200 --> 00:43:04,680 Speaker 1: You know, he's been in the valley, you know, 757 00:43:04,760 --> 00:43:07,640 Speaker 1: since two thousand and five, I think is when he 758 00:43:07,719 --> 00:43:09,560 Speaker 1: started college, two thousand and four, two thousand and five, 759 00:43:09,560 --> 00:43:12,000 Speaker 1: at Stanford. Then he dropped out, and then he joined 760 00:43:12,719 --> 00:43:16,360 Speaker 1: Y Combinator, the now famous startup accelerator. But he was 761 00:43:16,360 --> 00:43:21,520 Speaker 1: actually part of the first cohort of founders ever 762 00:43:21,760 --> 00:43:22,840 Speaker 1: in YC. 763 00:43:22,719 --> 00:43:24,960 Speaker 3: Along with Twitch as well, right? Yes. 764 00:43:24,880 --> 00:43:28,880 Speaker 1: Including the co-founders of Twitch and of Reddit, 765 00:43:29,239 --> 00:43:30,960 Speaker 1: and so Emmett Shear, you know, for those who know, 766 00:43:31,000 --> 00:43:33,600 Speaker 1: Emmett Shear has a, like, very short seventy-two-hour 767 00:43:33,719 --> 00:43:37,880 Speaker 1: cameo in the OpenAI Sam Altman firing saga. 768 00:43:38,719 --> 00:43:41,360 Speaker 1: But yes, Emmett and Sam were both in the same 769 00:43:41,920 --> 00:43:46,879 Speaker 1: YC batch.
So when we think about Sam's early career 770 00:43:46,920 --> 00:43:49,120 Speaker 1: in Silicon Valley, I think what's important to know is 771 00:43:49,120 --> 00:43:52,480 Speaker 1: that he rose very quickly, in part because he was 772 00:43:52,600 --> 00:43:57,879 Speaker 1: very successful in making these strategic, advantageous friendships and connections 773 00:43:57,960 --> 00:44:01,800 Speaker 1: with already established people in the valley. The most important 774 00:44:01,800 --> 00:44:04,759 Speaker 1: one is Paul Graham, who is, you know, one 775 00:44:04,760 --> 00:44:09,600 Speaker 1: of the founders of Y Combinator, and, you know, basically 776 00:44:09,640 --> 00:44:13,200 Speaker 1: like immediately took Sam under his wing when Sam joined 777 00:44:13,239 --> 00:44:18,319 Speaker 1: this first batch of YC. And yeah, Paul's a really 778 00:44:18,360 --> 00:44:20,520 Speaker 1: important mentor to Sam. He's kind of the first person 779 00:44:20,560 --> 00:44:26,640 Speaker 1: who really sees in Sam this, you know, ambition, this 780 00:44:26,719 --> 00:44:31,360 Speaker 1: hunger for power, this, like, drive to really build bigger 781 00:44:31,360 --> 00:44:33,680 Speaker 1: and bigger companies, even when, you know, they met when 782 00:44:33,719 --> 00:44:36,840 Speaker 1: Sam Altman was nineteen, so Paul, like, sees him as 783 00:44:36,840 --> 00:44:40,920 Speaker 1: a teenager and sees this future potential. And so yes, 784 00:44:40,960 --> 00:44:44,719 Speaker 1: you know, not only did Paul become a mentor to 785 00:44:44,800 --> 00:44:48,319 Speaker 1: him and sort of helped build Sam's profile over those 786 00:44:48,360 --> 00:44:50,360 Speaker 1: early years, because he would, you know, Paul Graham is 787 00:44:50,440 --> 00:44:52,680 Speaker 1: very famous for writing these essays about how to build 788 00:44:52,719 --> 00:44:55,440 Speaker 1: startups and how to build the best startups, and if 789 00:44:55,480 --> 00:44:57,719 Speaker 1: you're at all interested in building startups, you've read many 790 00:44:57,760 --> 00:45:00,560 Speaker 1: of them. They're kind of like almost like a startup bible, 791 00:45:01,040 --> 00:45:03,680 Speaker 1: and in many of them he extols the virtues of 792 00:45:03,719 --> 00:45:06,120 Speaker 1: Sam Altman. He talks about Sam's ambition, he talks about 793 00:45:06,120 --> 00:45:10,280 Speaker 1: Sam's cunning, his ability to, like, you know, make deals 794 00:45:10,280 --> 00:45:11,160 Speaker 1: and, like, think big and. 795 00:45:11,200 --> 00:45:14,400 Speaker 3: Never actually saying what Sam Altman has done, is what I've found. 796 00:45:14,880 --> 00:45:17,520 Speaker 1: Yeah, there are some interesting, you know, I've read many 797 00:45:17,560 --> 00:45:19,840 Speaker 1: of the things Paul has written about Sam. Some of 798 00:45:19,840 --> 00:45:25,400 Speaker 1: my favorite ones include Paul writing that within three minutes 799 00:45:25,400 --> 00:45:28,839 Speaker 1: of meeting Sam, this was when Sam was nineteen, Paul 800 00:45:28,920 --> 00:45:31,760 Speaker 1: thought to himself, Ah, so this is what a young 801 00:45:31,960 --> 00:45:33,839 Speaker 1: Bill Gates is like, or, you know, this is what 802 00:45:33,880 --> 00:45:35,640 Speaker 1: Bill Gates was like at nineteen, I think is the 803 00:45:35,680 --> 00:45:38,040 Speaker 1: exact quote. So, you know, he really built him up 804 00:45:38,040 --> 00:45:40,640 Speaker 1: in this way.
And I do think Paul had, like, 805 00:45:40,760 --> 00:45:43,520 Speaker 1: unique insight into Sam, like, they were close. They in 806 00:45:43,560 --> 00:45:47,080 Speaker 1: many ways I'm sure still are. But it is this 807 00:45:47,200 --> 00:45:50,360 Speaker 1: interesting role where, you know, Paul met Sam when he 808 00:45:50,400 --> 00:45:52,000 Speaker 1: really didn't have much to his name, and he really 809 00:45:52,000 --> 00:45:55,080 Speaker 1: elevated him early on through his writings as this, like, 810 00:45:55,200 --> 00:45:57,880 Speaker 1: startup founder to emulate, right, that other founders should be 811 00:45:57,920 --> 00:46:01,000 Speaker 1: emulating Sam. And then of course, as Sam progresses in 812 00:46:01,040 --> 00:46:04,480 Speaker 1: the Valley, he also starts to write these, like, startup 813 00:46:05,239 --> 00:46:10,239 Speaker 1: wisdom essays in a similar style to Paul. And then, 814 00:46:10,280 --> 00:46:12,120 Speaker 1: of course, the most important thing that happens is that 815 00:46:12,160 --> 00:46:15,080 Speaker 1: in twenty fourteen, when Paul decides he no longer wants 816 00:46:15,120 --> 00:46:17,200 Speaker 1: to run Y Combinator, which at this point is a 817 00:46:17,280 --> 00:46:19,799 Speaker 1: much bigger vehicle than it was when Sam first started. 818 00:46:19,800 --> 00:46:21,880 Speaker 1: It's no longer just a few startups, totally. It 819 00:46:21,880 --> 00:46:26,000 Speaker 1: has produced Stripe, Dropbox, Airbnb. This is a big job, 820 00:46:26,080 --> 00:46:29,040 Speaker 1: right, like, running Y Combinator. And when Paul wants to 821 00:46:29,040 --> 00:46:30,960 Speaker 1: hand it off to someone, you know, he has said 822 00:46:31,000 --> 00:46:34,000 Speaker 1: that the only person he considered giving this to was Sam. 823 00:46:34,040 --> 00:46:36,879 Speaker 1: So in twenty fourteen, when Sam is, I believe, twenty 824 00:46:36,920 --> 00:46:40,280 Speaker 1: eight years old, he becomes the president of Y Combinator, 825 00:46:40,280 --> 00:46:42,520 Speaker 1: and this is, you know, he had started a startup, 826 00:46:42,560 --> 00:46:45,640 Speaker 1: it didn't really work, he sold it and was starting 827 00:46:45,680 --> 00:46:48,640 Speaker 1: to dabble in angel investing. And at that point Paul 828 00:46:48,719 --> 00:46:52,239 Speaker 1: really elevated Sam to this new position of power. And 829 00:46:52,280 --> 00:46:54,720 Speaker 1: then he ran YC for a while and then started 830 00:46:54,719 --> 00:46:58,320 Speaker 1: OpenAI. And in starting OpenAI, he also leveraged 831 00:46:58,360 --> 00:47:03,080 Speaker 1: these, like, very useful connections with particularly powerful people who 832 00:47:03,080 --> 00:47:05,680 Speaker 1: could help him, such as Elon Musk, who was able 833 00:47:05,719 --> 00:47:09,000 Speaker 1: to give the vast majority of the pledged funding to 834 00:47:09,080 --> 00:47:13,839 Speaker 1: start OpenAI. Later, when Elon Musk splits from OpenAI, 835 00:47:14,040 --> 00:47:17,239 Speaker 1: Sam makes his very powerful partnership with Satya Nadella to 836 00:47:17,280 --> 00:47:21,040 Speaker 1: help fund OpenAI. Another important partnership that Sam has made, 837 00:47:21,480 --> 00:47:24,680 Speaker 1: you know, much earlier on was his friendship with Peter Thiel. 838 00:47:25,080 --> 00:47:26,840 Speaker 1: And one of the things Peter Thiel does is also, 839 00:47:27,760 --> 00:47:31,600 Speaker 1: you know, gives him millions of dollars to start investing.
840 00:47:31,920 --> 00:47:35,040 Speaker 1: This is, like, before Sam takes over at YC. Yeah. 841 00:47:35,080 --> 00:47:37,840 Speaker 1: And you know another thing that Paul did that really, 842 00:47:38,360 --> 00:47:41,399 Speaker 1: Paul Graham did that really helped Sam, was also he 843 00:47:41,440 --> 00:47:44,239 Speaker 1: gave, Paul had the opportunity to be one of the 844 00:47:44,239 --> 00:47:47,960 Speaker 1: first investors in Stripe. He was offered the chance to 845 00:47:48,080 --> 00:47:51,319 Speaker 1: invest thirty thousand dollars for four percent of Stripe, which, 846 00:47:51,360 --> 00:47:54,680 Speaker 1: of course now that Stripe is enormous, we all know 847 00:47:54,719 --> 00:47:57,959 Speaker 1: how valuable that was, and Paul split it with Sam. 848 00:47:58,000 --> 00:48:00,560 Speaker 1: He was like, oh, I might as well split this with Sam. 849 00:48:00,600 --> 00:48:03,440 Speaker 1: So Sam has said that that fifteen thousand dollars for 850 00:48:03,440 --> 00:48:06,440 Speaker 1: two percent of Stripe has been, you know, one 851 00:48:06,480 --> 00:48:08,040 Speaker 1: of his best performing angel investments. 852 00:48:08,080 --> 00:48:09,919 Speaker 3: Ever. I always wonder where 853 00:48:09,960 --> 00:48:13,360 Speaker 3: he got fifteen grand from. He was still working on 854 00:48:13,480 --> 00:48:16,799 Speaker 3: Loopt at the time. It's funny how that works. Anyway. 855 00:48:16,560 --> 00:48:21,320 Speaker 1: Yeah, my guess is fifteen grand was, I don't actually 856 00:48:21,320 --> 00:48:23,040 Speaker 1: know this, but my guess is fifteen grand was not 857 00:48:23,200 --> 00:48:25,120 Speaker 1: hard for him to pull up. And it's one of 858 00:48:25,160 --> 00:48:27,640 Speaker 1: those things where it really is, you know, access to, 859 00:48:28,719 --> 00:48:31,040 Speaker 1: access and relationships are the sorts of things that can 860 00:48:31,040 --> 00:48:33,759 Speaker 1: build a career and can lead to great wealth. Right, 861 00:48:33,800 --> 00:48:36,840 Speaker 1: like, Sam is now, you know, by our own internal 862 00:48:36,840 --> 00:48:40,080 Speaker 1: accounts and by other lists, a billionaire. And this money 863 00:48:40,120 --> 00:48:42,600 Speaker 1: comes, you know, not from OpenAI, but from 864 00:48:42,600 --> 00:48:44,640 Speaker 1: these angel investments that he's made early on that have 865 00:48:45,560 --> 00:48:47,959 Speaker 1: been enormously successful. 866 00:48:48,200 --> 00:48:50,799 Speaker 3: So you called him in one of the titles the 867 00:48:50,840 --> 00:48:53,759 Speaker 3: most Silicon Valley man alive. Is this what you're getting at, 868 00:48:53,760 --> 00:48:55,719 Speaker 3: this kind of power player mentality? 869 00:48:55,840 --> 00:48:59,000 Speaker 1: Yeah, I think it reflects a few things. One, 870 00:48:59,080 --> 00:49:02,000 Speaker 1: that even though he is, you know, he's in his 871 00:49:02,080 --> 00:49:04,840 Speaker 1: late thirties, he's been a player in Silicon Valley for 872 00:49:04,880 --> 00:49:06,760 Speaker 1: such a long time, you know, close to two decades. 873 00:49:07,600 --> 00:49:12,400 Speaker 1: And also that he's just someone who is extremely well connected.
874 00:49:12,480 --> 00:49:16,480 Speaker 1: So even before he took over Y Combinator, which I 875 00:49:16,480 --> 00:49:20,160 Speaker 1: think you could argue is, like, kind of king of 876 00:49:20,200 --> 00:49:22,640 Speaker 1: the startup world in some sense, like Y Combinator is, like, 877 00:49:22,680 --> 00:49:26,520 Speaker 1: you know, the top, totally. Even before he took over 878 00:49:26,560 --> 00:49:28,560 Speaker 1: Y Combinator, I think he was extremely well connected. 879 00:49:28,600 --> 00:49:31,200 Speaker 1: He's very social, he's very helpful, he's very efficient. Like, 880 00:49:31,280 --> 00:49:34,279 Speaker 1: many people have told me stories in which, you know, 881 00:49:34,400 --> 00:49:38,319 Speaker 1: calling Sam and talking for five minutes has solved their 882 00:49:38,360 --> 00:49:40,480 Speaker 1: problem because he knows exactly the right person to call 883 00:49:40,560 --> 00:49:42,520 Speaker 1: to fix it, or, you know, he's really good at 884 00:49:42,520 --> 00:49:45,080 Speaker 1: making deals. I think it's just clear he's extremely well 885 00:49:45,120 --> 00:49:51,560 Speaker 1: integrated into this world and has very successfully moved up 886 00:49:51,760 --> 00:49:55,839 Speaker 1: the Silicon Valley status ladder to the point where he 887 00:49:55,880 --> 00:49:57,680 Speaker 1: is now, which is kind of, you know, one of 888 00:49:57,680 --> 00:50:00,480 Speaker 1: the, you know, he's the CEO of one of 889 00:50:00,480 --> 00:50:02,880 Speaker 1: the arguably hottest companies in the valley right now. And 890 00:50:02,920 --> 00:50:05,560 Speaker 1: I think that that's not luck, right? Like, he didn't 891 00:50:05,600 --> 00:50:07,759 Speaker 1: just come up with, he's not like a nobody who 892 00:50:07,800 --> 00:50:10,040 Speaker 1: came up with an idea. It's like he has the 893 00:50:10,040 --> 00:50:14,040 Speaker 1: connections and has parlayed his connections into power to bring 894 00:50:14,120 --> 00:50:15,200 Speaker 1: him to the point he is now. 895 00:50:16,320 --> 00:50:19,640 Speaker 3: So, in your experience talking to people about Sam Altman, 896 00:50:20,440 --> 00:50:23,200 Speaker 3: how technical is he? What do you think? What have you 897 00:50:23,280 --> 00:50:26,600 Speaker 3: heard? Because you say there he wasn't lucky. But he 898 00:50:26,640 --> 00:50:29,280 Speaker 3: also does not appear to have successfully run a business, 899 00:50:30,000 --> 00:50:33,719 Speaker 3: because Loopt shut down, two people, well, two executives, tried 900 00:50:33,719 --> 00:50:35,200 Speaker 3: to get him fired from there, he got fired from 901 00:50:35,320 --> 00:50:37,719 Speaker 3: Y Combinator, which did very well, but at the same 902 00:50:37,719 --> 00:50:40,040 Speaker 3: time YC was basically a conveyor belt for money at 903 00:50:40,040 --> 00:50:44,319 Speaker 3: one point, not so much recently. Yeah, it just, it 904 00:50:44,440 --> 00:50:48,200 Speaker 3: feels weird that this completely non-technical, semi-non-technical 905 00:50:48,239 --> 00:50:49,799 Speaker 3: guy has ascended so far. 906 00:50:51,040 --> 00:50:54,120 Speaker 1: My sense is that's not maybe the most fair description. Like, 907 00:50:54,160 --> 00:50:57,000 Speaker 1: I think Sam is incredibly smart, and people say this 908 00:50:57,080 --> 00:50:59,120 Speaker 1: a lot, and, you know, I believe them.
I think 909 00:50:59,480 --> 00:51:02,359 Speaker 1: his special skill, you know, he obviously knows how to, 910 00:51:02,920 --> 00:51:05,040 Speaker 1: like, he's an engineer, he has training. I'm sure he 911 00:51:05,040 --> 00:51:06,960 Speaker 1: can build a lot of stuff. It seems like his 912 00:51:07,160 --> 00:51:12,600 Speaker 1: comparative advantage, his special skill, is relationships, deal making, figuring 913 00:51:12,600 --> 00:51:16,640 Speaker 1: out who exactly is the right person to help him 914 00:51:17,000 --> 00:51:19,240 Speaker 1: in whatever he's really trying to get done, and figuring 915 00:51:19,280 --> 00:51:22,520 Speaker 1: out the best way to get something to happen. You know, 916 00:51:22,600 --> 00:51:24,359 Speaker 1: one of the people I spoke to is someone who 917 00:51:25,080 --> 00:51:28,360 Speaker 1: knows Sam from when he was younger and knows him personally, 918 00:51:28,400 --> 00:51:32,520 Speaker 1: and said that his superpower is figuring out who's in 919 00:51:32,600 --> 00:51:34,920 Speaker 1: charge or figuring out who is in the best position 920 00:51:34,960 --> 00:51:37,359 Speaker 1: to help him, and then charming them so that they 921 00:51:38,120 --> 00:51:41,120 Speaker 1: help him with whatever goal he's trying to get done. 922 00:51:41,480 --> 00:51:43,840 Speaker 1: And I think that, like, yeah, one could argue that 923 00:51:43,840 --> 00:51:47,000 Speaker 1: that's actually a really good skill set if you want 924 00:51:47,000 --> 00:51:51,640 Speaker 1: to build a very big company, which, you know, I 925 00:51:51,640 --> 00:51:53,600 Speaker 1: think at this current moment he has, right, like, 926 00:51:53,600 --> 00:51:57,600 Speaker 1: OpenAI is really, you know, you can, there's a lot that 927 00:51:57,640 --> 00:52:00,600 Speaker 1: you can say about whether they're upholding their original mission 928 00:52:00,680 --> 00:52:01,960 Speaker 1: or not. You know, that's up for debate, but I 929 00:52:01,960 --> 00:52:07,359 Speaker 1: think that they've obviously been commercially successful so far, so. 930 00:52:08,200 --> 00:52:11,719 Speaker 3: It feels like Silicon Valley on some level. And I, 931 00:52:11,960 --> 00:52:14,920 Speaker 3: just to give some thoughts here within the two episodes 932 00:52:14,920 --> 00:52:17,840 Speaker 3: I'm doing here, the pattern I've seen with Sam Altman 933 00:52:18,360 --> 00:52:22,200 Speaker 3: is that everyone seems to want him to win, and 934 00:52:22,239 --> 00:52:25,759 Speaker 3: there's almost a degree of they will make it so. 935 00:52:27,360 --> 00:52:31,040 Speaker 3: Have you seen anyone who's really a detractor or anyone 936 00:52:31,040 --> 00:52:36,120 Speaker 3: who's not pro Sam Altman? Because it's interesting how few 937 00:52:36,160 --> 00:52:37,560 Speaker 3: people in tech are. 938 00:52:38,640 --> 00:52:41,600 Speaker 1: Well, there is. I won't get too into it because 939 00:52:41,600 --> 00:52:43,600 Speaker 1: this is in some of the future episodes which will 940 00:52:43,680 --> 00:52:47,480 Speaker 1: drop in future weeks.
But you know, I would say, 941 00:52:47,520 --> 00:52:50,160 Speaker 1: in some of the conversations that I've had 942 00:52:50,239 --> 00:52:53,600 Speaker 1: off the record, in some of the 943 00:52:53,600 --> 00:52:56,160 Speaker 1: conversations I've had off the record with people about Sam, 944 00:52:56,880 --> 00:53:00,279 Speaker 1: I think, you know, my general impressions are people often 945 00:53:00,320 --> 00:53:02,920 Speaker 1: do find him impressive in terms of what he has 946 00:53:02,960 --> 00:53:07,280 Speaker 1: gotten done, you know, the size and scale of his ambitions, 947 00:53:07,320 --> 00:53:09,080 Speaker 1: and the way that he has generally been able to 948 00:53:11,000 --> 00:53:14,480 Speaker 1: make that happen. I think there's also a lot of 949 00:53:14,480 --> 00:53:17,719 Speaker 1: people who, you know, are willing to privately share some 950 00:53:18,080 --> 00:53:22,120 Speaker 1: gripes that they might have about him. Also, you know, 951 00:53:22,200 --> 00:53:24,719 Speaker 1: in recent weeks, we've seen people be a lot more 952 00:53:24,760 --> 00:53:27,120 Speaker 1: public about some of those gripes. We have Helen Toner, 953 00:53:27,920 --> 00:53:30,880 Speaker 1: a former board member at OpenAI who voted 954 00:53:30,880 --> 00:53:34,359 Speaker 1: to remove Sam last November, saying publicly in the last 955 00:53:34,400 --> 00:53:37,120 Speaker 1: few weeks that Sam lied to her and the other 956 00:53:37,160 --> 00:53:41,600 Speaker 1: former board members, that his, you know, misdirection made them 957 00:53:41,600 --> 00:53:45,279 Speaker 1: feel like they couldn't do their jobs. And she has 958 00:53:45,280 --> 00:53:49,879 Speaker 1: also said that people were intimidated to the point where 959 00:53:49,880 --> 00:53:52,720 Speaker 1: they did not feel comfortable speaking more publicly about negative 960 00:53:52,760 --> 00:53:56,160 Speaker 1: experiences they'd had with Sam, that they are afraid to 961 00:53:56,200 --> 00:53:59,839 Speaker 1: speak more publicly about, you know, times that he has 962 00:54:00,520 --> 00:54:02,920 Speaker 1: been dishonest with them or, you know, in which 963 00:54:02,920 --> 00:54:07,840 Speaker 1: they've had challenging experiences. And that has also been reflected 964 00:54:07,840 --> 00:54:10,080 Speaker 1: in some of the private conversations I've had, in which people, 965 00:54:10,760 --> 00:54:13,680 Speaker 1: you know, they might have complaints or they might have 966 00:54:13,719 --> 00:54:16,799 Speaker 1: had, like, challenging situations with him, and I think they 967 00:54:17,320 --> 00:54:19,160 Speaker 1: just feel like the risk calculus is not worth it 968 00:54:19,200 --> 00:54:23,560 Speaker 1: to come out and say something like that. But you know, 969 00:54:23,600 --> 00:54:26,960 Speaker 1: there have been bits and pieces where people have come 970 00:54:27,000 --> 00:54:29,680 Speaker 1: out and said things that, you know, Sam has. You know, 971 00:54:30,000 --> 00:54:31,680 Speaker 1: another thing that the board members have said was that 972 00:54:31,719 --> 00:54:35,960 Speaker 1: Sam had been deceptive and manipulative, and that's also followed 973 00:54:36,040 --> 00:54:38,880 Speaker 1: up by, or not followed up.
There was also, I 974 00:54:38,880 --> 00:54:41,600 Speaker 1: think back in November, a former OpenAI employee who 975 00:54:41,680 --> 00:54:45,000 Speaker 1: had tweeted something publicly about that, you know, saying that 976 00:54:45,000 --> 00:54:46,680 Speaker 1: Sam had lied to him on occasion, even though he 977 00:54:46,719 --> 00:54:49,120 Speaker 1: had also always been nice to him, which I think 978 00:54:49,160 --> 00:54:51,080 Speaker 1: is a very interesting combination of. 979 00:54:52,680 --> 00:54:56,520 Speaker 3: Classic Silicon Valley though: I'm afraid of dealing with them, 980 00:54:56,600 --> 00:54:57,800 Speaker 3: but they were so nice to me. 981 00:54:59,160 --> 00:55:00,759 Speaker 1: And yeah, of course, you know, that person has 982 00:55:00,800 --> 00:55:03,880 Speaker 1: not elaborated more publicly about what they meant. But I 983 00:55:03,960 --> 00:55:06,960 Speaker 1: think, you know, I think this is why 984 00:55:07,040 --> 00:55:11,680 Speaker 1: people are asking themselves these questions, which is, like, you know, 985 00:55:11,760 --> 00:55:14,480 Speaker 1: the more that we hear about what the board, uh, 986 00:55:15,600 --> 00:55:17,799 Speaker 1: was thinking before they decided to fire Sam, I think 987 00:55:17,800 --> 00:55:20,280 Speaker 1: the more people are wondering about what are the patterns 988 00:55:20,280 --> 00:55:24,200 Speaker 1: of behavior that he shows, you know, that led 989 00:55:24,239 --> 00:55:26,399 Speaker 1: to the board trying to make this drastic move. 990 00:55:28,160 --> 00:55:32,920 Speaker 3: Yeah, that's actually an interesting point. So when Sam Altman was 991 00:55:33,040 --> 00:55:36,880 Speaker 3: fired from OpenAI, there was this very strange reaction 992 00:55:37,120 --> 00:55:40,600 Speaker 3: from Silicon Valley, including some in the media, where it 993 00:55:40,880 --> 00:55:46,279 Speaker 3: was almost like Hunger Games, everyone doing the symbol thing, 994 00:55:46,320 --> 00:55:48,400 Speaker 3: where everyone's like, oh, we got to put Sam Altman 995 00:55:48,480 --> 00:55:52,000 Speaker 3: back in. Isn't it kind of strange? We still don't 996 00:55:52,080 --> 00:55:54,760 Speaker 3: know why he was actually fired, though. I mean, Helen 997 00:55:54,800 --> 00:55:58,040 Speaker 3: Toner has elaborated. Like, I've never seen... have you 998 00:55:58,120 --> 00:55:59,560 Speaker 3: seen anything like this in your career? 999 00:56:00,200 --> 00:56:04,120 Speaker 1: I think that it has been surprising that there has 1000 00:56:04,239 --> 00:56:09,880 Speaker 1: not been more of a clear answer. I think, you know, 1001 00:56:09,960 --> 00:56:13,200 Speaker 1: as time has gone on, like, we have heard 1002 00:56:13,280 --> 00:56:16,279 Speaker 1: a little bit more. Like, I think Helen Toner has, 1003 00:56:17,160 --> 00:56:20,200 Speaker 1: you know, to her credit, tried to give more information 1004 00:56:20,320 --> 00:56:24,560 Speaker 1: in recent weeks about what happened.
I think, you know, 1005 00:56:24,960 --> 00:56:27,680 Speaker 1: people were obviously asking this question six months ago, and 1006 00:56:27,840 --> 00:56:29,279 Speaker 1: so I think, like, there's been a little bit of 1007 00:56:29,320 --> 00:56:33,360 Speaker 1: a delay in trying to get this answer, and 1008 00:56:33,520 --> 00:56:36,000 Speaker 1: I wonder if maybe there just isn't, like, a very 1009 00:56:37,719 --> 00:56:40,200 Speaker 1: neat answer to it, and so, and then in that 1010 00:56:40,960 --> 00:56:45,239 Speaker 1: absence we get this kind of more of a, like, murky, multifaceted, 1011 00:56:45,320 --> 00:56:48,040 Speaker 1: multi-voiced answer. But I, yes, I agree that it 1012 00:56:48,320 --> 00:56:50,560 Speaker 1: is sort of surprising that there hasn't been 1013 00:56:51,440 --> 00:56:56,080 Speaker 1: more clarification on what exactly happened, or a little bit 1014 00:56:56,200 --> 00:57:00,160 Speaker 1: more granular detail about what led up to it. 1015 00:57:01,520 --> 00:57:05,080 Speaker 3: So, on to AI hype in general. I said that a bit weird. 1016 00:57:05,160 --> 00:57:08,279 Speaker 3: I'll keep going. Why do you think there's such a 1017 00:57:08,360 --> 00:57:11,680 Speaker 3: gulf between what Sam Altman says and what ChatGPT 1018 00:57:11,880 --> 00:57:12,560 Speaker 3: can actually do? 1019 00:57:13,920 --> 00:57:16,439 Speaker 1: What Sam Altman says? What are you talking about specifically? As. 1020 00:57:16,400 --> 00:57:18,960 Speaker 3: In, he says it will be a super smart company. Yeah, yeah, 1021 00:57:19,000 --> 00:57:20,280 Speaker 3: that it'll be all of these things. 1022 00:57:21,120 --> 00:57:23,400 Speaker 1: Well, this is something that we get into in episode three, 1023 00:57:23,440 --> 00:57:25,760 Speaker 1: which is a personal interest of mine, which is kind 1024 00:57:25,760 --> 00:57:29,560 Speaker 1: of the psychology of the AI industry right now. And, 1025 00:57:30,440 --> 00:57:33,520 Speaker 1: you know, what I find so interesting about this, 1026 00:57:33,600 --> 00:57:36,280 Speaker 1: and what we try to delve into in episode three 1027 00:57:36,320 --> 00:57:39,760 Speaker 1: and kind of throughout the series, is these kind of, 1028 00:57:39,840 --> 00:57:43,320 Speaker 1: like, extreme projections about AI. And in the industry you 1029 00:57:43,400 --> 00:57:45,960 Speaker 1: see both positive ones and negative ones, and I think, 1030 00:57:46,240 --> 00:57:49,560 Speaker 1: you know, the negative ones, that's what looks like AI doomerism, 1031 00:57:49,680 --> 00:57:53,280 Speaker 1: AI existential risk, sometimes called AI safety depending on your 1032 00:57:53,320 --> 00:57:55,240 Speaker 1: point of view. But you know, it's these projections that, 1033 00:57:56,360 --> 00:58:02,000 Speaker 1: you know, superintelligence might very quickly and very soon learn 1034 00:58:02,080 --> 00:58:05,720 Speaker 1: to self-improve in a way that allows it to 1035 00:58:05,960 --> 00:58:09,360 Speaker 1: rapidly outstrip our control and our capabilities and could lead 1036 00:58:09,400 --> 00:58:13,080 Speaker 1: to the extinction of humanity. There are so many interesting 1037 00:58:13,120 --> 00:58:16,200 Speaker 1: things to say about the psychology of believing that our 1038 00:58:16,400 --> 00:58:19,880 Speaker 1: human race might either be wiped out or incredibly changed 1039 00:58:20,840 --> 00:58:24,480 Speaker 1: within our lifetimes, and we get into that in episode three.
1040 00:58:24,480 --> 00:58:26,680 Speaker 1: I think I really wanted to get into the psychology 1041 00:58:26,720 --> 00:58:29,520 Speaker 1: of someone who believes that AI doom is just around 1042 00:58:29,560 --> 00:58:32,000 Speaker 1: the corner, and so we talked to someone who sort 1043 00:58:32,000 --> 00:58:35,040 Speaker 1: of became convinced of this belief soon after the twenty 1044 00:58:35,080 --> 00:58:40,000 Speaker 1: sixteen AlphaGo matches, in which the Go-playing AI 1045 00:58:40,120 --> 00:58:43,720 Speaker 1: beat the world champion in Go, and he talks 1046 00:58:43,720 --> 00:58:46,160 Speaker 1: about, yeah, you know, deciding not to make 1047 00:58:46,240 --> 00:58:48,200 Speaker 1: a retirement account because he was like, what is the 1048 00:58:48,240 --> 00:58:52,160 Speaker 1: point, by the time I reach retirement age, either the 1049 00:58:52,240 --> 00:58:55,240 Speaker 1: world will be dramatically different and money won't matter, or 1050 00:58:55,280 --> 00:58:58,760 Speaker 1: we'll all be dead. And I think that, even though 1051 00:58:58,800 --> 00:59:00,600 Speaker 1: some people might scoff at that, that's like a real 1052 00:59:00,640 --> 00:59:04,440 Speaker 1: belief, that people believe that these extreme 1053 00:59:04,680 --> 00:59:08,160 Speaker 1: possible scenarios are in our near future. And on the 1054 00:59:08,200 --> 00:59:12,080 Speaker 1: other hand, we also see extreme projections in a positive direction. 1055 00:59:12,200 --> 00:59:16,640 Speaker 1: You know, this idea that AI is going to unlock 1056 00:59:17,280 --> 00:59:20,720 Speaker 1: a whole new era of human flourishing, that we might 1057 00:59:21,360 --> 00:59:23,440 Speaker 1: expand beyond our planet, that we might be able to 1058 00:59:23,720 --> 00:59:28,720 Speaker 1: give, say... what, abundance? Abundance, right, exactly. You know, one 1059 00:59:28,720 --> 00:59:31,680 Speaker 1: of the things we do, I believe in episode three, 1060 00:59:31,800 --> 00:59:33,760 Speaker 1: is do a little bit of a supercut of 1061 00:59:33,840 --> 00:59:36,120 Speaker 1: Sam Altman talking about abundance. It's pretty clear 1062 00:59:36,200 --> 00:59:38,800 Speaker 1: that this is a way that he likes to frame it: 1063 00:59:39,240 --> 00:59:41,040 Speaker 1: our AI future is going to be this future in 1064 00:59:41,080 --> 00:59:45,320 Speaker 1: which everyone has plenty, right, everyone has, you know, access 1065 00:59:45,360 --> 00:59:50,000 Speaker 1: to intelligence, abundant energy, abundant access to superintelligence that can 1066 00:59:50,040 --> 00:59:52,200 Speaker 1: help us live kind of our best lives and beyond 1067 00:59:52,240 --> 00:59:57,880 Speaker 1: our wildest dreams. Right. And, you know, obviously Silicon 1068 00:59:57,920 --> 01:00:02,400 Speaker 1: Valley is a place where people like to make grandiose statements. 1069 01:00:02,920 --> 01:00:05,720 Speaker 1: But this is beyond that, right? This is not just, yeah, 1070 01:00:06,280 --> 01:00:08,919 Speaker 1: this is not just like, you know, we joked about 1071 01:00:08,960 --> 01:00:09,360 Speaker 1: WeWork. 1072 01:00:09,880 --> 01:00:10,200 Speaker 4: WeWork's. 1073 01:00:10,240 --> 01:00:12,760 Speaker 1: Mission statement was to elevate the world's consciousness. Like, well, 1074 01:00:14,280 --> 01:00:17,960 Speaker 1: galaxies of human flourishing for eons beyond us.
Like that 1075 01:00:18,120 --> 01:00:20,640 Speaker 1: is on another scale, right? Like, we're talking about something 1076 01:00:20,680 --> 01:00:26,560 Speaker 1: that is sort of at an unprecedented level of extreme rhetoric. 1077 01:00:26,640 --> 01:00:28,880 Speaker 1: And I think that's really interesting. I think it is 1078 01:00:28,960 --> 01:00:31,960 Speaker 1: a very powerful motivator, both, you know, in 1079 01:00:32,000 --> 01:00:34,520 Speaker 1: the doomer sense and also in the abundance sense. People 1080 01:00:34,600 --> 01:00:37,280 Speaker 1: believing that what they're working on is the most important 1081 01:00:37,320 --> 01:00:42,720 Speaker 1: technological leap forward for humanity. Talk about a motivating reason 1082 01:00:42,880 --> 01:00:46,600 Speaker 1: to work on this technology, right? Talk about a way 1083 01:00:46,680 --> 01:00:50,080 Speaker 1: to feel powerful, feel like you're making a huge difference. 1084 01:00:50,120 --> 01:00:53,040 Speaker 1: I think that's a really key part of what's driving 1085 01:00:53,240 --> 01:00:54,760 Speaker 1: a lot of work in AI right now. 1086 01:00:55,640 --> 01:00:56,720 Speaker 4: Driving a lot of work. 1087 01:00:56,840 --> 01:01:01,160 Speaker 3: Sure. But with Altman himself, there is the gulf. It 1088 01:01:01,360 --> 01:01:05,120 Speaker 3: is a million-mile gulf between the things he says 1089 01:01:05,200 --> 01:01:08,480 Speaker 3: and what ChatGPT is, even on the most 1090 01:01:08,520 --> 01:01:11,160 Speaker 3: basic level, capable of doing and will be capable of. And 1091 01:01:11,280 --> 01:01:14,360 Speaker 3: it just feels like, it almost feels like he's become 1092 01:01:15,360 --> 01:01:19,880 Speaker 3: the propagandist for the tech industry, and it's very strange 1093 01:01:19,960 --> 01:01:23,200 Speaker 3: to me how far that distance is. Because you've got 1094 01:01:23,240 --> 01:01:27,160 Speaker 3: the AI doomers and the AI optimists, I guess you'd 1095 01:01:27,200 --> 01:01:31,400 Speaker 3: call them, but Altman doesn't even feel like he's in 1096 01:01:31,560 --> 01:01:34,040 Speaker 3: with... he's just kind of... he'll say one day that 1097 01:01:34,160 --> 01:01:36,400 Speaker 3: he doesn't think it's a creature, the next he'll 1098 01:01:36,400 --> 01:01:39,000 Speaker 3: say it's gonna kill us. It all just feels 1099 01:01:39,160 --> 01:01:42,320 Speaker 3: like a PR campaign, but for nothing. 1100 01:01:43,760 --> 01:01:47,560 Speaker 1: Yeah, it has been interesting to try to answer the question. 1101 01:01:47,560 --> 01:01:47,680 Speaker 3: You know. 1102 01:01:47,720 --> 01:01:49,120 Speaker 1: One of the questions we tried to answer in the 1103 01:01:49,120 --> 01:01:53,240 Speaker 1: podcast is, does Sam actually believe it, you know, because as 1104 01:01:53,280 --> 01:01:56,000 Speaker 1: you mentioned, there are some early clips of him, you know, 1105 01:01:56,080 --> 01:01:57,800 Speaker 1: and when I say early, I mean around the time 1106 01:01:57,840 --> 01:02:00,840 Speaker 1: of founding OpenAI, twenty fifteen or so. There's some 1107 01:02:00,920 --> 01:02:05,560 Speaker 1: clips of him talking about, you know, saying somewhat jokingly 1108 01:02:05,640 --> 01:02:07,600 Speaker 1: that AI might kill us all.
But there's also this, 1109 01:02:08,120 --> 01:02:09,800 Speaker 1: you know, very famous blog post that he wrote in 1110 01:02:09,880 --> 01:02:12,720 Speaker 1: twenty fifteen in which he says that, you know, basically, 1111 01:02:12,760 --> 01:02:16,640 Speaker 1: superintelligence is one of the most serious risks to humanity, 1112 01:02:17,800 --> 01:02:20,200 Speaker 1: you know, full stop. And so it's clear that at 1113 01:02:20,280 --> 01:02:23,360 Speaker 1: some point in his life, he believed kind of what 1114 01:02:23,480 --> 01:02:27,480 Speaker 1: we might now call a more doomer-y outlook. But 1115 01:02:27,600 --> 01:02:30,680 Speaker 1: as time has gone on, he has, you know, offered 1116 01:02:30,760 --> 01:02:33,040 Speaker 1: views that are a little bit more measured and more positive. 1117 01:02:33,520 --> 01:02:35,920 Speaker 1: You know, he tends to, you know, in his big 1118 01:02:36,000 --> 01:02:40,000 Speaker 1: media tour of twenty twenty three, he tended to talk 1119 01:02:40,040 --> 01:02:43,200 Speaker 1: about how AI was going to... You know, his projection 1120 01:02:43,360 --> 01:02:46,160 Speaker 1: was that AI would radically transform society, but that it 1121 01:02:46,200 --> 01:02:49,640 Speaker 1: would be net good, right, that, like, overall we would 1122 01:02:49,640 --> 01:02:53,240 Speaker 1: be glad that this happened, and that it would improve lives, 1123 01:02:53,680 --> 01:02:56,400 Speaker 1: even if in the short term or for some people 1124 01:02:56,440 --> 01:02:59,640 Speaker 1: it might prove to be bringing a lot of challenges 1125 01:02:59,680 --> 01:02:59,920 Speaker 1: as well. 1126 01:03:01,080 --> 01:03:02,440 Speaker 3: And so it is. 1127 01:03:02,560 --> 01:03:04,200 Speaker 1: You know, I think one of the interesting things 1128 01:03:04,320 --> 01:03:06,400 Speaker 1: about him is it is a little hard to 1129 01:03:06,440 --> 01:03:08,760 Speaker 1: pin down exactly what he thinks. I think you're right 1130 01:03:08,840 --> 01:03:13,560 Speaker 1: that I wouldn't consider him, like, a gung-ho effective accelerationist. 1131 01:03:13,600 --> 01:03:15,360 Speaker 1: I would not consider him a doomer. He is, like, 1132 01:03:16,520 --> 01:03:19,400 Speaker 1: somewhere in this large gulf in between there. But I 1133 01:03:19,520 --> 01:03:24,800 Speaker 1: think he's also smart enough to know that making grandiose 1134 01:03:24,840 --> 01:03:30,800 Speaker 1: projections about what AI could bring is a compelling story, right, 1135 01:03:30,920 --> 01:03:35,080 Speaker 1: like, is a story that he can help sell by 1136 01:03:35,120 --> 01:03:37,040 Speaker 1: being, like, a spokesman for it. And often that is 1137 01:03:37,120 --> 01:03:38,880 Speaker 1: the role of a CEO, is to be a really 1138 01:03:38,880 --> 01:03:42,640 Speaker 1: good storyteller, to bring the pitch of the company to 1139 01:03:42,760 --> 01:03:47,600 Speaker 1: the public, to investors, to potential employees, to customers, to 1140 01:03:47,720 --> 01:03:49,240 Speaker 1: try to sell them on this vision of the future. 1141 01:03:49,280 --> 01:03:51,400 Speaker 1: And I do think Sam is good at that.
There 1142 01:03:51,440 --> 01:03:56,040 Speaker 1: is an interesting tidbit in episode three in which we 1143 01:03:56,160 --> 01:03:59,560 Speaker 1: interview a fiction writer who was actually hired on contract 1144 01:03:59,720 --> 01:04:05,479 Speaker 1: by OpenAI to write, like, a novella about AI 1145 01:04:05,720 --> 01:04:09,439 Speaker 1: futures and things like that. And yeah, he just talks 1146 01:04:09,480 --> 01:04:11,360 Speaker 1: a little bit about, you know, the novella is not, 1147 01:04:11,920 --> 01:04:14,760 Speaker 1: I think, in active use within OpenAI, but they 1148 01:04:14,800 --> 01:04:17,320 Speaker 1: did at some point, see, they did at some point 1149 01:04:17,440 --> 01:04:21,480 Speaker 1: see value in commissioning it. And I think, you know, 1150 01:04:21,560 --> 01:04:24,280 Speaker 1: something that the author Patrick House explains to us is, 1151 01:04:24,920 --> 01:04:28,360 Speaker 1: you know, that OpenAI, just like many other startups, 1152 01:04:29,160 --> 01:04:31,680 Speaker 1: is really motivated by story, right, and that Sam Altman 1153 01:04:31,840 --> 01:04:34,720 Speaker 1: is inspired by fiction. You know, he's inspired by certain 1154 01:04:34,760 --> 01:04:36,920 Speaker 1: kinds of sci-fi. I think this is not unique 1155 01:04:36,920 --> 01:04:40,080 Speaker 1: to Sam. Many founders in Silicon Valley, you know, Elon 1156 01:04:40,160 --> 01:04:42,880 Speaker 1: Musk has talked about this as well, are driven to 1157 01:04:43,040 --> 01:04:47,000 Speaker 1: create things in part because of what they read about 1158 01:04:47,040 --> 01:04:49,160 Speaker 1: when they were younger, you know, these dreams of 1159 01:04:49,200 --> 01:04:52,080 Speaker 1: the future. And so it's just interesting to get his 1160 01:04:52,200 --> 01:04:56,080 Speaker 1: perspective on how motivating a story can be, and how 1161 01:04:56,120 --> 01:04:59,160 Speaker 1: motivating this compelling story of, like, oh, we're building something 1162 01:04:59,200 --> 01:05:01,840 Speaker 1: that's gonna change the course of human history. Like, you 1163 01:05:01,960 --> 01:05:04,760 Speaker 1: just couldn't ask for a more powerful motivating force. 1164 01:05:06,160 --> 01:05:11,960 Speaker 3: So as Altman accumulates power and as he kind of 1165 01:05:12,160 --> 01:05:15,160 Speaker 3: ascends to the top of OpenAI, do you think 1166 01:05:15,240 --> 01:05:17,640 Speaker 3: he's done there? Do you think there's going to be 1167 01:05:17,720 --> 01:05:20,800 Speaker 3: another thing he starts? Because it feels like you've discussed, 1168 01:05:20,880 --> 01:05:22,760 Speaker 3: like, UBI and all these other things. Do you think 1169 01:05:22,800 --> 01:05:26,000 Speaker 3: he has grander ideas that he wants to pursue? 1170 01:05:27,160 --> 01:05:31,920 Speaker 1: Well, obviously I can't speak to what's inside Sam's. 1171 01:05:31,880 --> 01:05:34,440 Speaker 3: I don't know the man's mind, but I. 1172 01:05:34,520 --> 01:05:38,240 Speaker 1: Mean, past indicators would suggest yes. Like, I think he 1173 01:05:38,320 --> 01:05:42,160 Speaker 1: has proven pretty consistently that he's someone who, you know, 1174 01:05:42,440 --> 01:05:44,880 Speaker 1: is, you know, as much as he might focus on 1175 01:05:44,960 --> 01:05:47,760 Speaker 1: one project with a lot of effort, like, he is 1176 01:05:48,040 --> 01:05:50,120 Speaker 1: cooking things on the side. Like, this is a man.
1177 01:05:50,680 --> 01:05:52,200 Speaker 1: This is going to be an extended metaphor, but this 1178 01:05:52,320 --> 01:05:54,200 Speaker 1: is a man working at a stove that has like 1179 01:05:54,280 --> 01:05:57,480 Speaker 1: six burners, not one. And you know he we already 1180 01:05:57,560 --> 01:05:57,840 Speaker 1: know that. 1181 01:05:58,800 --> 01:06:02,320 Speaker 3: What are you saying, sorry, he's got a big house, He's. 1182 01:06:02,200 --> 01:06:06,920 Speaker 1: Got multiple houses. The uh the you know, we already 1183 01:06:07,000 --> 01:06:10,479 Speaker 1: know that. You know. In addition to running open ai, 1184 01:06:11,240 --> 01:06:16,360 Speaker 1: he has funded and or helped prompt the founding of, 1185 01:06:16,880 --> 01:06:19,680 Speaker 1: or has you know, been very involved in investing in 1186 01:06:19,800 --> 01:06:23,360 Speaker 1: or supporting other startups that you know, are part of 1187 01:06:23,440 --> 01:06:28,680 Speaker 1: this kind of ecosystem of businesses that are connected to 1188 01:06:29,240 --> 01:06:31,880 Speaker 1: an AI future or might benefit in an AI future. 1189 01:06:31,960 --> 01:06:35,880 Speaker 1: So for example, Helion, which is a nuclear fusion company 1190 01:06:35,920 --> 01:06:38,400 Speaker 1: which he has invested a ton of money into. I 1191 01:06:38,480 --> 01:06:40,800 Speaker 1: think he has said publicly that you know, his his 1192 01:06:40,960 --> 01:06:44,040 Speaker 1: vision is that this is a potential way to provide 1193 01:06:44,400 --> 01:06:49,480 Speaker 1: abundant energy that could then power the technology that we 1194 01:06:49,560 --> 01:06:54,120 Speaker 1: need to you know, uh, improve AI to the level 1195 01:06:54,160 --> 01:06:55,440 Speaker 1: that we're hoping that it can get to, or that 1196 01:06:55,480 --> 01:06:57,960 Speaker 1: he's hoping that it can get to. At the same time, 1197 01:06:58,680 --> 01:07:01,000 Speaker 1: you know, we've talked a little bit about universal basic income. 1198 01:07:01,120 --> 01:07:04,200 Speaker 1: This has been something that Sam has been a proponent 1199 01:07:04,280 --> 01:07:07,760 Speaker 1: of and an advocate of since at least twenty sixteen, 1200 01:07:07,880 --> 01:07:10,680 Speaker 1: when he was running y Combinator and they started a 1201 01:07:11,000 --> 01:07:15,240 Speaker 1: side research project to study universal basic income by giving 1202 01:07:17,440 --> 01:07:20,440 Speaker 1: cash payments to families in Oakland of I believe a 1203 01:07:20,480 --> 01:07:24,440 Speaker 1: thousand dollars a month. That research project is still ongoing. 1204 01:07:24,520 --> 01:07:28,440 Speaker 1: It's now moved away from y Combinator and is associated 1205 01:07:29,040 --> 01:07:32,320 Speaker 1: with open Research, which is I believe funded by open Ai, 1206 01:07:32,480 --> 01:07:34,480 Speaker 1: and so it has kind of moved with Sam to 1207 01:07:34,600 --> 01:07:39,120 Speaker 1: his new role. 
And of course he also co-founded 1208 01:07:39,200 --> 01:07:45,400 Speaker 1: this company called Worldcoin, which used these silver orb 1209 01:07:45,960 --> 01:07:50,360 Speaker 1: machines to scan, to take pictures of your iris, and 1210 01:07:50,760 --> 01:07:55,680 Speaker 1: register every individual human as like a unique 1211 01:07:55,720 --> 01:07:59,800 Speaker 1: human individual, and to create this eyeball registry 1212 01:08:00,440 --> 01:08:03,440 Speaker 1: by which one could in the future distribute a universal 1213 01:08:03,480 --> 01:08:06,960 Speaker 1: basic income. So he's funding these energy companies. He's like 1214 01:08:07,160 --> 01:08:11,080 Speaker 1: involved in the sort of crypto eyeball registry 1215 01:08:11,200 --> 01:08:13,920 Speaker 1: project that will help distribute UBI in this future that 1216 01:08:13,960 --> 01:08:16,200 Speaker 1: he's imagining. Like I think it's safe to say he's 1217 01:08:16,240 --> 01:08:19,920 Speaker 1: definitely thinking about things beyond just OpenAI for the 1218 01:08:20,000 --> 01:08:23,200 Speaker 1: future and imagining like, Okay, well, if we have this 1219 01:08:23,320 --> 01:08:25,320 Speaker 1: piece that's growing, what else will we need to support it? 1220 01:08:25,880 --> 01:08:27,840 Speaker 1: And I'm sure there are other things he's working on 1221 01:08:27,960 --> 01:08:29,920 Speaker 1: that we don't even know about, right, Like I know 1222 01:08:30,000 --> 01:08:34,720 Speaker 1: he has also funded some like longevity bioscience projects and 1223 01:08:34,800 --> 01:08:38,400 Speaker 1: things like that. I guarantee he's thinking about stuff 1224 01:08:38,439 --> 01:08:39,320 Speaker 1: beyond what we know about. 1225 01:08:41,360 --> 01:08:45,599 Speaker 3: Final question, why do you think the entire tech industry 1226 01:08:45,640 --> 01:08:47,400 Speaker 3: has become so fascinated with AI? 1227 01:08:47,640 --> 01:08:50,400 Speaker 4: Do you think it's just Altman or is it something more? 1228 01:08:51,840 --> 01:08:56,200 Speaker 1: I do think ChatGPT started heating up this interest 1229 01:08:56,280 --> 01:08:58,479 Speaker 1: that was already percolating a little bit in the tech industry. 1230 01:08:58,920 --> 01:09:01,439 Speaker 1: But it does seem like something about ChatGPT captured the 1231 01:09:01,479 --> 01:09:05,400 Speaker 1: public imagination, made people imagine very seriously for the first time, 1232 01:09:06,360 --> 01:09:10,920 Speaker 1: how AI could affect their lives, their lives individually. It 1233 01:09:11,000 --> 01:09:12,439 Speaker 1: used to be kind of this abstract thing that was 1234 01:09:12,439 --> 01:09:15,040 Speaker 1: a little farther away, or maybe you understood that like 1235 01:09:15,479 --> 01:09:18,960 Speaker 1: you were interacting with AI sometimes, like when you would 1236 01:09:19,080 --> 01:09:25,240 Speaker 1: look at like flight price predictors. Yeah, exactly. But you know, 1237 01:09:25,280 --> 01:09:27,200 Speaker 1: I think as we you know, we talk about this 1238 01:09:27,320 --> 01:09:30,000 Speaker 1: in episode three, but that you know, ChatGPT wasn't 1239 01:09:30,040 --> 01:09:34,519 Speaker 1: even new technology. It was actually just a different user 1240 01:09:34,600 --> 01:09:38,639 Speaker 1: interface on a model that already existed, GPT three point five. 1241 01:09:39,320 --> 01:09:39,559 Speaker 4: And so.
1242 01:09:41,200 --> 01:09:43,320 Speaker 1: To me, that actually speaks I guess to the power 1243 01:09:43,439 --> 01:09:47,519 Speaker 1: of like making a technology accessible to everyone. And in 1244 01:09:47,640 --> 01:09:49,600 Speaker 1: a way that was like easy to use and you know, 1245 01:09:49,680 --> 01:09:52,000 Speaker 1: for better or worse. That kind of got a lot 1246 01:09:52,080 --> 01:09:55,280 Speaker 1: of people in this like public momentum of people thinking 1247 01:09:55,320 --> 01:09:59,360 Speaker 1: about AI, feeling you know, just feeling like it had 1248 01:10:00,560 --> 01:10:04,000 Speaker 1: rapidly increased its capabilities in a short period of time. 1249 01:10:04,880 --> 01:10:11,240 Speaker 1: And yeah, something about that really captured not just you know, 1250 01:10:11,400 --> 01:10:14,080 Speaker 1: the minds, but also the hearts of people and like 1251 01:10:14,360 --> 01:10:17,519 Speaker 1: getting them really thinking about, like what could a future 1252 01:10:17,640 --> 01:10:19,880 Speaker 1: like this look like? And I think while some people 1253 01:10:19,880 --> 01:10:22,280 Speaker 1: were excited, a lot of people also reacted with fear, 1254 01:10:22,360 --> 01:10:23,840 Speaker 1: right, and like I think in the valley like you 1255 01:10:23,960 --> 01:10:28,080 Speaker 1: will hear a lot of people more openly discussing their 1256 01:10:28,200 --> 01:10:34,840 Speaker 1: fears of sort of like job loss or or just 1257 01:10:34,920 --> 01:10:37,080 Speaker 1: like dramatic social change that might come about in the 1258 01:10:37,160 --> 01:10:40,439 Speaker 1: next ten or twenty years. The feeling I get in 1259 01:10:40,520 --> 01:10:42,680 Speaker 1: conversations that I have in and around San Francisco is, 1260 01:10:43,400 --> 01:10:44,920 Speaker 1: you know, even people who are pretty deep in this 1261 01:10:45,000 --> 01:10:50,280 Speaker 1: technology are uncertain about whether it's going to be overall 1262 01:10:50,640 --> 01:10:53,040 Speaker 1: good or bad. Like they're just uncertain of how to 1263 01:10:53,080 --> 01:10:55,720 Speaker 1: look back on this time, like whether it will have 1264 01:10:56,000 --> 01:10:59,960 Speaker 1: ended up being a leap forward for humanity or something different. 1265 01:11:12,680 --> 01:11:14,720 Speaker 3: Altman has taken advantage of the fact that the tech 1266 01:11:14,800 --> 01:11:18,680 Speaker 3: industry might not have any hypergrowth markets left, knowing that 1267 01:11:18,800 --> 01:11:22,439 Speaker 3: ChatGPT is much like Sam Altman, incredibly adept at 1268 01:11:22,520 --> 01:11:26,400 Speaker 3: mimicking depth and experience by parroting the experiences of those 1269 01:11:26,479 --> 01:11:30,679 Speaker 3: that have actually done things. Like Sam Altman, ChatGPT 1270 01:11:30,840 --> 01:11:34,080 Speaker 3: consumes information and feeds it back to the people using 1271 01:11:34,160 --> 01:11:38,599 Speaker 3: it in a way that feels superficially satisfying, and it's 1272 01:11:38,640 --> 01:11:41,600 Speaker 3: quite impressive to those who don't really care about creativity 1273 01:11:41,720 --> 01:11:44,360 Speaker 3: or depth. And like I've said, it takes advantage of 1274 01:11:44,400 --> 01:11:47,160 Speaker 3: the fact that the tech ecosystem has been dominated and 1275 01:11:47,280 --> 01:11:52,120 Speaker 3: funded by people who don't really build tech.
As I've 1276 01:11:52,160 --> 01:11:56,880 Speaker 3: said before, generative AI, things like ChatGPT, Anthropic's Claude, 1277 01:11:57,200 --> 01:11:59,880 Speaker 3: Microsoft's Copilot, which is also powered by ChatGPT. 1278 01:12:00,720 --> 01:12:03,920 Speaker 3: It's not going to become the incredible supercomputer that Sam 1279 01:12:03,960 --> 01:12:07,240 Speaker 3: Altman is promising. It will not be a virtual brain 1280 01:12:07,560 --> 01:12:10,600 Speaker 3: or eminently human-like or a super smart person that 1281 01:12:10,760 --> 01:12:14,720 Speaker 3: knows everything about you, because it is, at its deepest complexity, 1282 01:12:15,120 --> 01:12:19,160 Speaker 3: a fundamentally different technology based on mathematics and the probabilistic 1283 01:12:19,240 --> 01:12:22,280 Speaker 3: answer to what you have asked it, rather than anything 1284 01:12:22,400 --> 01:12:26,639 Speaker 3: resembling how human beings think, or act, or even know things. 1285 01:12:26,880 --> 01:12:30,920 Speaker 3: Generative AI does not know anything. How can a thing 1286 01:12:31,040 --> 01:12:34,600 Speaker 3: think when it doesn't know anything? I want to 1287 01:12:34,680 --> 01:12:39,040 Speaker 3: ask Brad Lightcap, Mira Murati, Sam Altman any of these questions 1288 01:12:39,320 --> 01:12:43,840 Speaker 3: just once, to hear what they fart out. Now, well, Chat 1289 01:12:43,920 --> 01:12:48,240 Speaker 3: GPT isn't inherently useless. Altman realizes that it's impossible to 1290 01:12:48,360 --> 01:12:51,240 Speaker 3: generate the kind of funding and hype he needs based 1291 01:12:51,280 --> 01:12:54,520 Speaker 3: on its actual achievements, and that to continue to accumulate 1292 01:12:54,640 --> 01:12:57,240 Speaker 3: power and money, which is his only goal, he has 1293 01:12:57,280 --> 01:12:59,439 Speaker 3: to speciously hype it, and he has to hype it 1294 01:12:59,680 --> 01:13:02,519 Speaker 3: to wealthy and powerful people who also do not 1295 01:13:02,680 --> 01:13:04,599 Speaker 3: participate in the creation of anything. 1296 01:13:06,200 --> 01:13:09,439 Speaker 4: And that's who he is. I've been pretty mean about 1297 01:13:09,479 --> 01:13:11,880 Speaker 4: this guy, I really have. But he does have a skill. 1298 01:13:12,479 --> 01:13:16,479 Speaker 3: He knows a mark, he knows, he knows how to 1299 01:13:16,600 --> 01:13:18,679 Speaker 3: say the right things and get in the right rooms 1300 01:13:18,880 --> 01:13:21,880 Speaker 3: with the people who aren't really touching the software or 1301 01:13:21,960 --> 01:13:24,560 Speaker 3: the hardware. He knows what they need to hear, he 1302 01:13:24,640 --> 01:13:28,960 Speaker 3: knows what the VCs need to hear. He knows quite 1303 01:13:29,040 --> 01:13:33,080 Speaker 3: aptly what this needs to sound like. But if he 1304 01:13:33,200 --> 01:13:35,679 Speaker 3: had to say what ChatGPT does today, what would 1305 01:13:35,720 --> 01:13:38,640 Speaker 3: he say? Yeah, yeah, it's really good at generating a 1306 01:13:38,680 --> 01:13:40,120 Speaker 3: bunch of text that's kind of shitty. 1307 01:13:41,000 --> 01:13:41,200 Speaker 4: Yeah. 1308 01:13:41,240 --> 01:13:43,160 Speaker 3: Sometimes it does math right and sometimes it does it 1309 01:13:43,240 --> 01:13:45,160 Speaker 3: really wrong. Sometimes you ask it to and it can 1310 01:13:45,240 --> 01:13:47,320 Speaker 3: draw a picture. Hey, what do you think of that?
1311 01:13:48,680 --> 01:13:50,640 Speaker 3: These are all things, by the way, that if, like, 1312 01:13:50,720 --> 01:13:52,160 Speaker 3: a six year old told you, you'd be like, wow, that's 1313 01:13:52,200 --> 01:13:55,000 Speaker 3: really impressive, or like a ten year old, perhaps, because 1314 01:13:55,040 --> 01:13:59,040 Speaker 3: that's a living being. ChatGPT does these things, and it 1315 01:13:59,160 --> 01:14:01,160 Speaker 3: does it, I know it's cheesy to say, but in 1316 01:14:01,240 --> 01:14:03,560 Speaker 3: a soulless way. But it really does, because 1317 01:14:03,640 --> 01:14:06,479 Speaker 3: the reason all of this, the writing and the horrible 1318 01:14:06,600 --> 01:14:09,280 Speaker 3: video and the images, the reason it feels so empty 1319 01:14:09,400 --> 01:14:15,320 Speaker 3: is because even the most manure adjacent press release still 1320 01:14:15,400 --> 01:14:20,719 Speaker 3: has gone through someone's manure adjacent brain. Even the most 1321 01:14:20,840 --> 01:14:24,519 Speaker 3: pallid, empty copy you've read has gone through someone. A 1322 01:14:24,600 --> 01:14:27,559 Speaker 3: person has put thought and intention in, even if they're 1323 01:14:27,600 --> 01:14:31,880 Speaker 3: not great with the English language. What ChatGPT does 1324 01:14:32,479 --> 01:14:35,479 Speaker 3: is use math to generate the next thing, and sometimes 1325 01:14:35,520 --> 01:14:38,600 Speaker 3: it gets it pretty right. But pretty right is not 1326 01:14:38,880 --> 01:14:42,880 Speaker 3: enough to mimic human creation. But look at Sam Altman. 1327 01:14:43,400 --> 01:14:46,599 Speaker 4: Look who he is. What has he created? Other than wealth 1328 01:14:46,680 --> 01:14:51,040 Speaker 3: for him and other people. What about Sam Altman is 1329 01:14:51,240 --> 01:14:56,479 Speaker 3: particularly exciting? Well, he's been rich before and his money 1330 01:14:57,080 --> 01:14:58,080 Speaker 3: made him even richer. 1331 01:14:58,439 --> 01:14:59,200 Speaker 4: That's pretty good. 1332 01:14:59,520 --> 01:15:02,439 Speaker 3: He was at Y Combinator, don't ask too much about what 1333 01:15:02,560 --> 01:15:07,559 Speaker 3: happened there. Just feels like sometimes Silicon Valley can't wipe 1334 01:15:07,600 --> 01:15:11,400 Speaker 3: its own ass. It can't see when there's a wolf 1335 01:15:11,479 --> 01:15:15,120 Speaker 3: amongst the sheep. It can't see when someone isn't really 1336 01:15:15,240 --> 01:15:18,080 Speaker 3: part of the system other than finding new ways to 1337 01:15:18,160 --> 01:15:21,360 Speaker 3: manipulate and extract value from it. And Sam Altman is 1338 01:15:21,439 --> 01:15:25,240 Speaker 3: a monster created by Silicon Valley's sin, and their sin, 1339 01:15:25,479 --> 01:15:28,519 Speaker 3: by the way, is empowering and elevating those who don't 1340 01:15:28,800 --> 01:15:32,080 Speaker 3: build software, which in turn has led to the greater 1341 01:15:32,280 --> 01:15:35,400 Speaker 3: sin of allowing the tech industry to drift away from 1342 01:15:35,479 --> 01:15:40,120 Speaker 3: fixing the problems of actual human beings.
Sam Altman's manipulative 1343 01:15:40,200 --> 01:15:43,280 Speaker 3: little power plays have been so effective because so many 1344 01:15:43,400 --> 01:15:46,559 Speaker 3: of the power players in venture capital and the public markets, 1345 01:15:46,600 --> 01:15:49,880 Speaker 3: and even tech companies are disconnected from the process of 1346 01:15:50,000 --> 01:15:53,439 Speaker 3: building things, of building software and hardware, and that makes 1347 01:15:53,520 --> 01:15:57,559 Speaker 3: them incapable or perhaps unwilling to understand that Sam Altman 1348 01:15:57,800 --> 01:16:01,320 Speaker 3: is leading them to a deeply desolate place, and on 1349 01:16:01,439 --> 01:16:04,280 Speaker 3: some level it's kind of impressive how he succeeded in 1350 01:16:04,400 --> 01:16:07,320 Speaker 3: bending these fools to his whims, to the point that 1351 01:16:07,439 --> 01:16:09,880 Speaker 3: executives like Sundar Pichai of Google are willing 1352 01:16:09,960 --> 01:16:13,120 Speaker 3: to break Google Search in pursuit of this next big 1353 01:16:13,240 --> 01:16:18,400 Speaker 3: hype cycle created by Sam Altman. He might not create anything, 1354 01:16:18,479 --> 01:16:22,200 Speaker 3: but he's excellent at spotting market opportunities, even if these 1355 01:16:22,240 --> 01:16:26,679 Speaker 3: opportunities involve him transparently lying about the technology he creates, 1356 01:16:27,680 --> 01:16:31,880 Speaker 3: while having his nasty little boosters further propagate this bullshit, 1357 01:16:32,479 --> 01:16:35,800 Speaker 3: mostly because they don't know, or perhaps they don't care, 1358 01:16:36,160 --> 01:16:38,840 Speaker 3: if Sam Altman's full of shit. Maybe it doesn't matter 1359 01:16:38,880 --> 01:16:42,160 Speaker 3: to them. It doesn't matter that Google Search is still 1360 01:16:42,240 --> 01:16:46,960 Speaker 3: plagued with nonsensical AI answers that sometimes steal other people's work, 1361 01:16:47,240 --> 01:16:50,479 Speaker 3: or that AI in legal research has been proven to 1362 01:16:50,600 --> 01:16:53,360 Speaker 3: regularly hallucinate, which, by the way, is a problem that's 1363 01:16:53,520 --> 01:16:58,120 Speaker 3: impossible to fix. It's all happening because AI is the 1364 01:16:58,200 --> 01:17:00,280 Speaker 3: new thing that can be sold to the market, and 1365 01:17:00,360 --> 01:17:04,000 Speaker 3: it's all happening because Sam Altman, intentionally or otherwise, has 1366 01:17:04,080 --> 01:17:08,200 Speaker 3: created a totally hollow hype cycle. And all of this 1367 01:17:08,400 --> 01:17:11,040 Speaker 3: is thanks to Sam Altman and a tech industry that's 1368 01:17:11,080 --> 01:17:14,280 Speaker 3: lost its ability to create things worthy of an actual 1369 01:17:14,439 --> 01:17:17,599 Speaker 3: hype cycle, to the point that this specious, non-technical 1370 01:17:17,680 --> 01:17:21,880 Speaker 3: manipulator can lead it down this nasty, ugly, offensive anti 1371 01:17:22,120 --> 01:17:27,000 Speaker 3: tech path. The tech industry has spent years pissing off customers, 1372 01:17:27,120 --> 01:17:30,839 Speaker 3: with platforms like Facebook and Google actively making their products 1373 01:17:30,920 --> 01:17:34,240 Speaker 3: worse in the pursuit of perpetual growth and unashamedly turning 1374 01:17:34,280 --> 01:17:36,519 Speaker 3: their backs on the people that made them rich and 1375 01:17:36,600 --> 01:17:40,439 Speaker 3: acting with this horrifying contempt for their users.
And I 1376 01:17:40,520 --> 01:17:42,920 Speaker 3: believe the result will be that tech is going to 1377 01:17:43,000 --> 01:17:47,439 Speaker 3: face a harsh reprimand from society. As I mentioned in 1378 01:17:47,479 --> 01:17:50,840 Speaker 3: the rot com bubble, things are already falling apart. Web 1379 01:17:50,920 --> 01:17:55,000 Speaker 3: traffic is already dropping. And what sucks is the people 1380 01:17:55,080 --> 01:17:57,560 Speaker 3: around Sam Altman should have been able to see this, 1381 01:17:58,800 --> 01:18:03,479 Speaker 3: even putting aside his... I've listened to an alarming amount 1382 01:18:03,520 --> 01:18:06,960 Speaker 3: of Sam Altman talk, and I'm a public relations person, 1383 01:18:07,000 --> 01:18:09,080 Speaker 3: who the hell am I? I'm someone who's been around 1384 01:18:09,120 --> 01:18:11,320 Speaker 3: a lot of people who make shit up. I've been 1385 01:18:11,360 --> 01:18:13,720 Speaker 3: around a lot of people whose job it is to 1386 01:18:13,840 --> 01:18:19,519 Speaker 3: kind of obfuscate things, and quite frankly, Altman's really obvious. 1387 01:18:20,120 --> 01:18:22,519 Speaker 3: I'm not gonna do any weird Lie to Me-esque 1388 01:18:22,640 --> 01:18:26,000 Speaker 3: ways of proving he's lying. He just doesn't ever get 1389 01:18:26,080 --> 01:18:30,599 Speaker 3: pushed into any depth. No one ever asks him really 1390 01:18:30,760 --> 01:18:34,000 Speaker 3: technical questions or even just a question like, Hey, Sam, 1391 01:18:34,320 --> 01:18:36,479 Speaker 3: did you work on any of the code at OpenAI? 1392 01:18:36,800 --> 01:18:39,080 Speaker 4: What did you work on? Yeah? 1393 01:18:39,080 --> 01:18:41,920 Speaker 3: I know you can't talk about the future, Sam, but 1394 01:18:42,040 --> 01:18:44,599 Speaker 3: how close are we actually to AGI? And if he says, ah, 1395 01:18:44,600 --> 01:18:48,360 Speaker 3: a few years, that's not specific enough, Sam, how about 1396 01:18:48,439 --> 01:18:52,200 Speaker 3: you give me a ballpark? And then when he lies again, 1397 01:18:52,280 --> 01:18:55,080 Speaker 3: you say, okay, Sam, how do we get from generative 1398 01:18:55,120 --> 01:18:59,720 Speaker 3: AI to AGI? And when he starts waffling, say no, no, no, 1399 01:18:59,840 --> 01:19:04,439 Speaker 3: be specific, Sam. This is how you actually ask questions. 1400 01:19:04,520 --> 01:19:06,960 Speaker 3: And when you say things like this, by the way, 1401 01:19:07,040 --> 01:19:11,559 Speaker 3: to technical founders, they don't get worried. They don't obfuscate. 1402 01:19:11,840 --> 01:19:13,479 Speaker 3: They may say I can't talk about this due to 1403 01:19:13,560 --> 01:19:17,040 Speaker 3: legal things, which is fine, but they'll generally try and 1404 01:19:17,320 --> 01:19:21,120 Speaker 3: talk to you. Listen to any interview with any other 1405 01:19:21,600 --> 01:19:25,880 Speaker 3: technical AI person, listen to them, and then listen to 1406 01:19:25,960 --> 01:19:26,599 Speaker 3: Sam Altman. 1407 01:19:27,000 --> 01:19:27,800 Speaker 4: He's full of it.
1408 01:19:28,080 --> 01:19:31,760 Speaker 3: It's so obvious. And one deeply unfair thing with the 1409 01:19:31,880 --> 01:19:34,479 Speaker 3: Valley is there are people that get held to these 1410 01:19:34,520 --> 01:19:38,200 Speaker 3: standards. Early stage startups generally do, the ones that aren't 1411 01:19:38,280 --> 01:19:41,120 Speaker 3: handed to people like Altman or Alexis Ohanian of Reddit, 1412 01:19:41,320 --> 01:19:44,880 Speaker 3: or Paul Graham or Reid Hoffman. They don't get those 1413 01:19:45,000 --> 01:19:47,320 Speaker 3: chances because they're not saying the things that need to 1414 01:19:47,360 --> 01:19:52,080 Speaker 3: be said to the venture capitalists. They're not in the circles. 1415 01:19:52,800 --> 01:19:55,679 Speaker 3: They're not doing the right things, because the right things 1416 01:19:55,960 --> 01:19:58,960 Speaker 3: are no longer the right thing for the tech industry. 1417 01:20:00,800 --> 01:20:03,800 Speaker 3: And when all of this falls apart, Sam Altman's going 1418 01:20:03,880 --> 01:20:07,639 Speaker 3: to be fine. When this all collapses, he'll find something 1419 01:20:07,720 --> 01:20:11,240 Speaker 3: to blame it on, market forces, a lack of energy breakthroughs, 1420 01:20:11,720 --> 01:20:15,759 Speaker 3: unfortunate economic things, all of that nonsense, and he'll remain 1421 01:20:15,800 --> 01:20:19,360 Speaker 3: a billionaire, capable of doing anything he wants. The people 1422 01:20:19,439 --> 01:20:21,560 Speaker 3: that are going to suffer are the people working in 1423 01:20:21,640 --> 01:20:24,800 Speaker 3: Silicon Valley who aren't Sam Altman, the people that did 1424 01:20:24,880 --> 01:20:27,920 Speaker 3: not get born with a silver spoon in each hand 1425 01:20:27,960 --> 01:20:30,479 Speaker 3: and then handed further silver spoons as they walk the 1426 01:20:30,560 --> 01:20:34,360 Speaker 3: streets of San Francisco, people that don't live in nine 1427 01:20:34,400 --> 01:20:38,559 Speaker 3: and a half thousand square foot mansions, the people trying 1428 01:20:38,600 --> 01:20:40,880 Speaker 3: to raise money who can't right now because all the 1429 01:20:41,000 --> 01:20:45,720 Speaker 3: VCs are obsessed with AI, the people that will get 1430 01:20:45,800 --> 01:20:49,599 Speaker 3: fired from public tech companies when a depression hits, because 1431 01:20:49,640 --> 01:20:53,000 Speaker 3: the markets realize that the generative AI boom was a bubble, 1432 01:20:53,960 --> 01:20:56,639 Speaker 3: when they realize that the most famous people in tech 1433 01:20:56,880 --> 01:21:02,120 Speaker 3: have been making these promises for nobody other than the markets. Well, 1434 01:21:02,439 --> 01:21:04,840 Speaker 3: the markets need you to do something eventually, and I 1435 01:21:05,000 --> 01:21:08,439 Speaker 3: just don't think it's gonna happen. And I think that 1436 01:21:08,720 --> 01:21:11,840 Speaker 3: we need to really think why was Sam Altman allowed 1437 01:21:11,840 --> 01:21:14,759 Speaker 3: to get to this point? Why did so many people, 1438 01:21:15,160 --> 01:21:18,560 Speaker 3: like Paul Graham, like Reid Hoffman, like Brian Chesky, like 1439 01:21:18,680 --> 01:21:22,559 Speaker 3: Satya Nadella, back up this obvious con artist who has 1440 01:21:22,680 --> 01:21:27,120 Speaker 3: acted like this forever?
And what sucks is I don't 1441 01:21:27,120 --> 01:21:28,960 Speaker 3: know if the valley is going to learn anything unless 1442 01:21:28,960 --> 01:21:30,519 Speaker 3: it's really bad, and I don't want it to be, 1443 01:21:30,600 --> 01:21:32,519 Speaker 3: by the way. I would love to be wrong. I 1444 01:21:32,560 --> 01:21:34,439 Speaker 3: would love for all of this to just be like, 1445 01:21:34,520 --> 01:21:37,479 Speaker 3: Sam Altman's actually a genius, turns out the whole thing 1446 01:21:37,560 --> 01:21:39,679 Speaker 3: is... no, no, it's not gonna happen. 1447 01:21:41,000 --> 01:21:41,800 Speaker 4: And I worry that. 1448 01:21:42,520 --> 01:21:44,599 Speaker 3: There is no smooth way out of this, that there 1449 01:21:44,720 --> 01:21:49,040 Speaker 3: is no way to just casually integrate OpenAI with Microsoft, 1450 01:21:50,479 --> 01:21:53,280 Speaker 3: because now there's an antitrust thing going on with Microsoft 1451 01:21:53,360 --> 01:21:57,080 Speaker 3: acquiring Inflection AI, another AI company, and that's the thing. 1452 01:21:57,960 --> 01:22:01,720 Speaker 3: It feels like we're approaching a precipice here, 1453 01:22:03,160 --> 01:22:06,240 Speaker 3: and the only way to avoid it is for people 1454 01:22:06,320 --> 01:22:09,080 Speaker 3: to come clean, which is never going to happen, or 1455 01:22:09,200 --> 01:22:12,040 Speaker 3: of course for Sam Altman not to be lying, for 1456 01:22:12,280 --> 01:22:14,559 Speaker 3: AGI to actually come out of OpenAI. And by 1457 01:22:14,560 --> 01:22:15,720 Speaker 3: the way, it's going to need to be in the 1458 01:22:15,800 --> 01:22:19,599 Speaker 3: next year. I don't think they've got even three quarters left. 1459 01:22:20,320 --> 01:22:24,959 Speaker 3: I think that once this falls apart, once the markets realize, 1460 01:22:25,080 --> 01:22:28,440 Speaker 3: oh shit, this is not profitable, this is not sustainable, 1461 01:22:28,680 --> 01:22:31,639 Speaker 3: they're going to walk away from it. When companies realize 1462 01:22:31,680 --> 01:22:34,759 Speaker 3: that generative AI has given them a couple percent of profit, 1463 01:22:34,960 --> 01:22:39,800 Speaker 3: maybe, they're going to be pissed, because this is not 1464 01:22:40,280 --> 01:22:46,240 Speaker 3: a stock rally worthy boondoggle. This is not going to 1465 01:22:46,320 --> 01:22:49,719 Speaker 3: be pretty when things fall apart for Nvidia. 1466 01:22:49,840 --> 01:22:52,479 Speaker 3: You're still over one thousand dollars. When those orders stop 1467 01:22:52,640 --> 01:22:55,680 Speaker 3: coming in quite as fast, what do you think is 1468 01:22:55,760 --> 01:22:59,840 Speaker 3: going to happen to tech stocks? Startups are already having 1469 01:23:00,000 --> 01:23:03,160 Speaker 3: trouble raising money, and they're having trouble raising money because 1470 01:23:03,160 --> 01:23:06,880 Speaker 3: the people giving out the money are too disconnected from 1471 01:23:06,920 --> 01:23:10,000 Speaker 3: the creation of software and hardware. The only way to 1472 01:23:10,120 --> 01:23:14,960 Speaker 3: fix Silicon Valley perhaps is an apocalypse. Perhaps it's people 1473 01:23:15,120 --> 01:23:17,560 Speaker 3: like Sam Altman getting washed out. I don't want it 1474 01:23:17,640 --> 01:23:22,200 Speaker 3: to happen, I really must be bloody clear. But maybe 1475 01:23:22,240 --> 01:23:25,400 Speaker 3: it won't be apocalyptic. Maybe it would just be a 1476 01:23:25,560 --> 01:23:29,800 Speaker 3: brutal realignment.
And maybe Silicon Valley needs that realignment, because 1477 01:23:29,840 --> 01:23:35,280 Speaker 3: this industry desperately needs a big bath full of ice and 1478 01:23:35,400 --> 01:23:37,639 Speaker 3: needs to dunk its head in it aggressively and wake 1479 01:23:37,840 --> 01:23:42,360 Speaker 3: the hell up. Venture capital needs to put money back 1480 01:23:42,439 --> 01:23:46,040 Speaker 3: into real things. The largest tech companies need to realign 1481 01:23:46,120 --> 01:23:49,200 Speaker 3: and build for sustainability so they're not binging and purging 1482 01:23:49,320 --> 01:23:53,160 Speaker 3: staff with every boom. And if we really are at 1483 01:23:53,240 --> 01:23:57,160 Speaker 3: the end of the hypergrowth era, every tech company needs 1484 01:23:57,200 --> 01:23:59,760 Speaker 3: to be thinking profit and sustainability again. And that's a 1485 01:24:00,080 --> 01:24:03,200 Speaker 3: good thing for Silicon Valley, because a better Silicon Valley builds things 1486 01:24:03,240 --> 01:24:06,280 Speaker 3: for people, it solves real problems. It doesn't have to 1487 01:24:06,479 --> 01:24:08,759 Speaker 3: lie about what the thing could do in the future 1488 01:24:08,960 --> 01:24:11,920 Speaker 3: so that it can sell a thing today. And I 1489 01:24:12,000 --> 01:24:14,639 Speaker 3: realize that sounds like the foundation of most venture capital. 1490 01:24:15,479 --> 01:24:18,920 Speaker 3: That's fine at the seed stage, that's fine at this 1491 01:24:19,200 --> 01:24:22,519 Speaker 3: moonshot stage where you're in your early, early days. It is not 1492 01:24:22,760 --> 01:24:26,360 Speaker 3: befitting the most famous company in tech, it is not 1493 01:24:26,520 --> 01:24:31,240 Speaker 3: befitting a multi billionaire, it is not befitting anyone, and 1494 01:24:31,360 --> 01:24:34,080 Speaker 3: it is insulting to the people actually building things, both 1495 01:24:34,160 --> 01:24:35,640 Speaker 3: in and outside of technology. 1496 01:24:36,160 --> 01:24:37,639 Speaker 4: The people I hear from after. 1497 01:24:37,520 --> 01:24:41,559 Speaker 3: Every episode, they are angry, they are frustrated, because there 1498 01:24:41,640 --> 01:24:43,800 Speaker 3: are good people in tech. There are people building real things. 1499 01:24:43,840 --> 01:24:46,280 Speaker 3: There are people that remember a time when the tech 1500 01:24:46,320 --> 01:24:50,760 Speaker 3: industry was exciting, when people were talking about cool shit 1501 01:24:50,800 --> 01:24:54,040 Speaker 3: in the future, and then they'd actually do it. Returning 1502 01:24:54,120 --> 01:24:56,839 Speaker 3: to that is better for society and the tech industry. 1503 01:24:58,000 --> 01:25:07,640 Speaker 3: I just don't know when it's going to happen. Thank you 1504 01:25:07,680 --> 01:25:10,400 Speaker 3: for listening to Better Offline. The editor and composer of 1505 01:25:10,400 --> 01:25:13,400 Speaker 3: the Better Offline theme song is Matt Osowski. You can check 1506 01:25:13,439 --> 01:25:16,360 Speaker 3: out more of his music and audio projects at mattosowski 1507 01:25:16,400 --> 01:25:17,519 Speaker 3: dot com, M 1508 01:25:17,560 --> 01:25:21,639 Speaker 4: A T T O S O W S K I dot com. 1509 01:25:22,400 --> 01:25:24,920 Speaker 3: You can email me at ez at betteroffline dot com, 1510 01:25:25,120 --> 01:25:27,400 Speaker 3: or visit betteroffline dot com to find more podcast 1511 01:25:27,479 --> 01:25:30,799 Speaker 3: links and of course my newsletter.
I also really recommend 1512 01:25:30,800 --> 01:25:32,760 Speaker 3: you go to chat dot wheresyoured dot at to 1513 01:25:32,840 --> 01:25:35,639 Speaker 3: visit the Discord and go to r slash betteroffline 1514 01:25:35,680 --> 01:25:38,880 Speaker 3: to check out our Reddit. Thank you so much for listening. 1515 01:25:39,720 --> 01:25:42,400 Speaker 5: Better Offline is a production of cool Zone Media. For 1516 01:25:42,560 --> 01:25:45,719 Speaker 5: more from cool Zone Media, visit our website coolzonemedia 1517 01:25:45,760 --> 01:25:48,599 Speaker 5: dot com, or check us out on the iHeartRadio app, 1518 01:25:48,640 --> 01:25:51,040 Speaker 5: Apple Podcasts, or wherever you get your podcasts. 1519 01:26:07,160 --> 01:26:09,600 Speaker 3: Hello, and welcome to Better Offline. I'm your host, Ed 1520 01:26:09,720 --> 01:26:23,479 Speaker 3: Zitron. It's been a hard couple of weeks. It's 1521 01:26:23,560 --> 01:26:27,080 Speaker 3: been pretty hard to focus. I've written a few newsletters, 1522 01:26:27,160 --> 01:26:29,400 Speaker 3: I've gone to Portugal, I've done a bunch of shit. 1523 01:26:29,760 --> 01:26:32,759 Speaker 3: Just trying not to think about everything happening outside. 1524 01:26:32,880 --> 01:26:34,519 Speaker 4: But it's time to do so. 1525 01:26:35,840 --> 01:26:38,600 Speaker 3: Seemingly every single person on Earth with a blog or 1526 01:26:38,600 --> 01:26:41,479 Speaker 3: a podcast, or even a Twitter account, or X, the everything app, 1527 01:26:41,560 --> 01:26:43,920 Speaker 3: whatever it's called now, they've all tried to drill down 1528 01:26:43,960 --> 01:26:47,360 Speaker 3: into what happened on November fifth, to find the people 1529 01:26:47,400 --> 01:26:51,760 Speaker 3: to blame, to explain what could have gone differently. Really 1530 01:26:51,840 --> 01:26:54,519 Speaker 3: looking for who to blame, though, and to find out why 1531 01:26:54,680 --> 01:26:57,360 Speaker 3: so many actions led to a result that, well, overwhelmingly 1532 01:26:57,400 --> 01:27:02,040 Speaker 3: harms women, minorities, immigrants, LGBTQ people, and lower income workers, 1533 01:27:03,280 --> 01:27:06,360 Speaker 3: is terrifying. It fucking sucks. I'm not going to mince words, 1534 01:27:06,600 --> 01:27:09,920 Speaker 3: not that I would usually anyway, and I don't feel 1535 01:27:09,960 --> 01:27:12,439 Speaker 3: fully equipped to respond to the moment. I don't have 1536 01:27:13,080 --> 01:27:16,040 Speaker 3: any real answers, at least not political ones. I'm not 1537 01:27:16,080 --> 01:27:19,799 Speaker 3: a political analyst, and I'd feel disingenuous trying to dissect 1538 01:27:19,840 --> 01:27:22,680 Speaker 3: either the Harris or the Trump campaigns, because I just 1539 01:27:22,800 --> 01:27:24,920 Speaker 3: feel like there's a take Olympics right now. It's the 1540 01:27:25,000 --> 01:27:28,400 Speaker 3: Dunning Kruger Festival out there. Everyone is trying to rationalize 1541 01:27:28,439 --> 01:27:32,559 Speaker 3: and intellectualize these events that ultimately come down to something 1542 01:27:32,640 --> 01:27:37,439 Speaker 3: quite simple. People don't trust authority, and yet it's pretty 1543 01:27:37,520 --> 01:27:42,000 Speaker 3: ironic that this often leads them towards authoritarianism.
Now, I 1544 01:27:42,000 --> 01:27:43,559 Speaker 3: don't want to give you the impression that I'm going 1545 01:27:43,640 --> 01:27:46,719 Speaker 3: to go into my crank mode and that I'm somehow against 1546 01:27:46,800 --> 01:27:48,200 Speaker 3: institutions on their face. 1547 01:27:48,320 --> 01:27:48,560 Speaker 4: I'm not. 1548 01:27:48,840 --> 01:27:52,240 Speaker 3: But at the same time, understanding this moment requires us 1549 01:27:52,280 --> 01:27:55,840 Speaker 3: to acknowledge that institutions have failed us and failed most people, 1550 01:27:56,200 --> 01:27:59,799 Speaker 3: and how certain institutions' missteps have led us to exactly 1551 01:27:59,840 --> 01:28:04,000 Speaker 3: where we are today. Legacy media, while oftentimes they're 1552 01:28:04,000 --> 01:28:06,080 Speaker 3: staffed by people who truly love their readers and care 1553 01:28:06,120 --> 01:28:09,680 Speaker 3: about their beats, they're weighed down by this hysterical, nonsensical 1554 01:28:09,720 --> 01:28:13,479 Speaker 3: attachment to the imaginary concept of objectivity and the will 1555 01:28:13,520 --> 01:28:17,880 Speaker 3: of the markets. Case in point, regular people have spent 1556 01:28:18,080 --> 01:28:20,960 Speaker 3: years watching the price of goods increase due to inflation, 1557 01:28:21,360 --> 01:28:23,679 Speaker 3: despite the fact that the increase in pricing was mostly 1558 01:28:23,800 --> 01:28:28,240 Speaker 3: driven by, get this, corporations raising their prices. Now, that's 1559 01:28:28,320 --> 01:28:30,320 Speaker 3: not to say that external factors like the war in 1560 01:28:30,400 --> 01:28:34,160 Speaker 3: Ukraine or lingering COVID restrictions in China, these things did 1561 01:28:34,360 --> 01:28:37,920 Speaker 3: play a role in it. They did. But the bulk 1562 01:28:38,000 --> 01:28:42,000 Speaker 3: of these price increases were caused by these fucking companies 1563 01:28:42,120 --> 01:28:45,320 Speaker 3: raising the prices. It was in their earnings. 1564 01:28:45,680 --> 01:28:46,920 Speaker 4: It was right there. 1565 01:28:47,160 --> 01:28:50,679 Speaker 3: Pepsi Cola said it on the news. Yet some parts 1566 01:28:50,720 --> 01:28:53,360 Speaker 3: of the legacy media spent an alarming amount of time 1567 01:28:53,479 --> 01:28:56,680 Speaker 3: chiding their readers for thinking otherwise, even going against their 1568 01:28:56,720 --> 01:28:59,719 Speaker 3: own reporting, and there will be links in the episode 1569 01:28:59,760 --> 01:29:03,920 Speaker 3: notes, I promise, as a means of providing balanced coverage, insisting 1570 01:29:04,000 --> 01:29:07,840 Speaker 3: again and again that the economy is actually good, contorting 1571 01:29:07,920 --> 01:29:10,679 Speaker 3: their little bodies to prove that prices aren't actually higher, 1572 01:29:10,760 --> 01:29:14,040 Speaker 3: even as companies literally boasted about raising their prices on earnings. 1573 01:29:14,520 --> 01:29:17,920 Speaker 3: In fact, the media spent years debating with itself whether 1574 01:29:17,920 --> 01:29:20,720 Speaker 3: the price gouging was actually happening, despite years of proof 1575 01:29:20,800 --> 01:29:23,439 Speaker 3: that it was. Some of them even reported that the 1576 01:29:23,520 --> 01:29:27,920 Speaker 3: price gouging was happening. So like, get this.
I just 1577 01:29:27,960 --> 01:29:31,200 Speaker 3: don't think people trust authority, and they especially don't trust 1578 01:29:31,240 --> 01:29:35,519 Speaker 3: the media, especially the legacy media. It also probably didn't 1579 01:29:35,600 --> 01:29:38,120 Speaker 3: help that the legacy media implored readers and viewers to 1580 01:29:38,200 --> 01:29:40,640 Speaker 3: ignore what they saw at the supermarket or at the 1581 01:29:40,720 --> 01:29:42,720 Speaker 3: pump and the growing hits to their wallets from the 1582 01:29:42,800 --> 01:29:47,000 Speaker 3: daily necessities of life. It was just national level 1583 01:29:47,120 --> 01:29:49,920 Speaker 3: gaslighting and it was disgusting. And I know some 1584 01:29:50,080 --> 01:29:53,280 Speaker 3: of you might say, you know where to email me. 1585 01:29:53,640 --> 01:29:54,560 Speaker 4: Oh, it's not just this. 1586 01:29:54,800 --> 01:29:57,479 Speaker 3: No, of course it's not just this, asshole, but I 1587 01:29:57,560 --> 01:29:59,920 Speaker 3: think this is a big thing. Now, before I go 1588 01:30:00,160 --> 01:30:03,320 Speaker 3: any further, I've used the term legacy media here repeatedly, 1589 01:30:03,600 --> 01:30:05,960 Speaker 3: but I don't completely intend for it to come across 1590 01:30:06,000 --> 01:30:09,519 Speaker 3: as a pejorative despite my criticism. Believe me, I've got 1591 01:30:09,520 --> 01:30:12,080 Speaker 3: a few of them. There are people in the legacy 1592 01:30:12,120 --> 01:30:14,280 Speaker 3: media doing a good job. They're reporting the truth, they're 1593 01:30:14,280 --> 01:30:16,200 Speaker 3: doing the kinds of work that matters, and they're actually 1594 01:30:16,240 --> 01:30:18,599 Speaker 3: trying to teach their readers stuff and tell them what's 1595 01:30:18,640 --> 01:30:22,240 Speaker 3: happening and giving them context. I read and pay for 1596 01:30:22,520 --> 01:30:24,720 Speaker 3: several legacy media outlets. I think the world is a 1597 01:30:24,760 --> 01:30:28,439 Speaker 3: better place for them existing despite their flaws. The problem, 1598 01:30:28,560 --> 01:30:33,040 Speaker 3: as I'll explain, is this editorial industrial complex and how 1599 01:30:33,560 --> 01:30:36,040 Speaker 3: these people writing about the powerful don't seem to 1600 01:30:36,120 --> 01:30:38,280 Speaker 3: be able to, or maybe they don't want to, actually 1601 01:30:38,360 --> 01:30:42,200 Speaker 3: interrogate the powerful. This could be an entire episode on 1602 01:30:42,280 --> 01:30:44,000 Speaker 3: its own, but I don't think the answer to these 1603 01:30:44,040 --> 01:30:46,639 Speaker 3: failings is to simply discard legacy media entirely. 1604 01:30:47,520 --> 01:30:48,000 Speaker 2: But I want to. 1605 01:30:48,040 --> 01:30:50,680 Speaker 3: Implore them to do better and to strive for the 1606 01:30:50,760 --> 01:30:54,280 Speaker 3: values of truth hunting and truth telling and actually explaining 1607 01:30:54,360 --> 01:30:57,880 Speaker 3: what's happening and criticizing the people that don't have PR 1608 01:30:58,040 --> 01:31:01,679 Speaker 3: firms and lobbying groups, lawyers and the means to protect 1609 01:31:01,720 --> 01:31:06,200 Speaker 3: themselves from the world. The time for fucking around is 1610 01:31:06,320 --> 01:31:12,240 Speaker 3: over and we're currently finding out now.
Anyway, as you know, 1611 01:31:12,400 --> 01:31:14,599 Speaker 3: as a person existing in the real world, the price 1612 01:31:14,640 --> 01:31:17,360 Speaker 3: of everything has kept increasing despite the fact that wages 1613 01:31:17,400 --> 01:31:19,920 Speaker 3: are stagnating. It's forcing many of the poorest people to 1614 01:31:20,000 --> 01:31:23,759 Speaker 3: choose between food and fuel, or I don't know, eating 1615 01:31:23,920 --> 01:31:28,360 Speaker 3: and having heat simultaneously. Businesses have spent several years telling 1616 01:31:28,439 --> 01:31:30,599 Speaker 3: workers they're asking for too much and doing too little, 1617 01:31:30,800 --> 01:31:33,040 Speaker 3: telling people a few years ago they were quiet quitting, 1618 01:31:33,439 --> 01:31:35,840 Speaker 3: which is a fucking stupid term that just means going 1619 01:31:35,880 --> 01:31:37,240 Speaker 3: to your job and doing the thing you're paid to 1620 01:31:37,280 --> 01:31:40,920 Speaker 3: do anyway. And a year later, in twenty twenty three, 1621 01:31:41,000 --> 01:31:44,320 Speaker 3: they insisted that the years of remote work were actually 1622 01:31:44,560 --> 01:31:47,680 Speaker 3: bad because profits didn't reach the same profit levels of 1623 01:31:47,760 --> 01:31:50,920 Speaker 3: twenty twenty one, which was something to do with remote work. Now, 1624 01:31:51,000 --> 01:31:53,479 Speaker 3: did anyone actually prove this, did anyone actually check? 1625 01:31:53,439 --> 01:31:54,000 Speaker 4: No, they didn't. 1626 01:31:54,040 --> 01:31:56,840 Speaker 3: They just went, well, I just listened to Mark Benioff, who's 1627 01:31:57,080 --> 01:32:00,040 Speaker 3: one of the more evil people alive. Now, also, 1628 01:32:00,120 --> 01:32:02,320 Speaker 3: I think a lot of these problems come down to twenty 1629 01:32:02,400 --> 01:32:04,400 Speaker 3: twenty one, a year that we really need to dig 1630 01:32:04,479 --> 01:32:06,400 Speaker 3: into more. We might not do so today, but we 1631 01:32:06,479 --> 01:32:10,000 Speaker 3: will in the future. But one of the big things 1632 01:32:10,040 --> 01:32:12,000 Speaker 3: that punished workers and led to so many layoffs in 1633 01:32:12,040 --> 01:32:14,000 Speaker 3: twenty twenty three was the fact that we couldn't get 1634 01:32:14,080 --> 01:32:16,559 Speaker 3: back to the post lockdown boom of twenty twenty one, 1635 01:32:16,760 --> 01:32:19,960 Speaker 3: when everyone bought everything always as they left the house 1636 01:32:20,040 --> 01:32:23,680 Speaker 3: for the first time in a while. Now, any corporation 1637 01:32:25,000 --> 01:32:27,479 Speaker 3: would be smart enough to know that that was a phase, 1638 01:32:27,560 --> 01:32:31,080 Speaker 3: that that was not going to be forever. Except every 1639 01:32:31,560 --> 01:32:33,960 Speaker 3: single big company seemed to make the same mistake and 1640 01:32:34,000 --> 01:32:37,000 Speaker 3: say number going up forever, line go up forever. When 1641 01:32:37,040 --> 01:32:40,760 Speaker 3: it didn't, well, they started punishing workers and they started thinking, well, 1642 01:32:41,320 --> 01:32:44,360 Speaker 3: could it be that we, as companies, set unrealistic 1643 01:32:44,520 --> 01:32:48,360 Speaker 3: expectations for the markets and we just thought that we'd 1644 01:32:48,400 --> 01:32:50,439 Speaker 3: keep growing forever? Or maybe it was the people using 1645 01:32:50,479 --> 01:32:56,000 Speaker 3: the computer at home.
Yeah, that seems way better. Anyway, well, 1646 01:32:56,040 --> 01:32:58,600 Speaker 3: the majority of people don't work remotely. From talking to 1647 01:32:58,640 --> 01:33:00,800 Speaker 3: the people I know outside of the tech business, there's this 1648 01:33:00,960 --> 01:33:03,799 Speaker 3: genuine sense that the media has allied itself with the bosses, 1649 01:33:04,000 --> 01:33:05,920 Speaker 3: and I imagine it's because of the many articles that 1650 01:33:06,000 --> 01:33:08,960 Speaker 3: literally call workers lazy and have done so for years. 1651 01:33:09,360 --> 01:33:12,160 Speaker 3: Yet when it comes to the powerful, legacy media doesn't 1652 01:33:12,200 --> 01:33:14,640 Speaker 3: seem to have that much piss and vinegar. They just have 1653 01:33:14,920 --> 01:33:19,599 Speaker 3: much more guarded critiques. The appetite for shaming and finger wagging 1654 01:33:19,800 --> 01:33:23,080 Speaker 3: is always directed at middle and working class workers and 1655 01:33:23,240 --> 01:33:25,920 Speaker 3: seemingly disappears when a person has a three character job 1656 01:33:26,000 --> 01:33:30,120 Speaker 3: title like CEO. It's fucking stupid, it's insulting, and yes, 1657 01:33:30,160 --> 01:33:33,320 Speaker 3: it's demoralizing for the average person, despite the fact that 1658 01:33:33,360 --> 01:33:36,120 Speaker 3: Elon Musk has spent years telegraphing his intent to use his 1659 01:33:36,160 --> 01:33:38,360 Speaker 3: billions of dollars to wield power equivalent to that of 1660 01:33:38,400 --> 01:33:40,479 Speaker 3: a nation state. As you may remember from my first 1661 01:33:40,520 --> 01:33:42,960 Speaker 3: episode of anything, over on It Could Happen Here, 1662 01:33:43,760 --> 01:33:45,280 Speaker 4: too much of the media, both 1663 01:33:45,200 --> 01:33:48,599 Speaker 3: legacy and otherwise, responded slowly, cautiously, failing to call him 1664 01:33:48,600 --> 01:33:51,040 Speaker 3: a liar, a con artist, an aggressor, a manipulator, 1665 01:33:51,120 --> 01:33:53,320 Speaker 3: a racist, a deadbeat dad, you know, all the things 1666 01:33:53,320 --> 01:33:54,080 Speaker 3: actually happening. 1667 01:33:54,400 --> 01:33:54,880 Speaker 4: No, no, no. 1668 01:33:55,920 --> 01:33:58,640 Speaker 3: They kind of danced around him. They reported stories that 1669 01:33:58,760 --> 01:34:01,880 Speaker 3: might make you think they maybe noticed it, but there 1670 01:34:01,960 --> 01:34:06,479 Speaker 3: was this desperation to guard objectivity, and it was just... 1671 01:34:07,800 --> 01:34:10,920 Speaker 3: it lacked any real intent. It lacked any interest in 1672 01:34:11,040 --> 01:34:14,320 Speaker 3: calling to account a man who has pretty much bought 1673 01:34:14,360 --> 01:34:17,640 Speaker 3: an election for Donald Trump. A racist billionaire using his 1674 01:34:17,720 --> 01:34:20,360 Speaker 3: outsized capital to bend society to his will just isn't 1675 01:34:20,400 --> 01:34:22,479 Speaker 3: a fucking problem for the media, or at least not 1676 01:34:22,560 --> 01:34:24,559 Speaker 3: as much of a problem as a worker who might 1677 01:34:24,680 --> 01:34:26,880 Speaker 3: not work fifty to one hundred hours a week for 1678 01:34:26,960 --> 01:34:28,640 Speaker 3: a boss who makes one hundred and thirty times what 1679 01:34:28,760 --> 01:34:32,000 Speaker 3: they do.
The news, at least outside of the right wing, 1680 01:34:32,600 --> 01:34:37,160 Speaker 3: is always separate from opinion, always guarded, always safe, for 1681 01:34:37,280 --> 01:34:40,120 Speaker 3: fear that they might piss somebody off and be declared biased, 1682 01:34:40,680 --> 01:34:44,280 Speaker 3: something that happens anyway. And while columnists are 1683 01:34:44,320 --> 01:34:47,519 Speaker 3: given some space to have their own thoughts, sometimes in 1684 01:34:47,600 --> 01:34:51,240 Speaker 3: the newspaper, sometimes online, the stories themselves are delivered with 1685 01:34:51,320 --> 01:34:54,360 Speaker 3: the kind of reserved hmmmm tone that often fails to 1686 01:34:54,479 --> 01:34:58,160 Speaker 3: express any actual consequences or context around the news itself, 1687 01:34:59,080 --> 01:35:01,800 Speaker 3: and just doesn't seem to care about making sure 1688 01:35:01,920 --> 01:35:05,719 Speaker 3: that the reader or listener learns something. My mate Casey 1689 01:35:05,760 --> 01:35:08,080 Speaker 3: has a good point about podcasts, and I apply it to some 1690 01:35:08,200 --> 01:35:10,320 Speaker 3: of the news too, that there's too much stuff out 1691 01:35:10,320 --> 01:35:13,240 Speaker 3: there that is there to make you feel intelligent rather 1692 01:35:13,280 --> 01:35:14,200 Speaker 3: than make you intelligent. 1693 01:35:14,479 --> 01:35:15,599 Speaker 4: I think this falls into that. 1694 01:35:16,479 --> 01:35:16,599 Speaker 5: Now. 1695 01:35:16,680 --> 01:35:18,720 Speaker 3: This isn't to say that outlets are incapable of doing 1696 01:35:18,760 --> 01:35:20,880 Speaker 3: this correctly. I love the Washington Post. They've done an 1697 01:35:20,960 --> 01:35:24,680 Speaker 3: excellent job of analyzing major tech stories. But a lot 1698 01:35:24,760 --> 01:35:27,240 Speaker 3: of these outlets feel custom built to be bulldozed the 1699 01:35:27,280 --> 01:35:30,960 Speaker 3: moment an authoritarian turns up, this force that exists to 1700 01:35:31,040 --> 01:35:35,479 Speaker 3: crush those desperately attached to norms and objectivity. Authoritarians know 1701 01:35:35,600 --> 01:35:38,720 Speaker 3: that their ideologically charged words will be quoted verbatim, with 1702 01:35:38,800 --> 01:35:39,840 Speaker 3: the occasional 1703 01:35:39,600 --> 01:35:43,320 Speaker 4: ah, this could mean, little dribble, this drizzle, this 1704 01:35:43,360 --> 01:35:46,040 Speaker 3: spunk of context that's lost in the headline that repeats 1705 01:35:46,120 --> 01:35:49,040 Speaker 3: exactly what the fucking authoritarian wants them to. 1706 01:35:49,400 --> 01:35:50,000 Speaker 4: And guess what. 1707 01:35:50,640 --> 01:35:54,200 Speaker 3: Some people don't read the article, they just read the headline. 1708 01:35:54,520 --> 01:35:56,880 Speaker 3: And Musk is the most brutal example of this.
By 1709 01:35:56,920 --> 01:35:59,479 Speaker 3: the way, despite the fact that he's turned Twitter into 1710 01:35:59,520 --> 01:36:02,360 Speaker 3: a website pumped full of racism and hatred that literally 1711 01:36:02,439 --> 01:36:05,360 Speaker 3: helped make Donald Trump president, Musk was still able to 1712 01:36:05,400 --> 01:36:08,120 Speaker 3: get mostly positive coverage from the majority of the mainstream 1713 01:36:08,200 --> 01:36:12,040 Speaker 3: media for his fucking robotaxi nonsense, despite the fact that 1714 01:36:12,080 --> 01:36:14,479 Speaker 3: he spent the best part of a decade lying about 1715 01:36:14,520 --> 01:36:17,600 Speaker 3: what Tesla will do next. There are entire websites just 1716 01:36:17,800 --> 01:36:21,080 Speaker 3: based on how much Elon Musk lies, yet they still 1717 01:36:21,400 --> 01:36:25,080 Speaker 3: report this shit. It makes me very upset, and it 1718 01:36:25,160 --> 01:36:27,920 Speaker 3: doesn't matter that some of these outlets, by the way, 1719 01:36:28,000 --> 01:36:30,519 Speaker 3: had accompanying coverage that suggested that the markets 1720 01:36:30,560 --> 01:36:34,800 Speaker 3: weren't impressed by Tesla's theoretical robotaxi plans or their fake 1721 01:36:34,880 --> 01:36:36,240 Speaker 3: ass robots run by people. 1722 01:36:36,760 --> 01:36:38,520 Speaker 4: Musk is still able 1723 01:36:38,400 --> 01:36:41,640 Speaker 3: to use the media's desperation for objectivity against them, and 1724 01:36:41,760 --> 01:36:44,960 Speaker 3: he knows that they never dare to combine reporting on 1725 01:36:45,120 --> 01:36:48,280 Speaker 3: stuff with thinking about stuff for fear that Elon Musk 1726 01:36:48,400 --> 01:36:51,479 Speaker 3: might say they're biased, which he has been doing for years. 1727 01:36:51,840 --> 01:36:55,200 Speaker 3: Do you see my goddamn point yet? And this, by 1728 01:36:55,240 --> 01:36:56,880 Speaker 3: the way, is not always the fault of the writers. 1729 01:36:57,200 --> 01:37:01,200 Speaker 3: There are entire foundations of editors that have more faith 1730 01:37:01,240 --> 01:37:03,040 Speaker 3: in the markets and the powerful than they do the 1731 01:37:03,120 --> 01:37:05,800 Speaker 3: people writing or the people reading their fucking words. And 1732 01:37:05,880 --> 01:37:09,599 Speaker 3: above them are entire editorial superstructures that exist to make 1733 01:37:09,640 --> 01:37:13,320 Speaker 3: sure that the editorial vision never colors too far outside 1734 01:37:13,320 --> 01:37:16,680 Speaker 3: the lines or informs people a little too much. And I'm 1735 01:37:16,840 --> 01:37:19,559 Speaker 3: not even talking about Jeff Bezos or Laurene Powell Jobs 1736 01:37:19,640 --> 01:37:21,360 Speaker 3: or any number of billionaires who own any number 1737 01:37:21,400 --> 01:37:24,719 Speaker 3: of publications, but the editors editing business and tech reporters 1738 01:37:24,720 --> 01:37:27,840 Speaker 3: who don't know anything about business and tech, or the 1739 01:37:27,920 --> 01:37:31,280 Speaker 3: senior editors that are terrified of any byline that might dare 1740 01:37:31,320 --> 01:37:33,680 Speaker 3: get the outlet under fire from somebody who could call 1741 01:37:33,720 --> 01:37:49,920 Speaker 3: their boss. It's fucking cowardice.
There are, however, I should add, 1742 01:37:50,000 --> 01:37:52,400 Speaker 3: also those who simply defer to the powerful, that assume 1743 01:37:52,439 --> 01:37:55,519 Speaker 3: that this much money can't be wrong, even if said money, 1744 01:37:55,640 --> 01:37:58,320 Speaker 3: in the case of Elon Musk, is repeatedly wrong, and 1745 01:37:58,360 --> 01:38:00,439 Speaker 3: there's an entire website about the wrongness and the 1746 01:38:00,520 --> 01:38:03,959 Speaker 3: lies and the bullshit, and I'm talking about Elon Musk still, obviously. 1747 01:38:04,600 --> 01:38:06,400 Speaker 3: These editors are the people that look at the current 1748 01:38:06,439 --> 01:38:08,639 Speaker 3: crop of powerful tech companies that have failed to deliver 1749 01:38:08,760 --> 01:38:11,120 Speaker 3: any truly meaningful innovation in years, and they 1750 01:38:11,080 --> 01:38:12,639 Speaker 4: go ooh, oh, show me more, 1751 01:38:12,720 --> 01:38:15,800 Speaker 3: daddy, show me more of the apps. It's fucking disgraceful. 1752 01:38:16,200 --> 01:38:17,840 Speaker 3: Just look at the coverage of Sam Altman from the 1753 01:38:17,920 --> 01:38:20,080 Speaker 3: last year. You know, the guy who spent years lying 1754 01:38:20,120 --> 01:38:22,880 Speaker 3: about what AI can do, and tell me why every 1755 01:38:23,000 --> 01:38:26,559 Speaker 3: single thought he says must be uncritically catalogued, his every 1756 01:38:26,640 --> 01:38:30,240 Speaker 3: goddamn decision applauded, his every claim trumpeted as certain, his 1757 01:38:30,400 --> 01:38:33,200 Speaker 3: brittle little company that burns five billion dollars a year 1758 01:38:33,560 --> 01:38:37,400 Speaker 3: talked about like it's a fucking living god. Sam Altman 1759 01:38:37,520 --> 01:38:40,160 Speaker 3: is a liar who's been fired from two companies, including 1760 01:38:40,200 --> 01:38:42,360 Speaker 3: OpenAI, and yet because he's a billionaire with a 1761 01:38:42,400 --> 01:38:45,600 Speaker 3: buzzy company, he's left totally unscathed. The powerful get a 1762 01:38:45,640 --> 01:38:47,960 Speaker 3: completely different set of rules to live by and exist 1763 01:38:48,040 --> 01:38:51,880 Speaker 3: in a totally different media environment. They're geniuses, entrepreneurs, firebrands. 1764 01:38:52,000 --> 01:38:54,519 Speaker 3: Their challenges are framed as missteps and their victories framed 1765 01:38:54,560 --> 01:38:56,920 Speaker 3: as certainties by the same outlets that told us that 1766 01:38:57,000 --> 01:38:59,559 Speaker 3: we were quiet quitting and that the economy is actually good, 1767 01:38:59,720 --> 01:39:02,720 Speaker 3: and that we're the problem for high prices. While it's 1768 01:39:02,800 --> 01:39:06,000 Speaker 3: correct to suggest that the right wing is horrendously ideological 1769 01:39:06,240 --> 01:39:08,760 Speaker 3: and terribly biased, it's very hard to look at 1770 01:39:08,800 --> 01:39:10,479 Speaker 3: the rest of the media and claim that they are not. 1771 01:39:11,120 --> 01:39:14,000 Speaker 3: The problem is that the so called left media, which 1772 01:39:14,080 --> 01:39:17,040 Speaker 3: usually is just the center, isn't biased towards what we 1773 01:39:17,120 --> 01:39:20,599 Speaker 3: may consider left wing causes like universal health care, strong unions, 1774 01:39:20,960 --> 01:39:22,720 Speaker 3: expanded social safety nets, you know, the stuff that 1775 01:39:22,760 --> 01:39:25,080 Speaker 3: would actually be helpful.
No, they're biased in favor of 1776 01:39:25,160 --> 01:39:29,240 Speaker 3: fellating an ever growing carousel of sociopathic billionaire assholes, elevating 1777 01:39:29,320 --> 01:39:31,800 Speaker 3: them to the status of American royalty, where they exist 1778 01:39:31,880 --> 01:39:35,519 Speaker 3: above expectations and norms that you and I must live by. 1779 01:39:36,320 --> 01:39:40,920 Speaker 3: This is the definition of elitism. The media has literally 1780 01:39:41,040 --> 01:39:43,639 Speaker 3: created a class of people who can lie and cheat 1781 01:39:43,880 --> 01:39:47,280 Speaker 3: and steal, and rather than condemn them for it, they're celebrated. 1782 01:39:48,320 --> 01:39:50,919 Speaker 3: While it might feel a little tangential to bring technology 1783 01:39:50,960 --> 01:39:54,559 Speaker 3: into this, I truly believe that everybody is affected by 1784 01:39:54,560 --> 01:39:58,040 Speaker 3: the rot economy, the growth-at-all-costs ecosystem where number 1785 01:39:58,120 --> 01:40:01,559 Speaker 3: must always go up, because everybody is using technology all 1786 01:40:01,640 --> 01:40:04,679 Speaker 3: the time, and the technology in question is getting worse. 1787 01:40:05,520 --> 01:40:08,479 Speaker 3: This election cycle saw more than twenty five billion text 1788 01:40:08,520 --> 01:40:11,720 Speaker 3: messages sent to potential voters, and seemingly every website was 1789 01:40:11,720 --> 01:40:15,400 Speaker 3: crammed full of random election advertising. Here's the thing about elections. 1790 01:40:15,600 --> 01:40:19,560 Speaker 3: They're not really always about policy. No, they're a referendum 1791 01:40:19,640 --> 01:40:22,360 Speaker 3: on the incumbent party or president, and by proxy, a 1792 01:40:22,439 --> 01:40:25,400 Speaker 3: poll on how people feel. And the reality is that 1793 01:40:25,520 --> 01:40:29,519 Speaker 3: most people are fucking miserable. There's this all encompassing feeling 1794 01:40:29,600 --> 01:40:32,280 Speaker 3: that things are just harder now. It's harder to pay 1795 01:40:32,320 --> 01:40:34,519 Speaker 3: your bills, it's harder to keep in touch with your friends. 1796 01:40:34,760 --> 01:40:36,800 Speaker 3: It's harder to start a family, it's harder to buy 1797 01:40:36,840 --> 01:40:38,679 Speaker 3: a house, it's harder to fall in love, it's harder 1798 01:40:38,720 --> 01:40:41,800 Speaker 3: to do everything. And what we're seeing is an 1799 01:40:41,840 --> 01:40:43,440 Speaker 3: enshittification of existence. 1800 01:40:43,520 --> 01:40:45,400 Speaker 4: To use Mister Doctorow's phrase. 1801 01:40:45,680 --> 01:40:47,840 Speaker 3: Everything just, I don't want to be this much of 1802 01:40:47,880 --> 01:40:51,479 Speaker 3: a curmudgeon, but everything just kind of sucks. It's all terrible, 1803 01:40:51,520 --> 01:40:53,880 Speaker 3: it's miserable, and hardly anyone thinks it's going 1804 01:40:53,800 --> 01:40:54,320 Speaker 4: to get better. 1805 01:40:54,720 --> 01:40:57,120 Speaker 3: And this creates the kind of fertile conditions for a 1806 01:40:57,160 --> 01:41:00,360 Speaker 3: strongman to emerge, one who arises and says 1807 01:41:00,360 --> 01:41:02,479 Speaker 3: that only he can fix things, even if he spent 1808 01:41:02,560 --> 01:41:05,800 Speaker 3: four years proving how he could not.
And the problem 1809 01:41:05,880 --> 01:41:08,240 Speaker 3: for Democrats and for institutions more broadly is that the 1810 01:41:08,320 --> 01:41:11,000 Speaker 3: all encompassing nature of this milieu is kind of hard 1811 01:41:11,040 --> 01:41:13,599 Speaker 3: to solve. It's hard to change the perception that everything's 1812 01:41:13,680 --> 01:41:15,439 Speaker 3: terrible when you're reminded of it when you're trying to 1813 01:41:15,520 --> 01:41:18,040 Speaker 3: do the most basic of tasks. Our phones are full 1814 01:41:18,040 --> 01:41:20,439 Speaker 3: of notifications trying to growth hack us into doing things 1815 01:41:20,520 --> 01:41:23,360 Speaker 3: that companies want. Our apps are full of micro transactions. 1816 01:41:23,520 --> 01:41:26,120 Speaker 3: Our websites are slower and harder to use, with endless 1817 01:41:26,160 --> 01:41:28,320 Speaker 3: demands of our emails and our phone numbers, and the 1818 01:41:28,400 --> 01:41:30,559 Speaker 3: need to log back in because they couldn't possibly lose 1819 01:41:30,600 --> 01:41:32,559 Speaker 3: a dollar to someone who dared to consume a Washington 1820 01:41:32,600 --> 01:41:35,040 Speaker 3: Post article. And yes, I'm talking about the Post, which 1821 01:41:35,040 --> 01:41:36,800 Speaker 3: I fucking pay for, despite the fact it logs me 1822 01:41:36,840 --> 01:41:40,080 Speaker 3: out all the time. Our social networks are so algorithmically 1823 01:41:40,160 --> 01:41:41,880 Speaker 3: charged that they barely show us the things we want 1824 01:41:41,920 --> 01:41:44,800 Speaker 3: them to anymore, with executives dedicated to filling our feeds 1825 01:41:44,840 --> 01:41:47,719 Speaker 3: full of AI generated slop, because despite being the customer, 1826 01:41:47,880 --> 01:41:51,080 Speaker 3: we're also the revenue mechanism. Our search engines do less 1827 01:41:51,160 --> 01:41:52,880 Speaker 3: as a means of making us use them more. Our 1828 01:41:52,960 --> 01:41:55,360 Speaker 3: dating apps have become vehicles of private equity to add 1829 01:41:55,360 --> 01:41:57,360 Speaker 3: a toll to falling in love. Our video games are 1830 01:41:57,400 --> 01:41:59,719 Speaker 3: constantly nagging us to give them more money, and despite 1831 01:41:59,800 --> 01:42:02,000 Speaker 3: it costing money and being attached to our account, we 1832 01:42:02,080 --> 01:42:04,679 Speaker 3: don't actually own any of the streaming media we purchase. 1833 01:42:04,960 --> 01:42:08,000 Speaker 3: We're drowning in spam, both in our emails and our phones, 1834 01:42:08,040 --> 01:42:10,320 Speaker 3: and at this point in our lives, we've probably agreed 1835 01:42:10,360 --> 01:42:13,439 Speaker 3: to three million pages of privacy policies allowing companies to 1836 01:42:13,560 --> 01:42:16,920 Speaker 3: use our information as they see fit. We get one 1837 01:42:17,120 --> 01:42:20,200 Speaker 3: value transaction with every company; they get eleven, they get 1838 01:42:20,240 --> 01:42:23,280 Speaker 3: one hundred. We really actually don't know because there's no 1839 01:42:23,400 --> 01:42:26,280 Speaker 3: legislation to tell us what they're fucking doing. And these 1840 01:42:26,360 --> 01:42:30,760 Speaker 3: are the issues that hit everything we do all the time, constantly, unrelentingly. 1841 01:42:31,080 --> 01:42:34,040 Speaker 3: Technology is our lives now.
We wake up, we use 1842 01:42:34,080 --> 01:42:36,559 Speaker 3: our phone, we check our texts, three spam calls, two 1843 01:42:36,600 --> 01:42:38,800 Speaker 3: spam texts. We look at our bank balance, two factor 1844 01:42:38,840 --> 01:42:40,599 Speaker 3: authentication check, we read the news. 1845 01:42:40,640 --> 01:42:41,759 Speaker 4: A quarter of the page is blocked 1846 01:42:41,640 --> 01:42:44,080 Speaker 3: by an advertisement asking for our email that's deliberately built to 1847 01:42:44,120 --> 01:42:46,040 Speaker 3: hide the button to get rid of it. And then 1848 01:42:46,080 --> 01:42:48,440 Speaker 3: we log into Slack and feel a pang of anxiety 1849 01:42:48,520 --> 01:42:51,320 Speaker 3: as fifteen different notifications appear in a way that is 1850 01:42:51,439 --> 01:42:53,720 Speaker 3: really not built for us to find what we need, 1851 01:42:54,080 --> 01:42:59,080 Speaker 3: just to let us know something happened. Modern existence is 1852 01:42:59,240 --> 01:43:03,559 Speaker 3: just engulfed in sludge. The institutions that exist to cut 1853 01:43:03,640 --> 01:43:06,360 Speaker 3: through it seem to bounce between the ignorance of their 1854 01:43:06,439 --> 01:43:10,720 Speaker 3: masters and this misplaced duty to objectivity. Our mechanisms for 1855 01:43:10,880 --> 01:43:13,679 Speaker 3: exploring and enjoying the world are interfered with by powerful 1856 01:43:13,720 --> 01:43:17,479 Speaker 3: forces that are just basically left unchecked. Opening our devices 1857 01:43:17,560 --> 01:43:21,080 Speaker 3: is wilfully subjecting us to attack after attack after attack 1858 01:43:21,160 --> 01:43:23,920 Speaker 3: from applications, websites, and devices that are built to make 1859 01:43:24,000 --> 01:43:26,919 Speaker 3: us do things for them, rather than operate with the dignity 1860 01:43:27,000 --> 01:43:29,559 Speaker 3: and freedom that much of the Internet was actually founded upon. 1861 01:43:30,360 --> 01:43:33,120 Speaker 3: These millions of invisible acts of terror are too often 1862 01:43:33,200 --> 01:43:36,719 Speaker 3: left undiscussed because accepting the truth requires you to accept 1863 01:43:36,960 --> 01:43:39,320 Speaker 3: that most of the tech ecosystem is rotten, and that 1864 01:43:39,439 --> 01:43:42,640 Speaker 3: billions of dollars are made harassing and punishing billions of 1865 01:43:42,680 --> 01:43:45,479 Speaker 3: people every single day of their lives through the devices 1866 01:43:45,520 --> 01:43:47,640 Speaker 3: that we're required to use in order to exist in 1867 01:43:47,680 --> 01:43:51,479 Speaker 3: the modern world. Most users suffer the consequences, and most 1868 01:43:51,520 --> 01:43:53,880 Speaker 3: of the media fails to account for them, and in turn, 1869 01:43:54,160 --> 01:43:56,639 Speaker 3: people walk around knowing something is wrong, but not knowing 1870 01:43:56,680 --> 01:44:00,960 Speaker 3: who to blame until somebody provides a convenient excuse, like immigrants, 1871 01:44:01,760 --> 01:44:05,040 Speaker 3: like the Democrats, like whatever fucking works. Because we can't 1872 01:44:05,120 --> 01:44:08,960 Speaker 3: actually call out the people, the corporations, crushing our existence. 1873 01:44:10,200 --> 01:44:11,840 Speaker 3: Why wouldn't people crave change? 1874 01:44:12,040 --> 01:44:14,800 Speaker 4: Why wouldn't people be angry? Living in the current world 1875 01:44:14,840 --> 01:44:18,760 Speaker 3: absolutely fucking sucks sometimes. It's miserable.
It's bereft of industry 1876 01:44:18,920 --> 01:44:24,920 Speaker 3: and filthy with manipulation. It's undignified, it's disrespectful, and it 1877 01:44:25,040 --> 01:44:27,720 Speaker 3: must be crushed if we want to escape this depressing, 1878 01:44:27,840 --> 01:44:32,120 Speaker 3: goddamn world we've found ourselves in. Our media institutions are 1879 01:44:32,200 --> 01:44:35,439 Speaker 3: fully fucking capable of dealing with these problems, but it 1880 01:44:35,600 --> 01:44:39,840 Speaker 3: starts with actually evaluating them and aggressively interrogating them without 1881 01:44:39,880 --> 01:44:43,519 Speaker 3: fearing accusations of bias that, as I've said repeatedly, happen 1882 01:44:43,720 --> 01:44:46,920 Speaker 3: either way. The truth is that the media is more 1883 01:44:46,920 --> 01:44:49,639 Speaker 3: afraid of accusations of bias than they are of misleading 1884 01:44:49,680 --> 01:44:52,479 Speaker 3: their readers. And while that seems like a slippery slope, 1885 01:44:52,479 --> 01:44:54,519 Speaker 3: and it may very well be one, there must be 1886 01:44:54,640 --> 01:44:56,960 Speaker 3: room to inject the writer's voice back into their work, 1887 01:44:57,320 --> 01:44:59,320 Speaker 3: and a willingness to call out bad actors as such, 1888 01:44:59,520 --> 01:45:01,320 Speaker 3: no matter how rich they are, no matter how 1889 01:45:01,360 --> 01:45:03,720 Speaker 3: big their products are, no matter how willing they are 1890 01:45:03,800 --> 01:45:05,960 Speaker 3: to bark and scream that things are unfair as they 1891 01:45:05,960 --> 01:45:09,800 Speaker 3: accumulate more power and money. We need context in our news. 1892 01:45:10,520 --> 01:45:13,040 Speaker 3: We need it, we need it now. We need opinion, 1893 01:45:13,120 --> 01:45:16,040 Speaker 3: we need voice, we need character, we need life, because 1894 01:45:16,080 --> 01:45:20,240 Speaker 3: as long as we follow this bullshit objectivity path, we're screwed. 1895 01:45:20,880 --> 01:45:23,000 Speaker 3: And if you're in the tech industry and hearing this 1896 01:45:23,160 --> 01:45:23,759 Speaker 3: and saying, 1897 01:45:23,720 --> 01:45:25,960 Speaker 4: oh, the media is too critical of tech, you're 1898 01:45:26,120 --> 01:45:27,759 Speaker 4: fucking wrong, kiss my asshole. 1899 01:45:28,120 --> 01:45:30,599 Speaker 3: Everything we're seeing happening right now is a direct result 1900 01:45:30,720 --> 01:45:32,960 Speaker 3: of a society that let technology and the ultra rich 1901 01:45:33,040 --> 01:45:35,840 Speaker 3: run rampant, free of both the governmental guardrails that might 1902 01:45:35,840 --> 01:45:38,080 Speaker 3: have stopped them and the media ecosystem that might have 1903 01:45:38,120 --> 01:45:41,800 Speaker 3: actually held them in check. Our default position in interrogating 1904 01:45:41,840 --> 01:45:44,360 Speaker 3: the intentions and actions of the tech industry has become 1905 01:45:44,400 --> 01:45:47,040 Speaker 3: that they will work it out, as they continually redefine 1906 01:45:47,040 --> 01:45:49,160 Speaker 3: what work it out means and turn it into make 1907 01:45:49,240 --> 01:45:53,920 Speaker 3: their products worse but more profitable.
Covering Meta, Twitter, Google, 1908 01:45:54,040 --> 01:45:56,360 Speaker 3: OpenAI, and other huge tech companies as if the 1909 01:45:56,400 --> 01:45:59,600 Speaker 3: products they make are remarkable and perfect is disrespectful to 1910 01:45:59,640 --> 01:46:03,840 Speaker 3: the reader's intelligence and a disgusting abdication of responsibility, as 1911 01:46:03,920 --> 01:46:08,080 Speaker 3: their products, even when they're functional, are significantly worse, more annoying, 1912 01:46:08,160 --> 01:46:11,320 Speaker 3: more frustrating, and more convoluted than ever. And that's before 1913 01:46:11,360 --> 01:46:13,400 Speaker 3: you get to the ones like Facebook and Instagram that 1914 01:46:13,439 --> 01:46:17,160 Speaker 3: are outright broken. I don't give a shit if 1915 01:46:17,240 --> 01:46:19,240 Speaker 3: these people have raised a lot of money unless you 1916 01:46:19,320 --> 01:46:21,560 Speaker 3: use that as proof that something is fundamentally wrong with 1917 01:46:21,640 --> 01:46:24,920 Speaker 3: the tech industry. Meta making billions of dollars of profit 1918 01:46:25,000 --> 01:46:27,240 Speaker 3: is a sign that something is wrong with society, not 1919 01:46:27,400 --> 01:46:29,679 Speaker 3: proof that it's a good company or anything that should 1920 01:46:29,680 --> 01:46:32,800 Speaker 3: grant Mark Zuckerberg any kind of special treatment. Shove your 1921 01:46:32,880 --> 01:46:36,120 Speaker 3: chains up your ass, Mark. OpenAI being worth one 1922 01:46:36,200 --> 01:46:38,400 Speaker 3: hundred and fifty seven billion dollars for a company that 1923 01:46:38,439 --> 01:46:40,840 Speaker 3: burns five billion or more a year to make a 1924 01:46:40,840 --> 01:46:42,960 Speaker 3: product that destroys our environment, for a product yet to 1925 01:46:43,000 --> 01:46:45,200 Speaker 3: find any real meaning, isn't a sign that it should 1926 01:46:45,200 --> 01:46:48,560 Speaker 3: get more coverage or be taken more seriously. No, it 1927 01:46:48,600 --> 01:46:51,200 Speaker 3: should be a sign that something is broken, that something 1928 01:46:51,320 --> 01:46:55,160 Speaker 3: is wrong with society. Whatever you may feel about ChatGPT, 1929 01:46:55,320 --> 01:46:58,000 Speaker 3: the coverage it received is outsized compared to its actual 1930 01:46:58,120 --> 01:47:00,360 Speaker 3: utility and the things built on top of it. And 1931 01:47:00,439 --> 01:47:02,599 Speaker 3: that's a direct result of a media industry that seems 1932 01:47:02,640 --> 01:47:05,960 Speaker 3: incapable of holding the powerful accountable or actually learning about 1933 01:47:05,960 --> 01:47:09,720 Speaker 3: the subject matter in question. It's time to accept that 1934 01:47:09,880 --> 01:47:12,880 Speaker 3: most people's digital life fucking sucks, as does the way 1935 01:47:12,960 --> 01:47:16,680 Speaker 3: we consume our information, and that there are people directly responsible.
1936 01:47:17,680 --> 01:47:20,000 Speaker 3: Be as angry as you want at Jeff Bezos, whose 1937 01:47:20,040 --> 01:47:23,320 Speaker 3: wealth and the inherent cruelty of Amazon's labor practices make 1938 01:47:23,360 --> 01:47:26,360 Speaker 3: him an obvious target, but please don't forget Mark Zuckerberg, 1939 01:47:26,439 --> 01:47:29,800 Speaker 3: Elon Musk, Sundar Pichai, Tim Cook, and every single other 1940 01:47:29,880 --> 01:47:33,200 Speaker 3: tech executive that has allowed our digital experiences to become 1941 01:47:33,320 --> 01:47:38,480 Speaker 3: fucked up through algorithms that we know nothing about. Similarly, 1942 01:47:38,800 --> 01:47:42,040 Speaker 3: governments have entirely failed to push through any legislation that 1943 01:47:42,160 --> 01:47:44,519 Speaker 3: might stop the rot, both in terms of the dominance and 1944 01:47:44,600 --> 01:47:47,240 Speaker 3: opaqueness of algorithmic manipulation and the ways in which 1945 01:47:47,280 --> 01:47:51,280 Speaker 3: tech products exist with few real quality standards. We may have, 1946 01:47:51,439 --> 01:47:53,840 Speaker 3: at least for now, consumer standards for the majority of 1947 01:47:53,920 --> 01:47:57,000 Speaker 3: consumer goods, but software is left effectively untouched, which is 1948 01:47:57,040 --> 01:47:59,120 Speaker 3: why so much of our digital lives are such unfettered 1949 01:47:59,160 --> 01:47:59,639 Speaker 4: dog shit. 1950 01:48:00,600 --> 01:48:02,360 Speaker 3: And if you're hearing this and saying I'm being a 1951 01:48:02,400 --> 01:48:04,360 Speaker 3: hater or a pessimist, shut the fuck up, 1952 01:48:04,360 --> 01:48:06,680 Speaker 3: I'm tired of you. I'm so fucking tired of being 1953 01:48:06,720 --> 01:48:09,160 Speaker 3: told to calm down about this as we stare down 1954 01:48:09,439 --> 01:48:12,240 Speaker 3: the barrel of four years of authoritarianism built on top 1955 01:48:12,280 --> 01:48:14,519 Speaker 3: of the decay of our lives, both physical and digital, 1956 01:48:14,720 --> 01:48:16,840 Speaker 3: with a media ecosystem that doesn't do a great job 1957 01:48:16,880 --> 01:48:19,800 Speaker 3: explaining what's being done to the people in an ideologically 1958 01:48:19,880 --> 01:48:24,800 Speaker 3: consistent way. There's this extremely common assumption in the tech media, 1959 01:48:24,840 --> 01:48:27,719 Speaker 3: based on what, I'm really not sure, that these companies 1960 01:48:27,720 --> 01:48:29,320 Speaker 3: are all doing a good job, and that good job 1961 01:48:29,400 --> 01:48:31,400 Speaker 3: means having lots of users and making lots of money, 1962 01:48:31,600 --> 01:48:35,240 Speaker 3: and it drives tons of editorial decision making. If three 1963 01:48:35,360 --> 01:48:38,320 Speaker 3: quarters of the biggest car manufacturers were making record profits 1964 01:48:38,360 --> 01:48:39,920 Speaker 3: by making half of their cars with brakes that 1965 01:48:40,040 --> 01:48:44,320 Speaker 3: sometimes didn't work, that'd be international news. Government inquiries would 1966 01:48:44,320 --> 01:48:46,240 Speaker 3: happen, people would go to prison. And this isn't 1967 01:48:46,240 --> 01:48:50,040 Speaker 3: even conjecture. It actually happened after Volkswagen was caught deliberately 1968 01:48:50,080 --> 01:48:53,439 Speaker 3: programming its engines to only meet emission standards during laboratory testing.
1969 01:48:53,920 --> 01:48:56,960 Speaker 3: They were left to spew excessive pollution into the real world, 1970 01:48:57,479 --> 01:49:00,920 Speaker 3: but once lawmakers found out, they responded with civil and 1971 01:49:00,960 --> 01:49:04,080 Speaker 3: criminal action. The executives and engineers responsible were indicted, one 1972 01:49:04,280 --> 01:49:07,120 Speaker 3: received seven years in jail, and their former CEO is 1973 01:49:07,160 --> 01:49:09,439 Speaker 3: currently being tried in Germany and has been indicted in the 1974 01:49:09,560 --> 01:49:13,080 Speaker 3: US too. And here we are in the tech industry. 1975 01:49:13,640 --> 01:49:18,519 Speaker 3: Facebook barely works, is used to incite genocides and bully people and 1976 01:49:18,840 --> 01:49:21,920 Speaker 3: harass teen girls. Pedophiles run rampant on there. There was 1977 01:49:21,920 --> 01:49:23,840 Speaker 3: a Wall Street Journal story about it. 1978 01:49:24,400 --> 01:49:24,920 Speaker 4: They're fine. 1979 01:49:25,479 --> 01:49:28,320 Speaker 3: So much of the tech industry's consumer software, like Google 1980 01:49:28,360 --> 01:49:31,360 Speaker 3: or Facebook, Twitter, and even ChatGPT, and business software 1981 01:49:31,360 --> 01:49:32,880 Speaker 3: from companies like Microsoft and Slack. 1982 01:49:33,520 --> 01:49:37,200 Speaker 4: It sucks. It sucks. It's bad. You use it every day. 1983 01:49:37,240 --> 01:49:39,920 Speaker 3: You've been listening to me ramble for fifty episodes now, 1984 01:49:40,000 --> 01:49:43,040 Speaker 3: you know what I'm talking about. It's everywhere, yet the 1985 01:49:43,160 --> 01:49:45,680 Speaker 3: media covers it just like, eh, you know, it's just 1986 01:49:45,760 --> 01:49:49,320 Speaker 3: how things are now, mate. Meta, by the admission of 1987 01:49:49,360 --> 01:49:52,320 Speaker 3: its own internal documents, makes products that are ruinous to 1988 01:49:52,360 --> 01:49:54,840 Speaker 3: the mental health of teenage girls, and it hasn't made 1989 01:49:54,880 --> 01:49:57,639 Speaker 3: any substantial changes as a result, nor has it received 1990 01:49:57,680 --> 01:50:01,000 Speaker 3: any significant pushback for failing to do so. Little bit 1991 01:50:01,040 --> 01:50:03,200 Speaker 3: of a side note here, big shout out to Jeff 1992 01:50:03,240 --> 01:50:04,800 Speaker 3: Horwitz and the rest of the Wall Street Journal people 1993 01:50:04,840 --> 01:50:07,200 Speaker 3: who did the Facebook Files. There are legacy media 1994 01:50:07,240 --> 01:50:11,400 Speaker 3: people doing a good job on this. Nevertheless, Meta exercises 1995 01:50:11,439 --> 01:50:15,080 Speaker 3: this reckless disregard for public safety, kind of like the 1996 01:50:15,200 --> 01:50:17,559 Speaker 3: auto industry in the sixties, and that was when Ralph 1997 01:50:17,640 --> 01:50:20,160 Speaker 3: Nader wrote Unsafe at Any Speed, and his book 1998 01:50:20,280 --> 01:50:22,479 Speaker 3: actually brought about change. It led to the Department of 1999 01:50:22,520 --> 01:50:25,599 Speaker 3: Transportation and the passage of seatbelt laws in forty nine states, 2000 01:50:25,600 --> 01:50:27,320 Speaker 3: and a bunch of other things that can get overlooked.
2001 01:50:28,080 --> 01:50:30,960 Speaker 3: But the tech industry is somehow inoculated against any kind 2002 01:50:31,000 --> 01:50:33,400 Speaker 3: of public pressure or shame because it operates in this 2003 01:50:33,560 --> 01:50:35,800 Speaker 3: completely different world with this different rule book and 2004 01:50:35,840 --> 01:50:39,080 Speaker 3: different criteria for success, as well as this completely different 2005 01:50:39,120 --> 01:50:56,480 Speaker 3: set of expectations. By allowing the market to become disconnected 2006 01:50:56,520 --> 01:50:58,720 Speaker 3: from the value it creates, we enable companies like, I 2007 01:50:58,760 --> 01:51:01,800 Speaker 3: don't know, Nvidia, that reduce the quality of their 2008 01:51:01,840 --> 01:51:04,320 Speaker 3: services so they make more money, like their GeForce Now service, 2009 01:51:04,439 --> 01:51:07,920 Speaker 3: or Facebook, they can just destroy our political discourse so 2010 01:51:08,000 --> 01:51:10,960 Speaker 3: they can facilitate genocide in Myanmar, and then, well, they 2011 01:51:11,040 --> 01:51:13,640 Speaker 3: get headlines about how good a CEO Mark Zuckerberg is 2012 01:51:13,720 --> 01:51:16,599 Speaker 3: and how cool his chains are, and how everything's 2013 01:51:16,720 --> 01:51:18,639 Speaker 3: just fine with Facebook and they're making more money. 2014 01:51:18,800 --> 01:51:18,840 Speaker 1: No. 2015 01:51:19,120 --> 01:51:22,040 Speaker 4: No, I actually want to take a step back, though. 2016 01:51:22,040 --> 01:51:23,479 Speaker 4: I want to take a little bit of a step back. 2017 01:51:23,560 --> 01:51:28,400 Speaker 3: I previously mentioned, I said it twice now, oh, Meta 2018 01:51:28,640 --> 01:51:32,799 Speaker 3: enables genocide and it destroys our political discourse. 2019 01:51:32,840 --> 01:51:35,320 Speaker 3: I want to be clear when I say that everything 2020 01:51:35,439 --> 01:51:40,120 Speaker 3: is justified at Meta, I'm actually quoting their chief technology officer. 2021 01:51:40,760 --> 01:51:44,240 Speaker 3: That's quite literally what Andrew Bosworth said in an internal 2022 01:51:44,320 --> 01:51:46,800 Speaker 3: memo from twenty sixteen where he said, and I 2023 01:51:46,960 --> 01:51:52,479 Speaker 3: quote, ahem, all the work Facebook does in growth is justified, 2024 01:51:52,880 --> 01:51:56,400 Speaker 3: even if that includes, and I'm quoting him directly, somebody 2025 01:51:56,520 --> 01:52:01,559 Speaker 3: dying in a terrorist attack coordinated using Facebook's tools. Now, 2026 01:52:01,640 --> 01:52:04,200 Speaker 3: the mere mention of violent crime is enough to create 2027 01:52:04,520 --> 01:52:07,280 Speaker 3: reams of articles questioning whether society is safe and whether 2028 01:52:07,320 --> 01:52:10,080 Speaker 3: we need more plastic in our Walgreens. Yeah, our digital 2029 01:52:10,160 --> 01:52:13,920 Speaker 3: lives are this wasteland that people still discuss like a utopia. 2030 01:52:14,520 --> 01:52:17,280 Speaker 3: Seriously, putting aside the social networks, have you visited a 2031 01:52:17,360 --> 01:52:20,160 Speaker 3: website on the phone recently? Have you tried to use 2032 01:52:20,240 --> 01:52:22,719 Speaker 3: a new app? Have you tried to buy something online 2033 01:52:22,840 --> 01:52:26,360 Speaker 3: starting with a Google search? Within those experiences, has 2034 01:52:26,439 --> 01:52:29,080 Speaker 3: anything gone wrong?
You know it has, I know it has, 2035 01:52:29,800 --> 01:52:30,519 Speaker 3: you know it has. 2036 01:52:31,520 --> 01:52:35,639 Speaker 4: It's time to wake up. We, the users of products, 2037 01:52:35,920 --> 01:52:38,200 Speaker 3: we're at war with the products we're using and the 2038 01:52:38,280 --> 01:52:41,080 Speaker 3: people that make them, and right now we are losing. 2039 01:52:41,840 --> 01:52:45,000 Speaker 3: The media must realign to fight for how things should be. 2040 01:52:45,560 --> 01:52:48,200 Speaker 3: This doesn't mean that they can't cover things positively, or 2041 01:52:48,240 --> 01:52:50,360 Speaker 3: give credit where credit is due, or be willing to 2042 01:52:50,439 --> 01:52:54,000 Speaker 3: accept that something could be something cool. But what has to 2043 01:52:54,160 --> 01:52:56,880 Speaker 3: change is the evaluation of the products themselves, which have 2044 01:52:57,000 --> 01:52:58,720 Speaker 3: been allowed to decay to a level that has become 2045 01:52:58,760 --> 01:53:02,760 Speaker 3: at best annoying and at worst actively harmful for society. Our 2046 01:53:02,840 --> 01:53:06,120 Speaker 3: networks are rotten, our information ecosystem is poisoned, with its 2047 01:53:06,200 --> 01:53:10,160 Speaker 3: purest parts ideologically and strategically concussed. Our means of speaking 2048 01:53:10,240 --> 01:53:12,439 Speaker 3: to those that we love and making new connections are 2049 01:53:12,479 --> 01:53:15,600 Speaker 3: so constantly interfered with that personal choice and dignity are 2050 01:53:15,560 --> 01:53:19,360 Speaker 4: all but removed. But there is hope, there really is. 2051 01:53:19,760 --> 01:53:21,960 Speaker 3: Those covering the tech industry right now have one of 2052 01:53:22,000 --> 01:53:24,800 Speaker 3: the most consequential jobs in journalism if they choose to 2053 01:53:24,880 --> 01:53:28,920 Speaker 3: fucking do it. Those willing to guide people through the wasteland, 2054 01:53:29,320 --> 01:53:31,920 Speaker 3: those willing to discuss what needs to change, how bad 2055 01:53:32,000 --> 01:53:35,040 Speaker 3: things have gone, and hold the powerful accountable and say 2056 01:53:35,080 --> 01:53:37,519 Speaker 3: what good might look like have the opportunity to push 2057 01:53:37,560 --> 01:53:39,720 Speaker 3: for a better future by spitting in the faces of 2058 01:53:39,800 --> 01:53:42,439 Speaker 3: those ruining it. I don't know where I sit, by 2059 01:53:42,479 --> 01:53:44,760 Speaker 3: the way, I don't know what to call myself. Am 2060 01:53:44,800 --> 01:53:47,360 Speaker 3: I legacy media? I got my start writing in print magazines. 2061 01:53:47,600 --> 01:53:49,520 Speaker 3: Am I an independent contractor? 2062 01:53:49,600 --> 01:53:51,479 Speaker 4: Am I an influencer? Am I content? 2063 01:53:51,760 --> 01:53:53,800 Speaker 3: I truly don't know, and I don't really care. 2064 01:53:53,880 --> 01:53:55,920 Speaker 3: But all that I know is that I feel like 2065 01:53:56,040 --> 01:53:58,360 Speaker 3: I'm at war too, and that we, if I can 2066 01:53:58,439 --> 01:54:00,920 Speaker 3: be considered part of the media, are at war with 2067 01:54:01,040 --> 01:54:03,160 Speaker 3: people that have changed the terms of innovation so that 2068 01:54:03,200 --> 01:54:07,200 Speaker 3: it's synonymous with value extraction. Technology is how I became 2069 01:54:07,240 --> 01:54:09,880 Speaker 3: a person, how I met my closest friends and loved ones.
2070 01:54:10,000 --> 01:54:11,720 Speaker 3: And without it, I wouldn't be able to write, I 2071 01:54:11,720 --> 01:54:12,400 Speaker 3: wouldn't be able to 2072 01:54:12,400 --> 01:54:14,360 Speaker 4: record this podcast. I wouldn't have got this podcast. 2073 01:54:15,240 --> 01:54:18,800 Speaker 3: And I feel this poison flowing through my veins as 2074 01:54:18,840 --> 01:54:21,000 Speaker 3: I see what these motherfuckers have done and what they're 2075 01:54:21,000 --> 01:54:25,519 Speaker 3: continuing to do, and I see how inconsistently and tepidly 2076 01:54:25,600 --> 01:54:30,160 Speaker 3: they're interrogated. Now is the time to talk bluntly about 2077 01:54:30,200 --> 01:54:34,440 Speaker 3: what is happening. The declining quality of tech products, the 2078 01:54:34,600 --> 01:54:38,440 Speaker 3: scourge of growth hacking, the cancerous growth-at-all-costs mindset. 2079 01:54:38,640 --> 01:54:40,440 Speaker 3: These are all the things that need to be raised 2080 01:54:40,480 --> 01:54:44,120 Speaker 3: in every single piece, and judgments must be unrelenting. 2081 01:54:44,800 --> 01:54:45,760 Speaker 4: The companies will 2082 01:54:45,640 --> 01:54:49,400 Speaker 3: squeal, ooh, that they're being so unfairly treated by the 2083 01:54:49,520 --> 01:54:54,080 Speaker 3: biased legacy media. Oh, oh, save me. Hey, Nilay Patel, your 2084 01:54:54,200 --> 01:54:56,800 Speaker 3: interview with Sundar Pichai, this is how you sounded when 2085 01:54:56,840 --> 01:54:59,520 Speaker 3: you handed him your phone. It was pathetic. They should 2086 01:54:59,560 --> 01:55:03,240 Speaker 3: be scared of you, Nilay. The powerful should be scared of 2087 01:55:03,320 --> 01:55:06,080 Speaker 3: the media. They shouldn't be sitting there sending letters to 2088 01:55:06,120 --> 01:55:10,320 Speaker 3: the editor like fucking customer support. No, they should see 2089 01:55:10,400 --> 01:55:12,960 Speaker 3: this podcast, they should see these newsletters. They should 2090 01:55:13,000 --> 01:55:15,800 Speaker 3: see everything published by the tech media and go uh oh. 2091 01:55:17,000 --> 01:55:18,800 Speaker 4: And there can be good people. There can be better 2092 01:55:18,800 --> 01:55:20,160 Speaker 4: boys and girls than others. 2093 01:55:20,720 --> 01:55:23,040 Speaker 3: There can be plenty of people that make good products 2094 01:55:23,080 --> 01:55:25,760 Speaker 3: and get great press for it. But do you really 2095 01:55:25,920 --> 01:55:30,160 Speaker 3: think Meta, Google, Apple to an extent, frankly, do you 2096 01:55:30,240 --> 01:55:32,280 Speaker 3: think Amazon looks good right now? Do you think it's 2097 01:55:32,320 --> 01:55:34,280 Speaker 3: easy to find stuff? Or do you think it's slop 2098 01:55:34,360 --> 01:55:37,360 Speaker 3: full of more slop? Mark Zuckerberg said on an earnings 2099 01:55:37,440 --> 01:55:40,080 Speaker 3: call the other day that he intends there to be 2100 01:55:40,160 --> 01:55:46,040 Speaker 3: an AI-specific slop feed. These are harmful things. 2101 01:55:46,080 --> 01:55:51,520 Speaker 3: This is pouring vats of oil into rivers and then 2102 01:55:51,680 --> 01:55:55,440 Speaker 3: getting told you're the best boy in town. These companies, 2103 01:55:55,440 --> 01:55:58,000 Speaker 3: they're poisoning the digital world and they must be held 2104 01:55:58,040 --> 01:55:59,640 Speaker 3: accountable for the damage they're causing. 2105 01:56:00,200 --> 01:56:01,320 Speaker 4: Readers are already aware.
2106 01:56:02,000 --> 01:56:05,000 Speaker 3: But, ah, and this is really thanks to members of 2107 01:56:05,000 --> 01:56:08,040 Speaker 3: the media, by the way, they're gaslighting themselves into believing that, oh, 2108 01:56:08,440 --> 01:56:10,960 Speaker 3: I just don't keep up with technology, it 2109 01:56:11,080 --> 01:56:13,400 Speaker 3: is getting away from me, I'm not technical enough to 2110 01:56:13,520 --> 01:56:15,720 Speaker 3: use this. When the thing that they don't get, that 2111 01:56:15,760 --> 01:56:17,879 Speaker 3: the average person doesn't get, is that the tech industry 2112 01:56:17,920 --> 01:56:22,040 Speaker 3: has built legions of obfuscations, legions of legal tricks, and 2113 01:56:22,120 --> 01:56:26,280 Speaker 3: these horrible little user interface traps specifically made to trick 2114 01:56:26,360 --> 01:56:29,160 Speaker 3: you into doing things, to make the experience kind of 2115 01:56:29,240 --> 01:56:32,560 Speaker 3: subordinate to getting the money off of you. And I 2116 01:56:32,680 --> 01:56:35,480 Speaker 3: think that this is one of the biggest issues in society. 2117 01:56:35,560 --> 01:56:37,839 Speaker 3: And yes, of course I'm biased. I'm doing a podcast 2118 01:56:37,880 --> 01:56:41,480 Speaker 3: about tech, but for real, though, billions of people use smartphones, 2119 01:56:41,600 --> 01:56:43,920 Speaker 3: billions of people are on the computer every day. It's 2120 01:56:43,960 --> 01:56:48,480 Speaker 3: how we do everything. And it stinks. It stinks so bad. 2121 01:56:48,920 --> 01:56:51,480 Speaker 3: This is the rot economy. We're in the rot society. 2122 01:56:52,360 --> 01:56:55,640 Speaker 3: But things can change, and for them to change, it 2123 01:56:55,720 --> 01:56:58,360 Speaker 3: has to start with the information sources, and that starts 2124 01:56:58,480 --> 01:57:02,280 Speaker 3: with journalism. Work has already begun and will continue, but 2125 01:57:02,400 --> 01:57:04,360 Speaker 3: it must scale up and it must do so quickly. 2126 01:57:05,040 --> 01:57:08,160 Speaker 3: And you, the user, have the power. Learn to read 2127 01:57:08,160 --> 01:57:10,000 Speaker 3: a privacy policy, and the link there is to the 2128 01:57:10,080 --> 01:57:12,440 Speaker 3: Washington Post. Yes, there are plenty of great reporters there. 2129 01:57:12,800 --> 01:57:15,440 Speaker 3: Fuck Bezos. You can move to Signal, which is an 2130 01:57:15,480 --> 01:57:18,520 Speaker 3: encrypted messaging app that works on just about everything. Get 2131 01:57:18,560 --> 01:57:20,560 Speaker 3: a service like DeleteMe, and by the way, I 2132 01:57:20,680 --> 01:57:22,360 Speaker 3: pay for it, I have for four years. 2133 01:57:22,560 --> 01:57:24,680 Speaker 4: I have no financial relationship with them. 2134 01:57:24,920 --> 01:57:27,040 Speaker 3: But they're great for removing you from data brokers. 2135 01:57:27,880 --> 01:57:29,200 Speaker 4: Molly White, who's a dear 2136 01:57:29,040 --> 01:57:30,960 Speaker 3: friend of mine and an even better writer, who you might remember 2137 01:57:31,200 --> 01:57:33,600 Speaker 3: from one of the early episodes about Wikipedia.
She's also 2138 01:57:33,680 --> 01:57:36,000 Speaker 3: written this extremely long guide about what to do next 2139 01:57:36,040 --> 01:57:37,680 Speaker 3: that I've linked to in the notes, and it runs 2140 01:57:37,720 --> 01:57:40,200 Speaker 3: through a ton of great things you can do: unionization, 2141 01:57:40,400 --> 01:57:43,480 Speaker 3: finding your communities, dropping apps that collect and store sensitive data, 2142 01:57:43,520 --> 01:57:46,240 Speaker 3: and so on. I also heartily recommend Wired's guide to 2143 01:57:46,280 --> 01:57:48,640 Speaker 3: protecting yourself from government surveillance, which is linked in the 2144 01:57:48,680 --> 01:57:54,520 Speaker 3: show notes. Now, before we go, I want to leave 2145 01:57:54,520 --> 01:57:56,680 Speaker 3: you with something that I posted on November sixth on 2146 01:57:56,800 --> 01:57:57,280 Speaker 3: the Better 2147 01:57:57,120 --> 01:57:57,880 Speaker 4: Offline subreddit. 2148 01:57:59,800 --> 01:58:01,640 Speaker 3: The last twenty four hours have felt bleak and will 2149 01:58:01,720 --> 01:58:03,840 Speaker 3: likely feel more bleak as the months and years go on. 2150 01:58:04,160 --> 01:58:05,920 Speaker 3: It'll be easy to give in to doom, to assume 2151 01:58:05,960 --> 01:58:07,840 Speaker 3: the fight is lost, to assume that the bad guys 2152 01:58:07,880 --> 01:58:10,280 Speaker 3: have permanently won and there will never be any justice 2153 01:58:10,320 --> 01:58:14,080 Speaker 3: or joy again. Now's the time for solidarity, to crystallize 2154 01:58:14,120 --> 01:58:16,400 Speaker 3: around ideas that matter, even if their position in 2155 01:58:16,480 --> 01:58:19,200 Speaker 3: society is delayed. Even as the clouds darken and the 2156 01:58:19,280 --> 01:58:22,840 Speaker 3: storms brew and the darkness feels all encompassing and suffocating, 2157 01:58:23,600 --> 01:58:26,000 Speaker 3: reach out to those you love and don't just commiserate. 2158 01:58:26,200 --> 01:58:29,040 Speaker 3: Plan. It doesn't have to be political, it doesn't even 2159 01:58:29,160 --> 01:58:31,879 Speaker 3: really have to matter, but put shit on your fucking calendar. 2160 01:58:32,000 --> 01:58:34,440 Speaker 3: Keep yourself active and busy and if not distracted, at 2161 01:58:34,600 --> 01:58:36,160 Speaker 3: the very least animated. 2162 01:58:36,760 --> 01:58:38,160 Speaker 4: Darkness feeds on idleness. 2163 01:58:38,360 --> 01:58:40,760 Speaker 3: Darkness feasts on a sense of failure and a sense 2164 01:58:40,800 --> 01:58:44,760 Speaker 3: of inability to make change. You don't know me very well, 2165 01:58:44,920 --> 01:58:47,280 Speaker 3: but know that I'm aware of the darkness and the 2166 01:58:47,400 --> 01:58:50,720 Speaker 3: sadness and the suffocation of when things feel overwhelming. Give 2167 01:58:50,760 --> 01:58:53,160 Speaker 3: yourself some mercy in the days to come. Don't 2168 01:58:53,240 --> 01:58:57,240 Speaker 3: castigate yourself for feeling gutted. Then keep going. I realize 2169 01:58:57,280 --> 01:58:59,160 Speaker 3: it's little solace to think, well, if I keep saying 2170 01:58:59,160 --> 01:59:01,440 Speaker 3: stuff out loud, things will get better. But I promise 2171 01:59:01,520 --> 01:59:04,840 Speaker 3: you doing so has an effect and actually matters.
Keep 2172 01:59:04,880 --> 01:59:07,400 Speaker 3: talking about how fucked up things are. Make sure it's 2173 01:59:07,400 --> 01:59:09,560 Speaker 3: written down, make sure it's spoken cleanly, and with the 2174 01:59:09,680 --> 01:59:12,800 Speaker 3: rage and fire and piss and vinegar it deserves. Things 2175 01:59:12,840 --> 01:59:14,720 Speaker 3: will change for the better, even if it takes more 2176 01:59:14,760 --> 01:59:15,680 Speaker 3: time than it should. 2177 01:59:17,880 --> 01:59:18,120 Speaker 4: Look. 2178 01:59:19,160 --> 01:59:23,120 Speaker 3: I know I'm imperfect, emotional, off-kilter at times. 2179 01:59:23,200 --> 01:59:25,800 Speaker 3: I get emails saying that I'm too angry. 2180 01:59:26,960 --> 01:59:28,160 Speaker 4: I'm sorry if it's ever triggered 2181 01:59:28,160 --> 01:59:31,320 Speaker 3: you, I really do mean that. It's not intentional. I just, 2182 01:59:31,440 --> 01:59:34,919 Speaker 3: I feel this in everything I do. I use technology 2183 01:59:34,960 --> 01:59:36,960 Speaker 3: all the time, and it is extremely annoying. But also 2184 01:59:37,360 --> 01:59:39,919 Speaker 3: I'm aware that I have privilege, and the more privilege 2185 01:59:39,920 --> 01:59:41,800 Speaker 3: you have within tech, the more you're able to escape 2186 01:59:41,800 --> 01:59:44,400 Speaker 3: the little things. Go and buy a cheap laptop today. 2187 01:59:44,520 --> 01:59:47,400 Speaker 3: Try and see what a two hundred or three hundred 2188 01:59:47,120 --> 01:59:47,840 Speaker 4: dollar laptop is. 2189 01:59:47,840 --> 01:59:50,040 Speaker 3: It's slow, it's full of eighteen pop-ups trying to 2190 01:59:50,040 --> 01:59:52,080 Speaker 3: sell you access to cloud storage, to shit that you'll 2191 01:59:52,120 --> 01:59:56,680 Speaker 3: never use, tricking grannies and people who can't afford laptops, 2192 01:59:56,680 --> 01:59:59,760 Speaker 3: people that just don't know. When I see this stuff, 2193 02:00:00,000 --> 02:00:02,320 Speaker 3: it enrages me, not just for me, but because I 2194 02:00:02,440 --> 02:00:04,920 Speaker 3: know that I'm at least lucky enough to know how 2195 02:00:04,920 --> 02:00:07,800 Speaker 3: to get around this shit. I've spent most of my life online, 2196 02:00:08,000 --> 02:00:09,720 Speaker 3: spent most of my life playing with tech and how 2197 02:00:09,840 --> 02:00:14,080 Speaker 3: it works. And I know I have my tangents and 2198 02:00:14,200 --> 02:00:17,320 Speaker 3: my biases, but I wear them kind of on my heart, 2199 02:00:17,360 --> 02:00:20,280 Speaker 3: on my sleeve. I care about all this stuff in 2200 02:00:20,320 --> 02:00:22,160 Speaker 3: a way that might be a little different to some, 2201 02:00:23,120 --> 02:00:27,360 Speaker 3: and it's because I've watched an industry that really 2202 02:00:27,440 --> 02:00:30,480 Speaker 3: made me as a person, that allowed me to grow 2203 02:00:30,520 --> 02:00:34,320 Speaker 3: as a person, to actually meet people, to not feel 2204 02:00:34,360 --> 02:00:37,080 Speaker 3: as alone. And I imagine some of you feel like 2205 02:00:37,120 --> 02:00:40,680 Speaker 3: this too, and then watching what happens to it every day, 2206 02:00:41,120 --> 02:00:44,160 Speaker 3: watching the people who get so rich off of making 2207 02:00:44,200 --> 02:00:45,800 Speaker 3: it so much worse, and then seeing what happened on 2208 02:00:45,840 --> 02:00:48,840 Speaker 3: November fifth, and you can draw a line from it. 2209 02:00:50,160 --> 02:00:54,839 Speaker 3: People are scared, they're lost.
Their lives are spent digitally, 2210 02:00:55,120 --> 02:00:59,800 Speaker 3: and their digital lives are just endless terrorism, endless harm. 2211 02:01:01,440 --> 02:01:03,600 Speaker 3: Some of you know your way around tech, so you 2212 02:01:03,640 --> 02:01:05,880 Speaker 3: can escape some of it, but it's impossible to escape 2213 02:01:05,920 --> 02:01:09,400 Speaker 3: all of it. Try meeting people these days, you can't. 2214 02:01:09,520 --> 02:01:13,080 Speaker 3: Everything is online, and everything online, everything on your phone 2215 02:01:13,160 --> 02:01:17,000 Speaker 3: is mediated and interfered with. It's an assault on your senses, 2216 02:01:17,040 --> 02:01:21,720 Speaker 3: one deprived of dignity. And I see the people doing 2217 02:01:21,840 --> 02:01:25,480 Speaker 3: this and it fills me full of fucking rage, and 2218 02:01:25,600 --> 02:01:29,080 Speaker 3: it makes me angry for you and for me, for 2219 02:01:29,240 --> 02:01:31,680 Speaker 3: my son growing up, and what will probably be a 2220 02:01:31,800 --> 02:01:34,640 Speaker 3: worse world for my friends and loved ones who are 2221 02:01:34,720 --> 02:01:37,880 Speaker 3: harder to see, harder to speak to, whose lives too 2222 02:01:38,000 --> 02:01:41,440 Speaker 3: are interfered with. And there are the millions and millions 2223 02:01:41,480 --> 02:01:44,960 Speaker 3: of people who have no fucking idea it's happening, that 2224 02:01:45,160 --> 02:01:48,600 Speaker 3: just exist in this swill, in this active digital terrorism, 2225 02:01:49,000 --> 02:01:55,160 Speaker 3: poked and prodded and nagged and notified constantly. And I 2226 02:01:55,880 --> 02:01:58,640 Speaker 3: don't want... Early on in this, I got a message saying, 2227 02:01:58,760 --> 02:02:00,840 Speaker 3: don't tell people to be angry, and I stick by that. 2228 02:02:02,320 --> 02:02:04,760 Speaker 3: But I'm not going to hide that I am. I'm 2229 02:02:04,800 --> 02:02:06,360 Speaker 3: not going to hide the pain I feel. I'm not 2230 02:02:06,400 --> 02:02:08,600 Speaker 3: going to hide the pain I feel seeing this shit happen. 2231 02:02:10,960 --> 02:02:15,120 Speaker 3: And I've watched this thing that I love, technology, I really 2232 02:02:15,280 --> 02:02:17,120 Speaker 3: do love tech, I really do, deeply. 2233 02:02:17,600 --> 02:02:18,200 Speaker 4: I've watched it 2234 02:02:18,560 --> 02:02:23,160 Speaker 3: corrupted and broken, and the people breaking it, they don't 2235 02:02:23,280 --> 02:02:25,760 Speaker 3: just make billions of dollars. They get articles written, they 2236 02:02:25,800 --> 02:02:29,840 Speaker 3: get interviewed on the news. Mark Zuckerberg, he wears a 2237 02:02:29,960 --> 02:02:32,320 Speaker 3: chain and there's articles about how cool he is. He 2238 02:02:32,400 --> 02:02:34,920 Speaker 3: should be in fucking prison. He should be in a 2239 02:02:35,000 --> 02:02:37,480 Speaker 3: prison on a boat that just circles the world, and 2240 02:02:37,600 --> 02:02:40,080 Speaker 3: he shouldn't have air conditioning or heat depending on how 2241 02:02:40,120 --> 02:02:43,080 Speaker 3: the weather is. And I know that I'm kind of 2242 02:02:43,280 --> 02:02:48,160 Speaker 3: erratic, and again, tons of tangents. But look, the reason 2243 02:02:48,200 --> 02:02:50,520 Speaker 3: I'm like this is because I really care. And I 2244 02:02:50,560 --> 02:02:53,280 Speaker 3: think caring, I think being angry at the things that 2245 02:02:53,360 --> 02:02:57,800 Speaker 3: actually matter and giving context as a result,
I think 2246 02:02:57,840 --> 02:03:01,120 Speaker 3: that's deeply valuable. And I realize I do fly off the 2247 02:03:01,160 --> 02:03:03,000 Speaker 3: handle a lot, but it really 2248 02:03:02,960 --> 02:03:03,560 Speaker 4: is because I care. 2249 02:03:04,120 --> 02:03:06,360 Speaker 3: I care about you, I care about the subject matter. 2250 02:03:07,600 --> 02:03:10,440 Speaker 3: I'm so grateful and so honored that you spend your 2251 02:03:10,440 --> 02:03:12,080 Speaker 3: time listening to me every week, and I hope you'll 2252 02:03:12,080 --> 02:03:22,640 Speaker 3: continue to do so because I'm not going anywhere. Thank 2253 02:03:22,680 --> 02:03:25,360 Speaker 3: you for listening to Better Offline. The editor and composer 2254 02:03:25,440 --> 02:03:28,240 Speaker 3: of the Better Offline theme song is Matt Osowski. You can 2255 02:03:28,320 --> 02:03:30,640 Speaker 3: check out more of his music and audio projects at 2256 02:03:30,720 --> 02:03:32,120 Speaker 3: mattosowski dot com. 2257 02:03:32,520 --> 02:03:35,880 Speaker 4: M A T T O S O W S K I 2258 02:03:36,280 --> 02:03:36,720 Speaker 4: dot com. 2259 02:03:37,480 --> 02:03:39,760 Speaker 3: You can email me at ez at Better Offline dot 2260 02:03:39,800 --> 02:03:42,040 Speaker 3: com or visit Better Offline dot com to find more 2261 02:03:42,080 --> 02:03:45,440 Speaker 3: podcast links and, of course, my newsletter. I also really 2262 02:03:45,480 --> 02:03:47,680 Speaker 3: recommend you go to chat dot Where's Your Ed dot at 2263 02:03:47,800 --> 02:03:50,200 Speaker 3: to visit the Discord, and go to r slash Better 2264 02:03:50,240 --> 02:03:53,400 Speaker 3: Offline to check out our subreddit. Thank you so much 2265 02:03:53,480 --> 02:03:54,240 Speaker 3: for listening. 2266 02:03:54,800 --> 02:03:57,480 Speaker 5: Better Offline is a production of Cool Zone Media. For 2267 02:03:57,640 --> 02:04:00,800 Speaker 5: more from Cool Zone Media, visit our website coolzonemedia 2268 02:04:00,840 --> 02:04:03,680 Speaker 5: dot com, or check us out on the iHeartRadio app, 2269 02:04:03,720 --> 02:04:06,160 Speaker 5: Apple Podcasts, or wherever you get your podcasts.