Speaker 1: Cool Zone Media.

Speaker 2: Welcome to Better Offline. I'm your host, Ed Zitron. I've spent the last fifteen years in tech, both as a reporter and a public relations executive, and over that career I've realized that many see the tech industry as kind of a monolith. It's actually a lot simpler than you'd imagine. And it's my mission with this podcast to cut through the buzzwords and nonsense and explain to you how venture capital and big tech are trying to change the future, for better or for worse. Today I'm joined by Robert Evans of Cool Zone Media to break down a theory I have that is pretty much core to everything I think about and will talk to you about on this podcast.

Speaker 1: Thank you for having me, Ed. I love that we're coming on to talk about this, because the concept you're about to introduce was the thing that made me reach out to you about doing this show, or at least one of the major ones. I think everyone has this feeling that something is fucked up and wrong.
Speaker 1: We can remember this period of optimism about technology and the Internet, not just feeling excited about new shit that we could buy, but feeling excited about the things that technology was going to bring us, the promise of the new digital age. And that all fell away, to be replaced by this looming sense of doom, so quickly that I don't think a lot of us have a good idea of why it happened.

Speaker 2: And you know what, I think that's actually a great place to start, with a very simple question: why does it feel like the tech we use every day is getting worse? Look at your search results. They're full of sponsored content and articles that don't actually answer any questions, or just a big thread of people asking the same question without ever getting an answer. Your friends aren't popping up on Instagram, you constantly get product emails, notifications pop up, spam emails, and people you've maybe spoken to once who want you to buy some doodad.
Speaker 2: But then you look at the news, and seemingly every tech company is making billions of dollars, growing at this alarming, almost unsustainable rate, all while laying off tens of thousands of people a year. And that's because there is a problem at the center of the tech industry, and actually the markets at large. It's a cancerous problem, and it's in the fabric of how capital is deployed in modern business. Public and private investors, along with the markets themselves, have become entirely decoupled from the concept of what good business truly is, focusing on only one goddamn thing, one noxious, shitty metric: growth. In the Rot Economy, products are made worse, customers are abused, labor is disposable, and executives are getting richer every single day. Now, growth in this case is not necessarily about being bigger or better. It's just about more. It's always more. Everything must be generating more money, must be having a higher headcount, even when laying people off.
Speaker 2: It means the company is generating more revenue, getting higher valuations, and gaining more market share. And then it's looking around and saying, you know, that's just not enough. We must have more. Businesses are expected to be, in fact they're rewarded for being, eternal engines of capital. They must continue to create more and more shareholder value, while hopefully at some point providing a service of some kind to a customer. In the public markets, this means that companies like Google, Meta, and Microsoft are all rewarded for having these unfocused, capital-intensive businesses, even as they lay off tens of thousands of people. Over the last few years, hundreds of thousands of workers have been laid off as multiple tech companies have become worth half a trillion, three quarters of a trillion, even a trillion dollars. It's ridiculous.

Speaker 1: Yeah. I think one of the really important things to nail down here is that this isn't a question of profitability. The fact that you tie it to growth is so important, because the issue is not that these companies are not profitable. Profit has nothing to do with these decisions.
Speaker 1: It's about, I mean, EBITDA, which is basically profit over costs and stuff. But it's this question of growth, right? That if you're not returning this kind of algorithmically scaling growth, then your company is seen as a failure by a lot of the people who are responsible for the financial decisions. And so all of these choices that are kind of choking the Internet out and making the stuff we use every day work less well, it's not a question of "the company can't be profitable if we don't do this." It's that we won't have the right number on a sheet to show somebody.

Speaker 2: And Microsoft is actually a really good example. So Microsoft, of course, bought Activision Blizzard for sixty-eight billion dollars recently. Then, even more recently, they downsized their gaming division at the cost of about nineteen hundred jobs, and they became more valuable than Apple in the process.
Speaker 2: In December, shares of Spotify jumped by nearly seven and a half percent on the news that it was slashing seventeen percent of its workforce, adding nearly two point six billion dollars to the company's market cap in the process. None of these companies, of course, were punished for their poor planning, their stagnating products, their mismanagement of human capital, or their general lack of any real innovation. And it's because the number kept going up, and that is all that seems to matter.

Speaker 1: Yeah. When I was a kid, kind of questioning why stuff like layoffs happened, you know, when my dad got laid off from his job in tech and stuff, it was always framed to me as, well, companies aren't charities. They have to do this to turn a profit, otherwise they can't afford to operate. And I think it's so crucial that that is not at all what's happening here.

Speaker 2: Yeah. Microsoft had twenty-two billion dollars in profit their last earnings. These companies are fine.
Speaker 2: Google, when they did mass layoffs last year, in twenty twenty three, yes, they had like ten billion dollars in profit. These companies are in the green and they're fine. There's no reason to lay off these people other than the fact that they can and they don't need them anymore. Or perhaps they just ran the company in this big, shitty, messy way. And nowhere is that more obvious than with Meta.

Speaker 1: Oh god. Yeah.

Speaker 2: So back in October of twenty twenty two, I said that Mark Zuckerberg was going to kill Meta. Perhaps I was a little bit early on that one, but I genuinely think they're on the path to death. My argument is that their user numbers are declining. They will occasionally have a time when they go up by a percent, perhaps, but then they'll drop. But also, the user experience of Facebook and Instagram has just become awful. They spend these insane amounts of money, tens of billions of dollars, on Meta's Reality Labs division, which is where they stick the horrible VR experiences that everybody hates, and the term "metaverse."
Speaker 2: But what was crazy was that Mark Zuckerberg didn't get punished for literally lying about the metaverse. The Street did not mind the fact that he very clearly misled everybody. In the announcement of Meta, he played a video of a bunch of stuff that no one can do, that is technologically impossible, shit jumping through things all around you with no headset. Completely insane. The stock had a little bit of a wobble, but nothing happened. Yet once he fired eleven thousand people and claimed that twenty twenty three would be the year of efficiency, the market loved him. Double-digit increases in the price of Meta shares. Nothing happened to Zuckerberg with the next part either, though, when he lost thirteen point seven billion dollars on the metaverse thing that has never made them any money.

Speaker 1: Yeah, and for some reference, thirteen point seven billion dollars is about the GDP of Rwanda. That's how much. That's a Rwanda's worth of losses. He's lost a Rwanda in his metaverse bet.

Speaker 2: It's actually a bit more as well, because recently...

Speaker 1: It's more than a Rwanda.
Speaker 2: Yes, it's actually multiple. It might even be as many as three Rwandas, because an estimate from last year suggested that he's lost nearly forty-two billion dollars on the metaverse.

Speaker 1: Jesus Christ. So yeah, that's about, that's almost three and a half Macedonias.

Speaker 2: And it doesn't matter to the markets. You'd think with a public company that makes its money, as Meta does, off of social media experiences and selling ads on them, you'd think the markets would care. Instagram, one of their largest products, one of their big money makers, an app for sharing and following your friends' photos and videos, no longer reliably shows you content from those you follow. That's because Meta must simply interrupt your feed with sponsored content, based on an actually impossible-to-define amount of data. It is not obvious how they choose. There are things that have appeared on my Instagram that I've only ever said out loud. It's very strange. And this is because it doesn't matter if the product sucks.
Speaker 2: If Meta is growing by double-digit percentages, if they're making billions of dollars, it doesn't matter, even if they're fucking the user in the process. Another great example for you is Google. It doesn't matter that Google's AI-powered Gmail plugin literally makes up, hallucinates, email conversations. Because they put AI on Gmail, Google's market cap goes up. Everyone loves Sundar Pichai, despite the fact that he gets paid two hundred and eighty million dollars and lays off tens of thousands of people seemingly all the time. And it really doesn't matter. And this is one that really bothers me, because I think we're all kind of stuck with Amazon. The Amazon experience is even worse. You go on there and you type "spoon" or "spatula," and there's eighty-seven different knockoffs with names that sound like someone had a seizure while typing. SKIB spatula, eleven dollars, will deliver in three minutes. But because Amazon is able to show double-digit percentages of growth every year, yay, it's fine. It doesn't matter that the product sucks. No, no, no, no.
Speaker 2: Only a few things really matter when it comes to big tech and their outsized valuations: layoffs, growth, and the promise of growth, no matter how faint or how speculative. The markets lack any long-term thinking, and they lack the analysis that would actually force businesses to become sustainable and make great products. And when I say sustainable, I don't mean anything about the environment. I mean, and this is a crazy concept, Robert, this one's going to really blow you away, that they make more money than they spend and don't have to fire people all the time.

Speaker 1: Wow. Yeah, what a concept in business.

Speaker 2: If the markets actually cared about sustainable companies that could last the test of time without constantly burning capital and firing people, the markets might not have ignored the fact that Meta got a four hundred and ten million dollar fine from the EU under the General Data Protection Regulation, which is the law that governs data use and customer data in Europe.
Speaker 2: And they also would have probably noticed the fact that European users now have to deliberately opt in to share their data. Now, as a little aside, Facebook makes their money off of using cookies and various other means of getting your data and then presenting you with ads based on that. Some of it comes from how you use the platform, but a lot of it comes from this very insidious hidden tracking. What's happened here is that European users will no longer have that by default. They will have to choose whether to help Mark Zuckerberg buy another fifth of Hawaii.

Speaker 1: Hey, just Maui. Just Maui.

Speaker 2: Oh, I'm sorry, terribly sorry. I wouldn't want to hurt Mister Zuckerberg. But what's really bad about this is that, statistically, users do not like opting into these things. Only about twenty-five percent of iOS users have chosen to opt in to app tracking, a feature that was actually added a few versions of iOS ago to your iPhone, your iPad, and so on and so forth. And by the way, this means that basically every app is kind of on notice if it's on an Apple device.
Speaker 2: Now users know when they might be sharing their data, and it's genuinely a threat to a lot of the tech ecosystem that a lot of people don't know about. In fact, Meta's business model is intrinsically linked to the repurposing of customer data into ad targeting. But the markets don't seem to care. And I get it. It might seem unthinkable that Mister Zuckerberg's fancy party could come to an end, but we really need to be clear about something. Meta's core advertising models depend heavily on things that in the next decade will likely become impossible to do legally. They might not even be possible technically, because now Apple is adding App Tracking Transparency, which I've kind of mentioned, with the iOS update, and this means that people are, first of all, aware of when their data is being shared, and they're probably going to tell them to pound sand. There are other changes that are happening as well. Alphabet, which is Google, of course, they're going to be retiring third-party tracking cookies. Meta is kind of at a point where, if anyone was paying attention, they'd be a little bit more worried.
Speaker 2: And that existential threat is why Mark Zuckerberg is so desperate to distract you with dreams of the metaverse. He's so desperate to make you think that Meta will be the thing that builds artificial general intelligence, which is just the buzzword for an AI that has kind of human-level intellect, despite the fact that most generative AI right now just makes shit up when you really look at it. And this is what confuses me about how this company is worth hundreds of billions of dollars. Their other products don't make that much money. And Zuckerberg's last big idea, the one that he changed his goddamn company name for, lost them billions of dollars. Forty-two billion, probably sixty billion next year.

Speaker 1: God. Yeah, Wall Street loves six, seven Macedonias.

Speaker 2: Yeah, multiple Macedonias there. Since I published my original thoughts on the Rot Economy, their share price has more than doubled. It was one hundred and seventy-seven dollars then, and it's four hundred dollars right now. It's despicable.
Speaker 1: Yeah. I think the thing you have to understand, when you look at how a company can be burning money at this rate and have its stock price increase as much as Facebook's has, or Meta's has, I guess we have to call it. I still don't like calling it that. I feel like that's giving him a win.

Speaker 2: Yeah, it's not fair.

Speaker 1: It's not fair. But I think part of what you have to understand is that the people buying the stock, their vested financial interest is in it continuing. This is not all that different from what was happening with NFTs, right? In that there is very much a group of people with a vested interest in that stock price rising well above what can actually be justified by the performance of the business. And the people who are making a fortune on that right now understand that it is a limited timeframe that this game will work in. I don't know that Mark fully does. He may be, to some extent, a little deranged himself at this point.
Speaker 1: But I think there are a lot of people who absolutely do understand, and are heavily invested in Facebook now, trying to get as much of that as they can while it lasts, but they understand that it's not going to last forever.

Speaker 2: And I think there is a big fall-of-Rome-style story coming for tech, and I think the Rot Economy is behind it, because so much of tech has become engineered for growth at all costs. It's a phenomenon that controls everything. And I think the place where it hits the most people, because, you know, Instagram's annoying, Facebook's annoying, but you can kind of get around it, I think the place that hits home the most is Google. So the thing that everyone knows Google for is, of course, Google Search. It used to be a place where, crazy idea, you go and you type in a thing, tip tap tap, search, and then a bunch of results come up that are actually useful. Now it's a labyrinth full of optimized garbage.
Speaker 2: Search engine optimization is an important thing to remember here, which is that companies have worked out how to trick Google into making you see their link rather than Google finding a useful thing. This has partially ruined Google Search, but the other problem is ads, sponsored content, and of course artificial intelligence crap. Charlie Warzel over at The Atlantic argued a couple of years back that Google Search, what many consider an indispensable tool of modern life, is dead or dying. And I agree. Users have to effectively find a way to cheat and con Google into giving them what they want. They add things like, I don't know, you put "PC repair," put the error in, plus "reddit," to get anything approaching a reliable answer. If you go and type a regular tech problem in, as I mentioned earlier, you just get a lot of shit. You just get a bunch of things that will seem right. You'll say, oh, this is how to fix error forty-two on my iPhone. Cool. And then you click through and there's not actually much useful content. It's "Have you tried resetting your iPhone? Is your iPhone broken?"
Speaker 2: "Is something wrong? Have you tried restarting your iPhone?" Useless garbage, but nevertheless pushing traffic towards media outlets that are kind of cretinous.

Speaker 1: Yeah. And I think it's important, both, as you noted, that a lot of the fucking worst shit in tech can be strangled by stuff like what the EU has done, forcing the companies to not opt you in on stuff that's directly against your interests. And likewise, if we could make more moves like that in the United States, you're not just strangling the worst parts of big tech with that. You're strangling a lot of really predatory, toxic media, because all of their money comes from that kind of shit one way or the other. They are an integral part of this, and can be harmed by doing damage to this fundamentally dishonest way of doing business.
334 00:19:03,480 --> 00:19:08,240 Speaker 2: And Google spent decades, they claim, trying to improve the 335 00:19:08,320 --> 00:19:12,439 Speaker 2: quality of organic results, but for over a decade, it 336 00:19:12,480 --> 00:19:14,800 Speaker 2: has been easily gamed by anyone who knows how to 337 00:19:14,840 --> 00:19:18,520 Speaker 2: create an algorithmic headline, a headline that will make the 338 00:19:18,560 --> 00:19:21,720 Speaker 2: Google spider that looks over all the websites say, this 339 00:19:21,880 --> 00:19:25,080 Speaker 2: is the thing, this is the answer. I personally don't 340 00:19:25,200 --> 00:19:28,760 Speaker 2: know whether Google does this deliberately or whether they've simply 341 00:19:28,760 --> 00:19:32,160 Speaker 2: lost track of everything, whether the evil parties are winning. 342 00:19:32,320 --> 00:19:36,399 Speaker 2: But you mention media companies and these predatory ones. The 343 00:19:36,480 --> 00:19:41,000 Speaker 2: problem is it's not just predatory ones. A large chunk 344 00:19:41,040 --> 00:19:44,080 Speaker 2: of modern media is obsessed with affiliate marketing, which is, 345 00:19:44,359 --> 00:19:47,040 Speaker 2: you'll see these things like best apps for 346 00:19:47,160 --> 00:19:49,440 Speaker 2: blah blah, and those would be affiliate links, yeah, which 347 00:19:49,480 --> 00:19:52,040 Speaker 2: give them a little bit of money, 348 00:19:52,119 --> 00:19:55,680 Speaker 2: or best Super Bowl deals, again a little bit of cash, a 349 00:19:55,760 --> 00:19:57,919 Speaker 2: little bit of money every time you buy something through it.
350 00:19:58,680 --> 00:20:01,919 Speaker 2: And it's turned a large chunk of the web, but 351 00:20:02,119 --> 00:20:08,119 Speaker 2: especially Google Search, into this complete slop, and without finding 352 00:20:08,119 --> 00:20:11,000 Speaker 2: a way to negotiate with it, you're offered this kind 353 00:20:11,000 --> 00:20:15,160 Speaker 2: of fragmented buffet of content based on what Google kind 354 00:20:15,160 --> 00:20:17,440 Speaker 2: of thinks you want to see, but it's more based 355 00:20:17,480 --> 00:20:20,000 Speaker 2: on how much money Google is paid and how often 356 00:20:20,040 --> 00:20:24,840 Speaker 2: Google has been tricked. Let's be honest, and I think 357 00:20:24,840 --> 00:20:27,359 Speaker 2: it's fair to argue that Google no longer provides the 358 00:20:27,359 --> 00:20:30,199 Speaker 2: best results to any query. It provides an answer that 359 00:20:30,680 --> 00:20:36,040 Speaker 2: it believes is most beneficial or profitable, which can sometimes 360 00:20:36,040 --> 00:20:38,480 Speaker 2: be the thing you want, but isn't always. 361 00:20:38,840 --> 00:20:41,320 Speaker 1: I think one of the problems is that I don't 362 00:20:41,680 --> 00:20:43,639 Speaker 1: know of a search engine that I would say is 363 00:20:43,720 --> 00:20:45,879 Speaker 1: better than Google in the way that Google used to 364 00:20:45,920 --> 00:20:48,440 Speaker 1: be better than anything else, right, and I'm talking ten 365 00:20:48,480 --> 00:20:51,080 Speaker 1: years ago. I do know there are search engines and 366 00:20:51,119 --> 00:20:53,680 Speaker 1: options that are better than Google at certain things.
I've 367 00:20:53,680 --> 00:20:56,399 Speaker 1: come to learn that, like, if I use Perplexity AI, 368 00:20:57,000 --> 00:20:59,119 Speaker 1: it'll be worse than Google at a bunch of stuff, 369 00:20:59,119 --> 00:21:01,560 Speaker 1: but there are a few specific things it's better at, 370 00:21:01,600 --> 00:21:05,440 Speaker 1: and like that has increasingly become how I use search, right, 371 00:21:05,640 --> 00:21:08,960 Speaker 1: is that I cycle through different options with the same 372 00:21:09,040 --> 00:21:11,400 Speaker 1: question to try to figure out, like, well, who's gonna 373 00:21:11,440 --> 00:21:13,760 Speaker 1: fuck me least on trying to get this information? 374 00:21:14,880 --> 00:21:18,879 Speaker 2: And it sucks as well, because sure, we all know 375 00:21:19,040 --> 00:21:22,879 Speaker 2: that any free service online isn't free. There is some 376 00:21:23,000 --> 00:21:25,600 Speaker 2: way they're making money, and that's fine, you get it, 377 00:21:25,720 --> 00:21:30,480 Speaker 2: services cost money, whatever. But at some point, how much 378 00:21:30,520 --> 00:21:33,920 Speaker 2: profit is too much, and also how can you justify 379 00:21:33,960 --> 00:21:39,359 Speaker 2: this much profit while making things so much worse? And 380 00:21:39,400 --> 00:21:42,000 Speaker 2: I mean, the result is that it just sucks. It sucks. 381 00:21:42,080 --> 00:21:46,440 Speaker 2: Googling sucks now, it's an exercise in pain. It leads 382 00:21:46,440 --> 00:21:49,320 Speaker 2: you to a bunch of content that is so obviously 383 00:21:49,400 --> 00:21:53,240 Speaker 2: engineered to get your clicks rather than actually provide any service 384 00:21:54,240 --> 00:21:59,320 Speaker 2: that the web is slower. This makes the Internet slower; 385 00:21:59,560 --> 00:22:04,080 Speaker 2: it makes the flow of information between people and countries worse.
386 00:22:04,880 --> 00:22:08,960 Speaker 2: And what's worse is Google loves this monopoly and they 387 00:22:09,000 --> 00:22:12,600 Speaker 2: pay a pretty penny to keep it. They pay Apple 388 00:22:12,760 --> 00:22:16,280 Speaker 2: eighteen billion dollars a year. And this came out in 389 00:22:16,320 --> 00:22:21,240 Speaker 2: a recent US versus Google antitrust case. Eighteen billion dollars 390 00:22:21,280 --> 00:22:24,639 Speaker 2: a year to be on your iPhone, to be on 391 00:22:24,760 --> 00:22:28,920 Speaker 2: Apple devices. It's not because they're better, it's not because 392 00:22:29,720 --> 00:22:32,760 Speaker 2: Apple even thinks that they couldn't do better. This is 393 00:22:32,800 --> 00:22:36,879 Speaker 2: specifically to stop Apple trying and to make your lives 394 00:22:36,920 --> 00:22:39,919 Speaker 2: worse so that Google can make their search worse to 395 00:22:39,960 --> 00:22:42,960 Speaker 2: make more money and give Sundar Pichai two hundred and 396 00:22:43,040 --> 00:22:48,320 Speaker 2: eighty million dollars. And it's depressing. It's something that makes 397 00:22:48,359 --> 00:22:51,439 Speaker 2: me... I have to fight the cynicism every time I 398 00:22:51,600 --> 00:22:55,320 Speaker 2: use Google. This is a problem that hits billions of people. 399 00:22:55,960 --> 00:22:59,679 Speaker 2: This is a real thing, and this isn't an attack 400 00:22:59,720 --> 00:23:03,359 Speaker 2: on any journalistic outlet, but why the fuck isn't this the 401 00:23:03,720 --> 00:23:06,840 Speaker 2: horror headline? This is a horrifying thing. Yeah, this would 402 00:23:06,840 --> 00:23:10,359 Speaker 2: be like if certain freeways just randomly led you to 403 00:23:10,400 --> 00:23:13,920 Speaker 2: a different place because someone paid the DOT money.
404 00:23:14,680 --> 00:23:18,160 Speaker 1: It would be like if we had a, I don't 405 00:23:18,160 --> 00:23:20,640 Speaker 1: want to oversell this, but like what we're talking about 406 00:23:20,800 --> 00:23:22,840 Speaker 1: is there was a brief period that, for the first 407 00:23:22,840 --> 00:23:27,679 Speaker 1: time in human existence, all knowledge ever collected and earned 408 00:23:27,720 --> 00:23:32,440 Speaker 1: by human beings was easily accessible to the vast majority 409 00:23:32,720 --> 00:23:36,400 Speaker 1: of people if they had an Internet connection, and that 410 00:23:36,560 --> 00:23:39,919 Speaker 1: is becoming no longer the case with rapidity, and it 411 00:23:39,960 --> 00:23:41,760 Speaker 1: is a problem. It's akin to, like, if 412 00:23:41,760 --> 00:23:44,800 Speaker 1: we had a cure for cancer and then we decided 413 00:23:44,840 --> 00:23:47,320 Speaker 1: to start like breaking it if you didn't buy like 414 00:23:47,359 --> 00:23:52,000 Speaker 1: the pancreatic cancer bonus pack. Like, it 415 00:23:52,080 --> 00:23:54,000 Speaker 1: is that big a problem, like when you think about, 416 00:23:54,080 --> 00:23:56,720 Speaker 1: like, what it means for all human knowledge to be 417 00:23:56,800 --> 00:23:59,720 Speaker 1: accessible and then to throw that away so that Sundar 418 00:23:59,760 --> 00:24:01,680 Speaker 1: Pichai can get two hundred and eighty million dollars 419 00:24:01,720 --> 00:24:02,080 Speaker 1: a year. 420 00:24:02,240 --> 00:24:06,040 Speaker 2: It is an obscenity, and then you have the level 421 00:24:06,119 --> 00:24:08,679 Speaker 2: up from that as well, which is, think about it from 422 00:24:08,960 --> 00:24:11,119 Speaker 2: those of us who spend, I don't know, twenty two 423 00:24:11,240 --> 00:24:14,600 Speaker 2: of our twenty four hours a day online, we're aware 424 00:24:14,640 --> 00:24:18,440 Speaker 2: of what SEO content looks like. Search engine optimization.
That's 425 00:24:18,720 --> 00:24:21,199 Speaker 2: content that is built just to rank highly on Google, 426 00:24:21,280 --> 00:24:23,000 Speaker 2: to send money to an outlet and traffic to 427 00:24:23,040 --> 00:24:25,280 Speaker 2: an outlet. We know what that looks like. We 428 00:24:25,359 --> 00:24:28,399 Speaker 2: know when we are being misled. We know when the 429 00:24:28,440 --> 00:24:32,280 Speaker 2: flow of our information is being interfered with. I don't 430 00:24:32,359 --> 00:24:34,480 Speaker 2: think that most people do, and this isn't their fault. 431 00:24:34,520 --> 00:24:40,240 Speaker 2: This is a nuanced topic, but this is horrifying. Google has, 432 00:24:40,280 --> 00:24:43,880 Speaker 2: like every major tech company, focused entirely on what will 433 00:24:43,880 --> 00:24:47,200 Speaker 2: make revenues and market share increase, even if the cost 434 00:24:47,200 --> 00:24:50,600 Speaker 2: of doing so is just destroying its entire legacy and 435 00:24:50,680 --> 00:24:54,240 Speaker 2: interrupting the free flow of information around the world. Last year, 436 00:24:54,440 --> 00:24:58,240 Speaker 2: they launched their own Bard AI to compete with Bing, 437 00:24:58,440 --> 00:25:02,080 Speaker 2: Microsoft's search engine, and its ChatGPT integration. By the way, 438 00:25:02,280 --> 00:25:05,360 Speaker 2: ChatGPT, all these generative AI things. Yeah, they make 439 00:25:05,359 --> 00:25:06,560 Speaker 2: stuff up. They hallucinate it. 440 00:25:06,920 --> 00:25:10,199 Speaker 1: Oh yeah, no, they hallucinate more than I do, and 441 00:25:10,240 --> 00:25:13,040 Speaker 1: I did permanent damage to my brain by experimenting with 442 00:25:13,080 --> 00:25:14,000 Speaker 1: sugaran chemicals. 443 00:25:14,760 --> 00:25:18,159 Speaker 2: And what was crazy was Bing AI came out and 444 00:25:18,280 --> 00:25:23,639 Speaker 2: immediately started hallucinating things. Bard AI,
Google did a media 445 00:25:23,760 --> 00:25:26,680 Speaker 2: day and they showed a demo of Bard AI and you thought, 446 00:25:26,720 --> 00:25:29,360 Speaker 2: for a second, oh shit, this might replace search engines. 447 00:25:29,400 --> 00:25:32,360 Speaker 2: This might be a moment where they sell us back 448 00:25:32,400 --> 00:25:34,639 Speaker 2: an experience we had before. And then literally in the 449 00:25:34,640 --> 00:25:39,919 Speaker 2: demo it made a factual error. Yeah, Google attempted to 450 00:25:40,080 --> 00:25:43,520 Speaker 2: sell us back search engine results and then provided us 451 00:25:43,600 --> 00:25:46,840 Speaker 2: just another broken Google, kind of like ChatGPT is 452 00:25:46,880 --> 00:25:48,400 Speaker 2: just a broken form of knowledge. 453 00:25:48,560 --> 00:25:51,720 Speaker 1: Well, and didn't they also, or was that Microsoft, that 454 00:25:51,800 --> 00:25:55,880 Speaker 1: absolutely like lied in the demo and like pretended 455 00:25:55,920 --> 00:25:58,120 Speaker 1: that they were showing live results when it was really 456 00:25:58,160 --> 00:25:59,160 Speaker 1: something they'd curated. 457 00:25:59,800 --> 00:26:03,280 Speaker 2: I'm not sure which one it was, but seemingly every 458 00:26:03,359 --> 00:26:07,679 Speaker 2: company does some sort of con like this. It sucks 459 00:26:07,800 --> 00:26:10,880 Speaker 2: and it just doesn't help anyone. And you'd think, 460 00:26:10,920 --> 00:26:13,360 Speaker 2: and we're describing what is a big shitty mess, we're 461 00:26:13,400 --> 00:26:17,760 Speaker 2: describing something that affects billions of people, that has decayed 462 00:26:17,840 --> 00:26:21,960 Speaker 2: a very important product. You'd think the markets would respond negatively, 463 00:26:22,080 --> 00:26:25,000 Speaker 2: and you would be wrong.
There were a couple percentage 464 00:26:25,040 --> 00:26:27,320 Speaker 2: points that got shaved off Google when they had that 465 00:26:27,600 --> 00:26:30,080 Speaker 2: initial wobble with Bard AI, and then it went right 466 00:26:30,119 --> 00:26:33,119 Speaker 2: back up. The markets, they bloody love that Microsoft has 467 00:26:33,160 --> 00:26:37,080 Speaker 2: invested in ChatGPT. They love that OpenAI is 468 00:26:37,160 --> 00:26:40,960 Speaker 2: partially being meddled with by Microsoft, despite the fact that 469 00:26:40,960 --> 00:26:44,879 Speaker 2: that company does not make any profit. And that's because 470 00:26:44,920 --> 00:26:51,320 Speaker 2: the markets do not prioritize innovation. They don't prioritize sustainable growth, 471 00:26:51,640 --> 00:26:55,280 Speaker 2: companies that can last on their own without screwing over customers. 472 00:26:55,480 --> 00:27:00,960 Speaker 2: They don't care about stability. The result is that companies 473 00:27:01,400 --> 00:27:04,800 Speaker 2: don't function with the intent of making good businesses anymore. 474 00:27:05,400 --> 00:27:09,040 Speaker 2: They want businesses that kind of seem right, that kind 475 00:27:09,040 --> 00:27:11,040 Speaker 2: of feel good, and they sell a product and they 476 00:27:11,040 --> 00:27:14,920 Speaker 2: make money, but they don't really care about anything else 477 00:27:14,960 --> 00:27:18,560 Speaker 2: as long as it keeps growing exponentially, ten, eleven, fifteen, 478 00:27:18,600 --> 00:27:33,320 Speaker 2: twenty percent every quarter. It's disgraceful. Sadly, the rot economy 479 00:27:33,320 --> 00:27:36,040 Speaker 2: and its growth-at-all-costs, fuck-the-customer mentality 480 00:27:36,160 --> 00:27:41,000 Speaker 2: isn't limited to big tech. Startups are those private companies that are 481 00:27:41,160 --> 00:27:45,040 Speaker 2: invested in by venture capital firms and private equity firms.
They're 482 00:27:45,080 --> 00:27:49,280 Speaker 2: regularly pumped full of venture capital dollars, enticing users with 483 00:27:49,359 --> 00:27:52,359 Speaker 2: a subsidized product, meaning that they basically sell something at 484 00:27:52,359 --> 00:27:55,240 Speaker 2: a massive loss to pump up their user numbers. That 485 00:27:55,359 --> 00:27:58,920 Speaker 2: gradually becomes worse and worse and more expensive over time 486 00:27:58,920 --> 00:28:03,800 Speaker 2: as they attempt to reach some sort of vague stability. 487 00:28:04,160 --> 00:28:09,199 Speaker 1: I think, really crucially too, like, they pump up the 488 00:28:09,240 --> 00:28:14,640 Speaker 1: price after they've killed everyone who provided maybe a better product, 489 00:28:15,200 --> 00:28:18,000 Speaker 1: but yeah, charged more money for it. Like it's the 490 00:28:18,119 --> 00:28:19,320 Speaker 1: Uber effect, right. 491 00:28:19,800 --> 00:28:22,720 Speaker 2: Oh, and I will get to Uber, because a big 492 00:28:23,000 --> 00:28:26,720 Speaker 2: part of this, and it's what Cory Doctorow calls 493 00:28:26,800 --> 00:28:30,320 Speaker 2: enshittification, which is a painful neologism that's actually pretty 494 00:28:30,320 --> 00:28:33,679 Speaker 2: accurate in describing the startup ecosystem. A big part of 495 00:28:33,680 --> 00:28:35,919 Speaker 2: that is they get pumped full of these venture dollars 496 00:28:36,040 --> 00:28:40,160 Speaker 2: and they become these horrifying companies that are not good companies. 497 00:28:40,200 --> 00:28:43,400 Speaker 2: They burn capital, they barely make anything, but they destroy 498 00:28:43,520 --> 00:28:47,239 Speaker 2: businesses that, say, run based on offering a product that 499 00:28:47,320 --> 00:28:49,720 Speaker 2: users pay for that is priced higher than the cost 500 00:28:49,760 --> 00:28:52,640 Speaker 2: of the product, making the company something known as a profit.
501 00:28:53,160 --> 00:28:56,240 Speaker 1: Yeah, and I think that's so important, 502 00:28:56,760 --> 00:29:00,320 Speaker 1: especially for like the younger people who maybe never 503 00:29:00,400 --> 00:29:03,080 Speaker 1: fell for it. I did, for a little while in 504 00:29:03,120 --> 00:29:07,480 Speaker 1: the early two thousands, fall for some of Google's shtick. Yeah, 505 00:29:07,520 --> 00:29:10,240 Speaker 1: because the thing that they delivered for a while 506 00:29:10,400 --> 00:29:15,440 Speaker 1: was miraculous. Like using Google when 507 00:29:15,440 --> 00:29:18,320 Speaker 1: it first came out, like the degree 508 00:29:18,360 --> 00:29:20,360 Speaker 1: to which it was superior to any other way to 509 00:29:20,840 --> 00:29:24,480 Speaker 1: access knowledge that had ever existed was wild, and like 510 00:29:24,840 --> 00:29:28,000 Speaker 1: you wanted to believe, maybe these people are not a 511 00:29:28,040 --> 00:29:29,320 Speaker 1: bunch of fucking demons. 512 00:29:30,200 --> 00:29:32,400 Speaker 2: And it was don't be evil. That used to be 513 00:29:32,480 --> 00:29:37,680 Speaker 2: their tagline. Not anymore, nope. How'd that go? 514 00:29:38,440 --> 00:29:41,520 Speaker 2: Now it's don't, comma, be evil with an exclamation. 515 00:29:42,600 --> 00:29:46,520 Speaker 1: They did a lot of, a lot of... and I 516 00:29:46,520 --> 00:29:48,680 Speaker 1: don't know, it's a fool-me-once sort 517 00:29:48,680 --> 00:29:50,840 Speaker 1: of situation. I'll never love again. 518 00:29:51,640 --> 00:29:55,440 Speaker 2: Yeah, it is rough. And seeing Google decay, by the way, 519 00:29:55,640 --> 00:29:59,000 Speaker 2: really has... that's what's broken me. That's what's driven me 520 00:29:59,040 --> 00:30:02,400 Speaker 2: a little insane.
But then when you look at the 521 00:30:02,520 --> 00:30:06,320 Speaker 2: startup side, now that you're aware of the 522 00:30:06,440 --> 00:30:09,080 Speaker 2: rot economy, you're aware of enshittification, you can kind of 523 00:30:09,080 --> 00:30:11,440 Speaker 2: watch it in real time, and it drives you a 524 00:30:11,480 --> 00:30:15,640 Speaker 2: little crazier still, because there is a very abusive cycle here. 525 00:30:16,240 --> 00:30:19,680 Speaker 2: Companies are born, they're funded, and they're grown in this 526 00:30:19,880 --> 00:30:23,520 Speaker 2: unnatural way where they're subsidized. It's funny, a lot 527 00:30:23,520 --> 00:30:26,840 Speaker 2: of these right-leaning venture capitalists, they hate welfare, but 528 00:30:26,880 --> 00:30:31,400 Speaker 2: they love putting startups on it, because venture capital exists 529 00:30:31,640 --> 00:30:34,680 Speaker 2: in many ways, at times, in the tech industry to 530 00:30:34,760 --> 00:30:37,959 Speaker 2: keep companies alive that should be left to die. And 531 00:30:38,000 --> 00:30:41,160 Speaker 2: they would have that same attitude to a regular person 532 00:30:41,200 --> 00:30:44,040 Speaker 2: who could not meet their bills through unsustainable spending or 533 00:30:44,040 --> 00:30:49,560 Speaker 2: even sustainable spending. Nevertheless, they're fucking hypocrites. But the problem 534 00:30:49,640 --> 00:30:52,360 Speaker 2: with these companies as well is their services, because 535 00:30:52,400 --> 00:30:56,960 Speaker 2: they are not sustainable, eventually have to become worse. They 536 00:30:57,040 --> 00:31:00,000 Speaker 2: grow the dependence of the market, they kill everything else 537 00:31:00,120 --> 00:31:02,080 Speaker 2: in the market, and then they make them worse, en- 538 00:31:02,120 --> 00:31:06,440 Speaker 2: shittifying them. Again, Cory, love you, buddy, but no, terrible term.
539 00:31:06,640 --> 00:31:09,680 Speaker 1: Great idea though, like, yeah, I think we can all 540 00:31:09,720 --> 00:31:12,280 Speaker 1: agree he's got the idea right. It is really hard 541 00:31:12,360 --> 00:31:15,320 Speaker 1: to get, like, your mom to buy into using that term. 542 00:31:16,080 --> 00:31:19,160 Speaker 2: Yeah, a little bit epic bacon for my taste. Look, 543 00:31:20,080 --> 00:31:23,840 Speaker 2: if these startups were held to real standards, crazy ideas 544 00:31:23,880 --> 00:31:26,000 Speaker 2: like that a business should make more money than it spends 545 00:31:26,080 --> 00:31:29,880 Speaker 2: and be able to survive independent of investor capital, there 546 00:31:29,880 --> 00:31:32,200 Speaker 2: are a lot of startups that would just die. In fact, 547 00:31:32,480 --> 00:31:35,640 Speaker 2: that's kind of happened in twenty twenty three. There was 548 00:31:35,680 --> 00:31:39,640 Speaker 2: a phenomenon known as the zero interest rate period, when 549 00:31:39,680 --> 00:31:42,080 Speaker 2: it was just easy for venture capitalists to get a 550 00:31:42,080 --> 00:31:45,200 Speaker 2: bunch of money at low interest or zero percent interest. 551 00:31:45,400 --> 00:31:48,200 Speaker 2: That went away in late twenty twenty two, early twenty 552 00:31:48,240 --> 00:31:52,040 Speaker 2: twenty three. As a result, venture capitalists suddenly realized, oh, 553 00:31:52,120 --> 00:31:55,920 Speaker 2: are we just spending money on bullshit? So they killed 554 00:31:55,920 --> 00:31:59,040 Speaker 2: the startups. They just pulled the plug. They didn't send 555 00:31:59,080 --> 00:32:01,440 Speaker 2: them any more money. There were no more venture rounds, 556 00:32:01,520 --> 00:32:05,680 Speaker 2: and real companies, good companies, died because no one got investment. 557 00:32:06,360 --> 00:32:09,560 Speaker 2: And it's sad. But don't worry, I made you a promise 558 00:32:09,560 --> 00:32:11,800 Speaker 2: and I'm keeping it.
We've got to talk about Uber. 559 00:32:12,160 --> 00:32:14,600 Speaker 2: Uber is the ultimate rot economy startup. 560 00:32:15,280 --> 00:32:17,200 Speaker 1: Yeah, it was like that. It was the thing where, 561 00:32:17,240 --> 00:32:19,840 Speaker 1: you know, unlike with Google, there was no like trade off, right, 562 00:32:19,920 --> 00:32:23,280 Speaker 1: like nothing was getting harmed with Uber. You always knew 563 00:32:23,320 --> 00:32:26,040 Speaker 1: the company was evil. But also it was so much 564 00:32:26,080 --> 00:32:29,320 Speaker 1: easier to get a fucking ride when you were like hammered, right, 565 00:32:29,640 --> 00:32:32,320 Speaker 1: Like it did make a problem go away for a lot 566 00:32:32,240 --> 00:32:36,120 Speaker 2: of people. And at first it wasn't. When Uber launched 567 00:32:36,360 --> 00:32:40,640 Speaker 2: in twenty eleven, it was mostly with like town cars, 568 00:32:40,760 --> 00:32:43,320 Speaker 2: black cab services, as we call them back home. 569 00:32:43,440 --> 00:32:45,959 Speaker 2: I guess you'd just say livery. I don't know, there's an 570 00:32:45,960 --> 00:32:48,440 Speaker 2: American term, I'm sure. But nevertheless, you were getting these 571 00:32:48,480 --> 00:32:51,000 Speaker 2: Lincoln town cars and you were able to call them, 572 00:32:51,040 --> 00:32:53,080 Speaker 2: and it was kind of magical and you were like, oh, wow, 573 00:32:53,120 --> 00:32:56,960 Speaker 2: we could do this. Eventually, they'd launch Uber X. Anyone 574 00:32:57,000 --> 00:32:59,640 Speaker 2: can pick up a phone and start driving their car 575 00:32:59,680 --> 00:33:03,560 Speaker 2: for money with Uber.
The press at the time, because 576 00:33:03,560 --> 00:33:05,400 Speaker 2: everyone was kind of still thinking that tech was the 577 00:33:05,400 --> 00:33:07,800 Speaker 2: good guys, kind of let it go by that these 578 00:33:07,800 --> 00:33:11,880 Speaker 2: people were paid below minimum wage, they had questionable insurance policies, 579 00:33:12,720 --> 00:33:15,920 Speaker 2: and absolutely no goddamn profit. They were screwing the drivers 580 00:33:15,960 --> 00:33:20,360 Speaker 2: while also not actually building a sustainable company. Very confusing 581 00:33:20,400 --> 00:33:23,560 Speaker 2: to me. Uber only became profitable last year. They had 582 00:33:23,560 --> 00:33:26,760 Speaker 2: a three hundred and twenty six million dollar operating profit 583 00:33:27,080 --> 00:33:30,120 Speaker 2: in August of twenty twenty three. To get there, it 584 00:33:30,160 --> 00:33:32,040 Speaker 2: had to burn thirty two billion dollars. 585 00:33:32,360 --> 00:33:35,360 Speaker 1: Hey, you gotta spend money to make money. 586 00:33:35,640 --> 00:33:41,160 Speaker 2: Yeah, what an incredible return. Just for the sake 587 00:33:41,160 --> 00:33:44,680 Speaker 2: of clarity, it's also worth noting that Uber had previously 588 00:33:44,720 --> 00:33:47,520 Speaker 2: reported profitable quarters, but they didn't come from actually providing 589 00:33:47,640 --> 00:33:50,640 Speaker 2: rides or delivering food or any kind of business. They 590 00:33:50,720 --> 00:33:56,280 Speaker 2: were just selling stuff they'd bought. It's just, Uber frustrates me. 591 00:33:56,400 --> 00:34:00,600 Speaker 2: Their business model is just incredibly precarious and relies exceedingly 592 00:34:00,640 --> 00:34:05,040 Speaker 2: heavily on governments failing to impose labor laws. So much 593 00:34:05,120 --> 00:34:07,920 Speaker 2: of their existence is predicated on being able to screw 594 00:34:08,280 --> 00:34:13,480 Speaker 2: independent contractors.
Its continued existence would not have happened without 595 00:34:13,600 --> 00:34:19,399 Speaker 2: bullying local authorities, without local authorities ceding ground to these 596 00:34:19,400 --> 00:34:22,960 Speaker 2: companies, because at the time they were like, yeah, taxis 597 00:34:23,040 --> 00:34:26,040 Speaker 2: kind of suck. Taxis are bad. Sure. And of course 598 00:34:26,120 --> 00:34:28,320 Speaker 2: Uber needed a bunch of money, with the largest amount 599 00:34:28,360 --> 00:34:32,600 Speaker 2: of that coming from the Saudi Sovereign Wealth Fund: three 600 00:34:32,640 --> 00:34:34,000 Speaker 2: point five billion dollars. 601 00:34:34,440 --> 00:34:34,720 Speaker 1: Huh. 602 00:34:35,719 --> 00:34:38,760 Speaker 2: And also Uber just burns capital. They burned billions of dollars, 603 00:34:39,040 --> 00:34:41,319 Speaker 2: and their share price has doubled in the last year, and Uber 604 00:34:41,360 --> 00:34:44,759 Speaker 2: now has a larger market cap than Ford and Stellantis combined. 605 00:34:46,360 --> 00:34:48,640 Speaker 2: The markets are on crack. 606 00:34:50,000 --> 00:34:53,839 Speaker 1: It's so fucking frustrating. It's like 607 00:34:53,880 --> 00:34:56,040 Speaker 1: if there were a network of guys running those shell 608 00:34:56,080 --> 00:34:59,000 Speaker 1: games where you put like a dollar in with three cups, right, 609 00:34:59,480 --> 00:35:03,000 Speaker 1: and some VC guys were like, hey, just give someone 610 00:35:03,080 --> 00:35:05,120 Speaker 1: money every time they pick a cup, no matter what 611 00:35:05,200 --> 00:35:08,000 Speaker 1: cup it is, and then we'll also buy up all 612 00:35:08,040 --> 00:35:10,239 Speaker 1: of the ATMs and shut them down.
So this is 613 00:35:10,280 --> 00:35:12,680 Speaker 1: the only way to get cash, right? Like that's 614 00:35:12,680 --> 00:35:15,640 Speaker 1: almost the way that it works, right? Like it is 615 00:35:17,040 --> 00:35:21,799 Speaker 1: replacing something that, like, cabs needed to be, I hate 616 00:35:21,840 --> 00:35:25,000 Speaker 1: to say this, disrupted, right? Like it was a business 617 00:35:25,000 --> 00:35:28,480 Speaker 1: that existed whose system was not working. Exactly, exactly. There needed 618 00:35:28,480 --> 00:35:30,200 Speaker 1: to be a way for you to get one wherever 619 00:35:30,239 --> 00:35:32,080 Speaker 1: you happened to be or whatever, and do it through 620 00:35:32,120 --> 00:35:34,560 Speaker 1: your phone and not have to like fucking call a 621 00:35:34,640 --> 00:35:37,720 Speaker 1: cab company. Like, there was a degree to which innovation 622 00:35:37,840 --> 00:35:41,239 Speaker 1: needed to happen. But one of the side effects, 623 00:35:41,360 --> 00:35:43,080 Speaker 1: I mean, in addition to the stuff you've brought up, 624 00:35:43,120 --> 00:35:45,440 Speaker 1: is that, like, it's so much less safe. You have 625 00:35:45,520 --> 00:35:48,360 Speaker 1: no way of knowing if your driver is either qualified 626 00:35:48,480 --> 00:35:51,520 Speaker 1: or going to like sexually assault you, like, which is 627 00:35:51,520 --> 00:35:52,879 Speaker 1: the thing that happens a lot. 628 00:35:53,360 --> 00:35:56,200 Speaker 2: I had an Uber driver last year when my parents 629 00:35:56,200 --> 00:35:59,239 Speaker 2: were here in beautiful Las Vegas, Nevada, who was very 630 00:35:59,280 --> 00:36:03,120 Speaker 2: clearly drunk and just mumbling swear words the entire time. 631 00:36:03,280 --> 00:36:05,160 Speaker 2: I reported it to Uber and they're like, yeah, that sucks, 632 00:36:05,160 --> 00:36:07,680 Speaker 2: we'll make sure you're not paired with him again. Thanks.
633 00:36:08,080 --> 00:36:10,279 Speaker 2: I hope he doesn't crash his car. 634 00:36:10,719 --> 00:36:11,319 Speaker 1: Yeah. 635 00:36:11,400 --> 00:36:15,400 Speaker 2: Cool. And the thing is, the punishment should come from 636 00:36:15,440 --> 00:36:18,200 Speaker 2: the markets. It really should, the markets. If the free 637 00:36:18,280 --> 00:36:21,520 Speaker 2: markets actually functioned in the way that guys with Grecian 638 00:36:21,600 --> 00:36:24,800 Speaker 2: statue avatars claim it does on Twitter, then the market 639 00:36:24,800 --> 00:36:27,279 Speaker 2: would deal with this. But it doesn't. But there is 640 00:36:27,320 --> 00:36:30,880 Speaker 2: actually another culprit, and that's the media. They kind of 641 00:36:30,920 --> 00:36:34,520 Speaker 2: fuel the growth mongering. CNBC, for example. I generally 642 00:36:34,600 --> 00:36:38,239 Speaker 2: like CNBC, but the way they report earnings, and many 643 00:36:38,239 --> 00:36:42,239 Speaker 2: people do, lots of different media companies do this, they 644 00:36:42,280 --> 00:36:44,960 Speaker 2: don't acknowledge the fact that Uber just burns money, that 645 00:36:45,040 --> 00:36:47,680 Speaker 2: they spent fifteen goddamn years burning money, that they have 646 00:36:47,760 --> 00:36:52,240 Speaker 2: an unsustainable business. Travis Kalanick, who's long since departed 647 00:36:52,239 --> 00:36:57,319 Speaker 2: obviously, was running a very mob-esque thing where he'd 648 00:36:57,600 --> 00:37:01,200 Speaker 2: sidle in with Bradley Goddamn Tusk, a former lobbyist 649 00:37:01,320 --> 00:37:05,200 Speaker 2: PR creature that he found in Mordor, and just 650 00:37:05,320 --> 00:37:09,400 Speaker 2: bully local markets until they agreed. He would just launch 651 00:37:09,560 --> 00:37:12,440 Speaker 2: Uber places and be like, what're you gonna do about it?
652 00:37:12,480 --> 00:37:15,120 Speaker 2: And then the local authorities would say, uh, yeah, we're 653 00:37:15,160 --> 00:37:18,360 Speaker 2: gonna do something, and then nothing would happen and 654 00:37:18,600 --> 00:37:21,759 Speaker 2: Uber would be made legal locally. And just to be clear, 655 00:37:21,800 --> 00:37:24,040 Speaker 2: if you don't know this about your Uber driver, every 656 00:37:24,040 --> 00:37:26,239 Speaker 2: time you take Uber and you think, oh, I've paid 657 00:37:26,239 --> 00:37:28,600 Speaker 2: twenty bucks for that, the driver gets like less than 658 00:37:28,640 --> 00:37:31,160 Speaker 2: half, I believe. Oh yeah, they get screwed, and 659 00:37:31,200 --> 00:37:34,160 Speaker 2: they don't have insurance in many cases, or they're underinsured. 660 00:37:34,680 --> 00:37:36,319 Speaker 2: And this is a very technical thing, but this is 661 00:37:36,320 --> 00:37:39,319 Speaker 2: important to know. When your Uber driver has you in 662 00:37:39,360 --> 00:37:42,200 Speaker 2: the car, oftentimes they're insured. When they don't, when they're 663 00:37:42,239 --> 00:37:47,040 Speaker 2: just driving around, they oftentimes need different insurance, because they're 664 00:37:47,040 --> 00:37:49,920 Speaker 2: not running a business while they're driving around. These are 665 00:37:49,920 --> 00:37:52,040 Speaker 2: the kind of things that happen when you don't have 666 00:37:52,120 --> 00:37:56,560 Speaker 2: strong labor laws, when you don't have the government protecting workers. 667 00:37:57,360 --> 00:38:00,960 Speaker 2: It's just frustrating, and like I can understand why 668 00:38:01,000 --> 00:38:04,240 Speaker 2: the media probably doesn't want to cover this, because otherwise 669 00:38:04,280 --> 00:38:08,400 Speaker 2: the coverage would be varying levels of, yeah, this company sucks, 670 00:38:08,520 --> 00:38:13,000 Speaker 2: this one sucks, also this one's bad. Uber,
They really 671 00:38:13,080 --> 00:38:16,440 Speaker 2: just don't make enough money to really make sense. And 672 00:38:16,560 --> 00:38:20,120 Speaker 2: if they had, I don't know, a ten, twenty percent drop 673 00:38:20,160 --> 00:38:22,839 Speaker 2: in ridership, I mean, they'd probably fall apart. They don't 674 00:38:22,840 --> 00:38:24,400 Speaker 2: want to say that, so they just want to 675 00:38:24,680 --> 00:38:26,640 Speaker 2: look at it. They just want to go, okay, here's 676 00:38:26,680 --> 00:38:30,919 Speaker 2: what's happening today. Uber also really cannot be killed now. 677 00:38:31,600 --> 00:38:34,040 Speaker 2: And that's a horrifying future we live in, and a 678 00:38:34,200 --> 00:38:38,000 Speaker 2: horrifying present, I guess. And because people keep buying the stock, 679 00:38:38,200 --> 00:38:41,280 Speaker 2: it's a valuable company, and it's valuable in the eyes 680 00:38:41,640 --> 00:38:47,200 Speaker 2: of markets that seemingly have cataracts. And look, the rot 681 00:38:47,239 --> 00:38:49,840 Speaker 2: economy is why you see these oscillations of hiring and firing. 682 00:38:49,880 --> 00:38:53,840 Speaker 2: It's why you see Google or Microsoft making billions, tens 683 00:38:53,840 --> 00:38:57,080 Speaker 2: of billions of dollars each quarter, then firing ten, 684 00:38:57,239 --> 00:39:01,040 Speaker 2: fifteen thousand people while the executives get rich. It's because 685 00:39:01,040 --> 00:39:03,760 Speaker 2: these companies are never actually punished for failing to operate 686 00:39:03,800 --> 00:39:06,759 Speaker 2: their businesses in a sustainable way. There is no 687 00:39:06,840 --> 00:39:10,359 Speaker 2: punishment for them. There's not really a punishment for them 688 00:39:10,560 --> 00:39:14,280 Speaker 2: missing something.
No CEO seems to be fired for, say, 689 00:39:15,000 --> 00:39:19,120 Speaker 2: overhiring by tens of thousands of people. That's fine, 690 00:39:19,239 --> 00:39:21,840 Speaker 2: you know what, easy come, easy go. Not for the 691 00:39:21,840 --> 00:39:25,480 Speaker 2: people who got laid off, though. And when it comes 692 00:39:25,480 --> 00:39:29,240 Speaker 2: to the startup industry, when it comes to startups in general, 693 00:39:29,920 --> 00:39:33,400 Speaker 2: the rot economy is probably a much bigger deal than 694 00:39:33,440 --> 00:39:38,960 Speaker 2: you realize, because startups got used to getting venture capital 695 00:39:39,000 --> 00:39:42,400 Speaker 2: whenever they needed it. Businesses like Uber were predicated on 696 00:39:42,480 --> 00:39:45,120 Speaker 2: an endless supply of cheap money, even though the Fed 697 00:39:45,160 --> 00:39:47,600 Speaker 2: steadily ratcheted up interest rates in the years leading up 698 00:39:47,640 --> 00:39:51,120 Speaker 2: to the COVID pandemic, only slashing them to mitigate the 699 00:39:51,120 --> 00:39:53,560 Speaker 2: pain of COVID and, to a lesser extent, the US 700 00:39:53,640 --> 00:39:56,239 Speaker 2: China trade war. They were trying to incentivize investment. They 701 00:39:56,239 --> 00:39:59,960 Speaker 2: were trying to incentivize putting money into companies that allegedly 702 00:40:00,040 --> 00:40:03,399 Speaker 2: would create jobs. But once the specter of inflation reared 703 00:40:03,440 --> 00:40:08,560 Speaker 2: its head, well, things got kind of nasty. And then 704 00:40:08,640 --> 00:40:11,560 Speaker 2: there was the war in Ukraine, the collateral damage 705 00:40:11,560 --> 00:40:16,720 Speaker 2: of China's zero-COVID policy, the labor shortage. Things started 706 00:40:16,719 --> 00:40:19,400 Speaker 2: to unravel, and now there isn't really any free money 707 00:40:19,400 --> 00:40:25,040 Speaker 2: to go around.
We're in an illogical point in economic history, 708 00:40:25,080 --> 00:40:27,080 Speaker 2: and it's scary to me. It's scary when I look 709 00:40:27,080 --> 00:40:31,600 Speaker 2: at how many companies there are that just should not exist, and 710 00:40:31,640 --> 00:40:34,799 Speaker 2: it scares me. It scares me that the markets don't 711 00:40:34,840 --> 00:40:37,400 Speaker 2: react when they see, like, mass hiring of people to 712 00:40:37,640 --> 00:40:41,280 Speaker 2: capture consumer demand. They don't think, oh, what if consumer 713 00:40:41,360 --> 00:40:43,879 Speaker 2: demand goes down? They just don't think in that way. 714 00:40:44,640 --> 00:40:48,480 Speaker 2: They don't react when Microsoft, just as an example, lays off 715 00:40:48,520 --> 00:40:53,360 Speaker 2: people almost every year and then makes billions of dollars, 716 00:40:53,520 --> 00:40:56,280 Speaker 2: makes giant acquisitions that don't even make sense. 717 00:40:56,680 --> 00:41:01,400 Speaker 1: Yeah, it's this enshrining of instability as a sign of virtue, 718 00:41:02,200 --> 00:41:05,520 Speaker 1: and like, yeah, like that is, that is really dangerous, 719 00:41:05,560 --> 00:41:08,799 Speaker 1: because like the more instability you accept, and the more 720 00:41:08,840 --> 00:41:12,600 Speaker 1: you like reward the people running these companies for creating 721 00:41:12,719 --> 00:41:16,080 Speaker 1: situations that make the lives of their employees unstable and 722 00:41:16,160 --> 00:41:20,560 Speaker 1: our economy unstable, like, the more you incentivize that, eventually 723 00:41:21,400 --> 00:41:25,200 Speaker 1: it's going to oscillate too much for balance to 724 00:41:25,200 --> 00:41:26,360 Speaker 1: be regained, you know. 725 00:41:27,400 --> 00:41:29,680 Speaker 2: And I fear that, yeah, I fear the fact 726 00:41:29,719 --> 00:41:32,520 Speaker 2: that the market also has no memory.
In twenty twenty, 727 00:41:33,000 --> 00:41:35,320 Speaker 2: Satya Nadella, CEO of Microsoft, called for, and 728 00:41:35,480 --> 00:41:40,040 Speaker 2: I'm serious here, a referendum on capitalism, telling businesses 729 00:41:40,080 --> 00:41:43,520 Speaker 2: to start grading themselves on the wider economic benefits that 730 00:41:43,560 --> 00:41:47,600 Speaker 2: they bring to society rather than profits. To be clear, 731 00:41:47,640 --> 00:41:50,240 Speaker 2: this was four months after Microsoft laid off one thousand 732 00:41:50,239 --> 00:41:53,120 Speaker 2: people and one year before they hired twenty three thousand people. 733 00:41:53,160 --> 00:41:55,359 Speaker 2: And then in early twenty twenty three they laid off 734 00:41:55,360 --> 00:41:58,520 Speaker 2: a further ten thousand people to, and I quote, deliver 735 00:41:58,680 --> 00:42:01,520 Speaker 2: results on an ongoing basis while investing in their long 736 00:42:01,640 --> 00:42:05,799 Speaker 2: term opportunity. And these savage job cuts have continued into 737 00:42:05,800 --> 00:42:09,680 Speaker 2: twenty twenty four as well. As I mentioned, they laid 738 00:42:09,680 --> 00:42:13,520 Speaker 2: off nineteen hundred people from Activision Blizzard and their Xbox 739 00:42:13,520 --> 00:42:15,840 Speaker 2: division as well, and that's like eight percent of the 740 00:42:15,880 --> 00:42:18,880 Speaker 2: overall Microsoft gaming team. To be clear, Bobby Kotick, the 741 00:42:18,920 --> 00:42:22,880 Speaker 2: horrible pervert freak who used to run Activision Blizzard, he 742 00:42:23,120 --> 00:42:26,920 Speaker 2: got a several hundred million dollar payoff while all of 743 00:42:26,960 --> 00:42:30,240 Speaker 2: these people got fired. And then a week after laying 744 00:42:30,239 --> 00:42:34,200 Speaker 2: off these people, Microsoft would report solid second quarter earnings.
745 00:42:34,280 --> 00:42:37,040 Speaker 2: They beat expectations in both revenue and profit, and they 746 00:42:37,080 --> 00:42:40,080 Speaker 2: became the most valuable company in the world. And in the 747 00:42:40,120 --> 00:42:45,200 Speaker 2: process, human capital was used as food and fuel for the rich. 748 00:42:45,800 --> 00:42:49,840 Speaker 2: And I hate to get that kind of alarmist, preachy feeling, 749 00:42:50,160 --> 00:42:53,359 Speaker 2: but that is what is happening, and it's something that 750 00:42:53,440 --> 00:42:56,560 Speaker 2: people need to realize and look at and scream at, 751 00:42:57,040 --> 00:43:02,560 Speaker 2: because it's disgraceful. Real people with real problems lose their jobs. 752 00:43:02,560 --> 00:43:05,239 Speaker 2: And I know, tech workers get paid well or whatever. These are 753 00:43:05,239 --> 00:43:09,200 Speaker 2: still people with mortgages, with rent, with families, with children. 754 00:43:10,120 --> 00:43:14,680 Speaker 2: Satya Nadella has billions of dollars. He's fine. Bobby Kotick, 755 00:43:14,840 --> 00:43:18,760 Speaker 2: who was a reprehensible guy who oversaw a period of multiple 756 00:43:18,800 --> 00:43:24,560 Speaker 2: alleged sexual harassment scandals over at Activision, got paid off, unfathomably rich. 757 00:43:24,880 --> 00:43:27,480 Speaker 2: Why did that asshole get money while these people got 758 00:43:27,520 --> 00:43:31,800 Speaker 2: laid off? It's because the street doesn't care. It's because 759 00:43:31,840 --> 00:43:35,120 Speaker 2: the street doesn't see that as a problem. They don't 760 00:43:35,160 --> 00:43:39,920 Speaker 2: see moral problems, or even logical problems like, huh, we 761 00:43:40,000 --> 00:43:41,799 Speaker 2: could have saved a bunch of money. No, they want 762 00:43:41,840 --> 00:43:46,480 Speaker 2: to reward the people that buy into their wretched, rotten system.
763 00:43:46,680 --> 00:43:49,279 Speaker 2: It's a scummy way of running a business that society 764 00:43:49,280 --> 00:43:52,640 Speaker 2: and the market seem to deeply appreciate, and it's actually 765 00:43:52,719 --> 00:43:57,840 Speaker 2: killing innovation. It rewards bad ideas that make lots of money. 766 00:43:57,920 --> 00:44:02,560 Speaker 2: It rewards shitty businesses that fail their customers but make 767 00:44:02,680 --> 00:44:07,239 Speaker 2: tons of money. And it rewards abusing customers in a 768 00:44:07,280 --> 00:44:09,160 Speaker 2: way that I find wretched. 769 00:44:10,400 --> 00:44:14,640 Speaker 1: Yeah, it is. It is disgusting the degree to which 770 00:44:15,120 --> 00:44:18,359 Speaker 1: the wealth of the billionaire class, and I guess to 771 00:44:18,480 --> 00:44:22,279 Speaker 1: be even more specific, the CEO class, hinges upon 772 00:44:24,080 --> 00:44:27,840 Speaker 1: regular human sacrifice. Like, that is what they're doing. 773 00:44:27,960 --> 00:44:30,239 Speaker 1: Like, part of why they hire people is so that 774 00:44:30,600 --> 00:44:33,120 Speaker 1: they can do these big layoffs when they need to 775 00:44:33,200 --> 00:44:37,080 Speaker 1: do them in order to increase stock value enough to 776 00:44:37,200 --> 00:44:39,920 Speaker 1: hit whatever their bonus target for that year or that 777 00:44:40,040 --> 00:44:43,560 Speaker 1: quarter is. Like, it is very much, it is very 778 00:44:43,600 --> 00:44:46,640 Speaker 1: much just human sacrifice so that they can get an 779 00:44:46,680 --> 00:44:49,520 Speaker 1: extra however many million dollars a year that fucking bonus 780 00:44:49,640 --> 00:44:51,439 Speaker 1: provision in their contract gives them. 781 00:44:51,719 --> 00:44:53,839 Speaker 2: What's crazy is there are guys like Marc Benioff, who 782 00:44:53,880 --> 00:44:57,360 Speaker 2: runs Salesforce, again, another company that burns billions to make millions.
783 00:44:58,000 --> 00:45:00,680 Speaker 2: That guy has laid off tens of thousands of people, all 784 00:45:00,719 --> 00:45:03,360 Speaker 2: while getting these glossy cover stories that talk about his 785 00:45:03,600 --> 00:45:07,640 Speaker 2: Ohana philosophy, where everybody is important up until the point 786 00:45:07,640 --> 00:45:11,080 Speaker 2: that they're not. And also, listeners, if you want to, 787 00:45:11,120 --> 00:45:13,680 Speaker 2: email me, ez at betteroffline dot com: do you 788 00:45:13,800 --> 00:45:17,040 Speaker 2: know what Salesforce does? Because I know multiple people who 789 00:45:17,040 --> 00:45:17,480 Speaker 2: pay for it. 790 00:45:17,520 --> 00:45:21,040 Speaker 1: Who doesn't? They have their big tower with a shitty 791 00:45:21,080 --> 00:45:22,080 Speaker 1: screen on top of it. 792 00:45:22,440 --> 00:45:26,480 Speaker 2: Yeah, it's, it's a way for Marc Benioff to make 793 00:45:26,520 --> 00:45:29,279 Speaker 2: a bunch of money, and that does not flow down. 794 00:45:29,840 --> 00:45:32,799 Speaker 2: Laying off tens of thousands of people, it's just, it's disgraceful. 795 00:45:33,600 --> 00:45:35,839 Speaker 2: But you know what, it does begin somewhere, and I've 796 00:45:35,920 --> 00:45:38,160 Speaker 2: kind of hinted at this with Uber, but it's important to 797 00:45:38,200 --> 00:45:43,320 Speaker 2: realize how much of this comes from this much more reckless, ugly, 798 00:45:43,400 --> 00:45:46,479 Speaker 2: and violent form of funding. I'm talking, of course, about 799 00:45:46,560 --> 00:45:51,680 Speaker 2: venture capital. Venture capitalists are firms who get money 800 00:45:51,719 --> 00:45:54,319 Speaker 2: from something called a limited partner, so they get a 801 00:45:54,320 --> 00:45:56,560 Speaker 2: pool of money that they invest in startups, using their 802 00:45:57,000 --> 00:46:00,600 Speaker 2: alleged smarts to pick the winners of the future.
And 803 00:46:00,640 --> 00:46:02,920 Speaker 2: then when they put that money into these companies, they 804 00:46:02,920 --> 00:46:05,239 Speaker 2: hope that this company will either go public, much like 805 00:46:05,360 --> 00:46:09,440 Speaker 2: Uber did and Facebook did, or be sold to another company. 806 00:46:10,320 --> 00:46:12,400 Speaker 2: And when you look at many of the problems that 807 00:46:12,440 --> 00:46:15,240 Speaker 2: you find in the tech industry, when you search for something, 808 00:46:15,760 --> 00:46:17,760 Speaker 2: you think you've searched for the right thing and just eleven 809 00:46:17,920 --> 00:46:21,000 Speaker 2: lines of nonsense pop up, or you go to look 810 00:46:21,000 --> 00:46:24,080 Speaker 2: at your grandmother's pictures on Facebook, but someone tries to 811 00:46:24,120 --> 00:46:27,920 Speaker 2: sell you a fitness supplement. This is what's happening. The 812 00:46:28,000 --> 00:46:31,279 Speaker 2: rot economy is working against you. The CEOs of the 813 00:46:31,280 --> 00:46:35,399 Speaker 2: companies behind the products you're using, they don't care. They 814 00:46:35,440 --> 00:46:37,320 Speaker 2: care as much as they need to to monetize you. But 815 00:46:37,800 --> 00:46:41,600 Speaker 2: deep down they're going to choose themselves and their shareholders 816 00:46:41,640 --> 00:46:45,680 Speaker 2: and their board members way before you, and they currently 817 00:46:45,760 --> 00:46:50,400 Speaker 2: have all the power on the web. I'm scared, but 818 00:46:50,480 --> 00:46:53,520 Speaker 2: I don't want you to be. I want to inform 819 00:46:53,560 --> 00:46:56,080 Speaker 2: you about how this is happening. I want the Better 820 00:46:56,120 --> 00:46:59,879 Speaker 2: Offline podcast to be a place where you can understand, 821 00:47:00,120 --> 00:47:02,799 Speaker 2: where you can be educated about how you are being 822 00:47:02,880 --> 00:47:06,600 Speaker 2: conned, about how they are monetizing your digital lives.
And 823 00:47:06,680 --> 00:47:09,399 Speaker 2: I very much look forward to telling you more about 824 00:47:09,400 --> 00:47:12,359 Speaker 2: this in the future. Thank you for listening. Please check 825 00:47:12,400 --> 00:47:14,920 Speaker 2: out betteroffline dot com and email me at ez 826 00:47:15,040 --> 00:47:17,480 Speaker 2: at betteroffline dot com if you've got any thoughts. 827 00:47:17,880 --> 00:47:20,120 Speaker 2: Thank you, Robert, so much for joining me. By the way, 828 00:47:20,360 --> 00:47:23,799 Speaker 2: I very much appreciate it. Cool Zone Media rocks. I've very 829 00:47:23,880 --> 00:47:25,680 Speaker 2: much enjoyed building this with you. 830 00:47:26,080 --> 00:47:27,960 Speaker 1: Yeah, I'm excited to hear what you come up with 831 00:47:28,040 --> 00:47:31,800 Speaker 1: next and continue this conversation, because I think you're putting 832 00:47:31,800 --> 00:47:34,040 Speaker 1: like a name to a demon that has been like 833 00:47:34,160 --> 00:47:37,279 Speaker 1: haunting all of our nightmares for a while now. And 834 00:47:37,880 --> 00:47:40,000 Speaker 1: that's not, you know, the only thing that you need 835 00:47:40,040 --> 00:47:42,319 Speaker 1: to do to beat it, but it's certainly where, like, 836 00:47:43,320 --> 00:47:44,960 Speaker 1: turning back the tide starts. 837 00:47:45,680 --> 00:47:47,880 Speaker 2: And I really look forward to walking through these problems 838 00:47:47,880 --> 00:47:51,080 Speaker 2: with you and many other incredibly smart people in the future. 839 00:47:51,480 --> 00:47:53,960 Speaker 2: And of course, Better Offline is a weekly podcast. You 840 00:47:53,960 --> 00:47:57,359 Speaker 2: can find us every Wednesday on the iHeartRadio app and 841 00:47:57,600 --> 00:48:01,000 Speaker 2: wherever else you find your podcasts. The editor and composer 842 00:48:01,040 --> 00:48:03,839 Speaker 2: of the Better Offline theme song is Mattosowski.
You can 843 00:48:03,920 --> 00:48:06,279 Speaker 2: check out more of his music and audio projects at 844 00:48:06,320 --> 00:48:09,759 Speaker 2: mattosowski dot com, M A T T O S O 845 00:48:10,440 --> 00:48:14,239 Speaker 2: W S K I dot com. For more information, go to 846 00:48:14,239 --> 00:48:16,839 Speaker 2: betteroffline dot com or email me at ez at 847 00:48:16,840 --> 00:48:23,440 Speaker 2: betteroffline dot com. Thank you for listening, everyone. 848 00:48:27,360 --> 00:48:29,800 Speaker 1: Better Offline is a production of Cool Zone Media. 849 00:48:30,000 --> 00:48:31,240 Speaker 2: For more from Cool Zone 850 00:48:31,000 --> 00:48:33,040 Speaker 1: Media, visit our website 851 00:48:32,880 --> 00:48:36,239 Speaker 2: coolzonemedia dot com, or check us out on the iHeartRadio app, 852 00:48:36,280 --> 00:48:52,959 Speaker 2: Apple Podcasts, or wherever you get your podcasts. Better Off 853 00:48:53,000 --> 00:48:53,239 Speaker 2: line.