1 00:00:04,440 --> 00:00:12,479 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,480 --> 00:00:14,960 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:14,960 --> 00:00:17,479 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:17,600 --> 00:00:20,720 Speaker 1: tech are you? I've got a special episode today. I've 5 00:00:20,720 --> 00:00:24,320 Speaker 1: got a fellow podcaster in the studio with me, virtually: 6 00:00:24,800 --> 00:00:29,360 Speaker 1: Ed Zitron, who has the Better Offline podcast. If y'all 7 00:00:29,400 --> 00:00:32,880 Speaker 1: think that I cast a critical eye at the tech industry, 8 00:00:33,360 --> 00:00:38,199 Speaker 1: strap in, because Ed, you know, I give him the business. 9 00:00:38,280 --> 00:00:41,560 Speaker 1: But I've listened to your show and it's gloves off 10 00:00:41,600 --> 00:00:42,360 Speaker 1: when you start. 11 00:00:42,680 --> 00:00:46,400 Speaker 2: I can't help myself. I have to, because everyone's angry 12 00:00:46,400 --> 00:00:49,280 Speaker 2: at tech right now. Everyone is so angry, and I 13 00:00:49,320 --> 00:00:51,720 Speaker 2: get emails about it saying, oh, you've really put into 14 00:00:51,720 --> 00:00:54,320 Speaker 2: words how I feel. And I think it's just, I 15 00:00:54,360 --> 00:00:57,400 Speaker 2: am naturally pissed off, I guess, very much just a 16 00:00:57,520 --> 00:00:58,400 Speaker 2: natural fury, 17 00:00:58,960 --> 00:00:59,560 Speaker 3: I suppose. 18 00:00:59,720 --> 00:01:03,240 Speaker 1: Yeah, I'm a Gen Xer, so I'm a little older, 19 00:01:03,560 --> 00:01:07,720 Speaker 1: and I think the natural cynicism that I possess has 20 00:01:07,840 --> 00:01:11,600 Speaker 1: sort of dulled me to the point where I'm angry, 21 00:01:11,640 --> 00:01:15,640 Speaker 1: but I'm also tired as opposed to being angry. I've been 22 00:01:15,680 --> 00:01:21,640 Speaker 1: angry since the early eighties. Yeah, I mean, you grow 23 00:01:21,720 --> 00:01:23,880 Speaker 1: up in that Cold War era and you really 24 00:01:23,920 --> 00:01:27,280 Speaker 1: learn how to be angry. But I wanted to 25 00:01:27,319 --> 00:01:30,959 Speaker 1: talk about your show and about its genesis and sort 26 00:01:31,000 --> 00:01:34,480 Speaker 1: of the process of, you know, how you came about 27 00:01:34,560 --> 00:01:37,680 Speaker 1: thinking about doing a podcast and the sort of things 28 00:01:37,680 --> 00:01:40,960 Speaker 1: that specifically inspired you. I'd love to talk about the 29 00:01:41,040 --> 00:01:43,720 Speaker 1: episode that published most recently as we record this, where 30 00:01:43,720 --> 00:01:47,520 Speaker 1: you talked about the metaverse, because it's something that I share: 31 00:01:47,880 --> 00:01:50,240 Speaker 1: all the feelings that you express in that episode I've 32 00:01:50,240 --> 00:01:53,240 Speaker 1: talked about on this show as well, which is largely 33 00:01:53,280 --> 00:01:57,200 Speaker 1: about how infuriating it is to have this ill defined 34 00:01:57,360 --> 00:02:01,560 Speaker 1: concept take over to an extent where you feel you're 35 00:02:01,560 --> 00:02:04,160 Speaker 1: just screaming into the void. Why are you doing this? Why 36 00:02:04,160 --> 00:02:07,240 Speaker 1: are you spending money? Why are these businesses jumping into it?
37 00:02:07,440 --> 00:02:09,720 Speaker 1: But you can say the same thing about NFTs, you 38 00:02:09,760 --> 00:02:13,400 Speaker 1: can say the same thing about AI, and you did, yes, 39 00:02:13,639 --> 00:02:16,600 Speaker 1: and you continue, you will continue to do so. But 40 00:02:16,720 --> 00:02:19,200 Speaker 1: let's talk about that. Let's talk about sort of what 41 00:02:19,360 --> 00:02:23,360 Speaker 1: your background is and then leading into the creation of 42 00:02:23,520 --> 00:02:25,240 Speaker 1: the Better Offline podcast. 43 00:02:25,880 --> 00:02:29,920 Speaker 2: So in twenty twenty, I was, as many people were, 44 00:02:29,919 --> 00:02:32,720 Speaker 2: getting quite depressed, and it was because I wasn't writing, 45 00:02:32,800 --> 00:02:34,799 Speaker 2: as well as the world burning around us. So I thought, 46 00:02:34,840 --> 00:02:38,520 Speaker 2: I'm going to start a newsletter. I had two hundred subscribers, 47 00:02:38,520 --> 00:02:40,600 Speaker 2: three hundred maybe, and most of them, I'm pretty sure, 48 00:02:40,600 --> 00:02:43,280 Speaker 2: were dead emails. But I started writing about tech stuff 49 00:02:43,280 --> 00:02:46,160 Speaker 2: because I'd been running a PR firm for about twelve years, 50 00:02:46,480 --> 00:02:48,960 Speaker 2: and a tech PR firm specifically, so I've seen all 51 00:02:48,960 --> 00:02:52,760 Speaker 2: these things up close. And then during twenty twenty, I 52 00:02:52,800 --> 00:02:54,680 Speaker 2: was writing a lot about return to office, and that 53 00:02:54,720 --> 00:02:56,560 Speaker 2: felt like I was screaming into the void as well. 54 00:02:56,560 --> 00:02:57,679 Speaker 3: People were saying, 55 00:02:57,919 --> 00:03:01,400 Speaker 2: from like July, oh no, even like May: we've 56 00:03:01,400 --> 00:03:02,800 Speaker 2: got to get back to the office. But no one 57 00:03:02,840 --> 00:03:05,320 Speaker 2: could say why. There were all these articles, Wall Street Journal, 58 00:03:05,520 --> 00:03:07,800 Speaker 2: New York Times, talking about the office is better, the 59 00:03:07,800 --> 00:03:10,720 Speaker 2: office is better, and it was always talking to executives, 60 00:03:10,720 --> 00:03:13,880 Speaker 2: and that really began to stoke a fire in me a bit. 61 00:03:13,960 --> 00:03:17,040 Speaker 2: I was like, ah, I don't like this, and people 62 00:03:17,080 --> 00:03:19,000 Speaker 2: were reading it. People were responding really well. But I 63 00:03:19,040 --> 00:03:21,760 Speaker 2: remember the first time it really took 64 00:03:21,800 --> 00:03:26,160 Speaker 2: off was with this thing called Clubhouse. Clubhouse was, I don't 65 00:03:26,160 --> 00:03:30,120 Speaker 2: know if you remember, so everyone in the tech industry 66 00:03:30,200 --> 00:03:33,280 Speaker 2: was saying Clubhouse is the future, Clubhouse is the future. 67 00:03:33,440 --> 00:03:36,800 Speaker 2: What Clubhouse was was basically a live podcast show. It's 68 00:03:36,840 --> 00:03:40,560 Speaker 2: Twitter Spaces, that's all it is: a live voice chat 69 00:03:40,640 --> 00:03:43,560 Speaker 2: thing where any number of speakers can speak at the 70 00:03:43,560 --> 00:03:46,960 Speaker 2: same time. And what happened was Andreessen Horowitz, the major venture 71 00:03:47,000 --> 00:03:50,280 Speaker 2: capital firm, and Marc Andreessen himself, were really trying to 72 00:03:50,280 --> 00:03:52,960 Speaker 2: push this through.
They were pushing every tech person, every 73 00:03:52,960 --> 00:03:55,880 Speaker 2: celebrity they knew to use this platform, and everyone was 74 00:03:55,920 --> 00:03:58,080 Speaker 2: talking about it and saying this is the future. And 75 00:03:58,360 --> 00:04:01,560 Speaker 2: I couldn't tell how, because last time I checked, radio 76 00:04:01,640 --> 00:04:05,120 Speaker 2: existed, and everyone was talking about this thing, saying it's 77 00:04:05,120 --> 00:04:07,280 Speaker 2: going to be worth billions, and it was allegedly a 78 00:04:07,320 --> 00:04:10,040 Speaker 2: four billion dollar offer. And I kept writing, just saying, 79 00:04:10,160 --> 00:04:12,600 Speaker 2: this is vaporware. This isn't the thing. There 80 00:04:12,960 --> 00:04:16,400 Speaker 2: was no revenue, there was no monetization strategy. There 81 00:04:16,440 --> 00:04:19,359 Speaker 2: really wasn't a point to it. And also, tech people 82 00:04:19,400 --> 00:04:24,400 Speaker 2: are frightfully boring. So hearing these ultra rich buffoons 83 00:04:23,920 --> 00:04:26,839 Speaker 3: going, oh, well, you know, the innovations are very important, 84 00:04:26,839 --> 00:04:28,120 Speaker 3: and they'd ramble on and on. 85 00:04:28,200 --> 00:04:31,600 Speaker 2: It was dull, but people were flocking to it. And 86 00:04:31,640 --> 00:04:35,240 Speaker 2: then one day they just stopped. They just stopped using it, 87 00:04:35,839 --> 00:04:38,080 Speaker 2: and I did a few like aha moments where I 88 00:04:38,160 --> 00:04:40,400 Speaker 2: was like, ah, gotcha, and then I got quoted in 89 00:04:40,480 --> 00:04:42,400 Speaker 2: like Wired, and I was like, oh, okay, this is cool. 90 00:04:42,400 --> 00:04:44,839 Speaker 2: But I kept writing because I enjoyed it. And also 91 00:04:45,120 --> 00:04:48,440 Speaker 2: I kind of see both sides. I see the business 92 00:04:48,480 --> 00:04:50,279 Speaker 2: side and I see the journalism side. And I was 93 00:04:50,279 --> 00:04:53,120 Speaker 2: a games journalist before this, and so I just kept 94 00:04:53,160 --> 00:04:54,720 Speaker 2: writing about the stuff and I kept calling it as 95 00:04:54,720 --> 00:04:57,240 Speaker 2: I see it, because it was an independent newsletter, and 96 00:04:57,440 --> 00:05:00,359 Speaker 2: it went from two hundred people to, like, 97 00:05:00,400 --> 00:05:03,520 Speaker 2: one thousand, to four thousand subscribers. It's twenty three thousand today, 98 00:05:04,160 --> 00:05:07,159 Speaker 2: and so it became a big thing. And I don't know, 99 00:05:07,279 --> 00:05:09,560 Speaker 2: I like writing a lot, and I can write very fast. 100 00:05:09,600 --> 00:05:11,680 Speaker 2: I used to do three or four a week. I 101 00:05:11,680 --> 00:05:13,919 Speaker 2: have no idea how I had that output. Now I 102 00:05:14,040 --> 00:05:15,720 Speaker 2: just do one a week when I can make it. 103 00:05:15,920 --> 00:05:19,080 Speaker 2: But I write in the way that I grew up 104 00:05:19,120 --> 00:05:21,760 Speaker 2: reading and seeing on television. In England, there was generally 105 00:05:21,800 --> 00:05:26,080 Speaker 2: a more critical approach to journalism, but there was more opinion, 106 00:05:26,120 --> 00:05:29,359 Speaker 2: and when the opinion ran, it wasn't really afraid of 107 00:05:29,440 --> 00:05:31,640 Speaker 2: leaving a few black eyes.
As long as you punched up, 108 00:05:31,680 --> 00:05:34,720 Speaker 2: you don't attack the people below you, you attack the executives, 109 00:05:34,760 --> 00:05:37,599 Speaker 2: the Sundar Pichais of the world, Google's, Alphabet's CEO, a 110 00:05:38,240 --> 00:05:40,520 Speaker 2: guy who makes two hundred and twenty million dollars a 111 00:05:40,600 --> 00:05:44,400 Speaker 2: year and lays off thousands of people and drives Google 112 00:05:44,440 --> 00:05:46,960 Speaker 2: Search into the ground. These are things that people see 113 00:05:47,080 --> 00:05:47,680 Speaker 2: and people 114 00:05:47,440 --> 00:05:48,200 Speaker 3: are angry about. 115 00:05:48,240 --> 00:05:50,599 Speaker 2: And I think I'm good at putting that into words, 116 00:05:50,920 --> 00:05:51,560 Speaker 2: for better 117 00:05:53,080 --> 00:05:53,560 Speaker 3: or worse there, 118 00:05:53,600 --> 00:05:57,440 Speaker 1: I guess, sure. It's such a valuable asset to have, 119 00:05:57,720 --> 00:06:01,719 Speaker 1: because to your point, in your various podcasts and hearing 120 00:06:01,760 --> 00:06:05,440 Speaker 1: you talk in meetings and things, you know, we've got 121 00:06:05,440 --> 00:06:09,120 Speaker 1: a lot of quote unquote tech journalism that's not really 122 00:06:09,200 --> 00:06:13,960 Speaker 1: journalism at all. It's essentially regurgitating press releases, which doesn't 123 00:06:14,000 --> 00:06:16,560 Speaker 1: have any value to it. You could just go and 124 00:06:16,600 --> 00:06:18,600 Speaker 1: read the press release if that's what you wanted to do. 125 00:06:18,960 --> 00:06:22,600 Speaker 1: And it's only a few outlets that go any further 126 00:06:22,680 --> 00:06:27,720 Speaker 1: than that, and I find them incredibly valuable. But unfortunately 127 00:06:27,760 --> 00:06:31,200 Speaker 1: those are like the exceptions rather than the rule. And 128 00:06:31,279 --> 00:06:35,240 Speaker 1: actually we could even talk about tech's effect on journalism 129 00:06:35,279 --> 00:06:37,599 Speaker 1: and how that has played a huge part in this, 130 00:06:37,800 --> 00:06:41,120 Speaker 1: and that it has devalued journalism to a large extent. There's 131 00:06:41,120 --> 00:06:44,200 Speaker 1: the infamous pivot to video moment we could talk about. Like, 132 00:06:44,240 --> 00:06:47,240 Speaker 1: there's so much that ends up being wrapped up in this. 133 00:06:47,480 --> 00:06:50,159 Speaker 1: It becomes a lot of inside baseball for folks who 134 00:06:50,240 --> 00:06:53,840 Speaker 1: work in the industry. But it's important to recognize, so 135 00:06:53,920 --> 00:06:57,000 Speaker 1: that when I do something like advocate for critical thinking, 136 00:06:57,440 --> 00:07:00,279 Speaker 1: people have those assets that they can go to that 137 00:07:00,440 --> 00:07:03,960 Speaker 1: help train them on this, and not just throwing them 138 00:07:03,960 --> 00:07:06,320 Speaker 1: into the deep end of the pool and saying think 139 00:07:06,360 --> 00:07:08,280 Speaker 1: critically and you'll be fine. 140 00:07:08,680 --> 00:07:11,840 Speaker 2: Yeah, what does thinking critically look like? Tech journalism as 141 00:07:11,840 --> 00:07:14,760 Speaker 2: a whole has got more critical. On the broadcast side, 142 00:07:15,360 --> 00:07:18,600 Speaker 2: a lot of tech podcasts are very right leaning right now.
143 00:07:18,640 --> 00:07:21,360 Speaker 2: The two most successful ones are the All In Podcast and 144 00:07:21,920 --> 00:07:25,160 Speaker 2: Lex Fridman's. And Lex Fridman is right wing, as 145 00:07:25,240 --> 00:07:28,520 Speaker 2: are the All In guys. David Sacks is a 146 00:07:28,560 --> 00:07:33,000 Speaker 2: pro Russia goon. Jason Calacanis isn't much better. These guys 147 00:07:33,120 --> 00:07:36,880 Speaker 2: are controlling the airwaves to some extent. But a lot 148 00:07:36,920 --> 00:07:39,280 Speaker 2: of that is because a lot of the tech industry 149 00:07:40,280 --> 00:07:43,840 Speaker 2: thinks that what media in tech should do is be positive. 150 00:07:43,920 --> 00:07:46,120 Speaker 2: It should be rah rah, and we should only criticize 151 00:07:46,120 --> 00:07:48,560 Speaker 2: a few things when we kind of feel like it, 152 00:07:48,920 --> 00:07:51,440 Speaker 2: versus what people want, which is an explanation as to 153 00:07:51,520 --> 00:07:54,560 Speaker 2: why things aren't working right. They want to understand the 154 00:07:54,640 --> 00:07:56,800 Speaker 2: decisions that people are making. Something you've done very well 155 00:07:56,920 --> 00:07:59,600 Speaker 2: is you've explained why things are the way they are, 156 00:07:59,760 --> 00:08:03,440 Speaker 2: how things have got to where they are, when Google was a search 157 00:08:03,440 --> 00:08:06,720 Speaker 2: company versus an ads company. People don't know these things. 158 00:08:07,000 --> 00:08:09,480 Speaker 2: There is a need for things like how your iPhone works. 159 00:08:09,560 --> 00:08:13,320 Speaker 2: Rich DeMuro at KTLA, excellent guy, great radio host as well. 160 00:08:13,520 --> 00:08:16,360 Speaker 2: He does very good, like, how to avoid scams stuff. 161 00:08:16,480 --> 00:08:19,280 Speaker 2: That kind of journalism is extremely valuable. Steve Noviello in 162 00:08:19,320 --> 00:08:23,160 Speaker 2: Houston on Fox, another great guy for this, that's very valuable. 163 00:08:23,160 --> 00:08:26,400 Speaker 2: But otherwise, tech journalism is growing into an industry. It's 164 00:08:26,400 --> 00:08:28,720 Speaker 2: growing into something that critiques things and says, here is 165 00:08:28,760 --> 00:08:30,800 Speaker 2: what the powerful people are doing. But I don't think 166 00:08:30,840 --> 00:08:34,200 Speaker 2: that there's enough just explaining why the world is the 167 00:08:34,200 --> 00:08:37,760 Speaker 2: way it is and showing the emotion, the anger you 168 00:08:37,800 --> 00:08:41,280 Speaker 2: need to, because this stuff, Google Search being worse, is horrifying. 169 00:08:41,280 --> 00:08:44,720 Speaker 2: It's something that hits billions of people. You should be angry. 170 00:08:44,760 --> 00:08:47,720 Speaker 2: You're angry. I'm angry. This should be front page news 171 00:08:47,760 --> 00:08:50,520 Speaker 2: everywhere, except Google put out a thing saying, oh, we're 172 00:08:50,520 --> 00:08:53,240 Speaker 2: gonna fix search, we're gonna make it better, and everyone's like, great, 173 00:08:53,280 --> 00:08:56,199 Speaker 2: Google's good now. They're not. They haven't done anything. 174 00:08:56,280 --> 00:08:58,000 Speaker 1: Well, and if it were front page news, you'd never 175 00:08:58,040 --> 00:08:59,520 Speaker 1: find it, because you'd put it in Google Search and 176 00:08:59,520 --> 00:09:00,960 Speaker 1: it wouldn't come up, exactly. 177 00:09:01,320 --> 00:09:03,720 Speaker 3: You get three different SEO spam sites.
178 00:09:03,880 --> 00:09:06,520 Speaker 1: Yeah, it's fascinating to me, because I can actually, I mean, 179 00:09:06,559 --> 00:09:09,199 Speaker 1: obviously this isn't a new thing, I can see patterns 180 00:09:09,240 --> 00:09:12,120 Speaker 1: of this kind of cycle, where if you start asking 181 00:09:12,200 --> 00:09:14,959 Speaker 1: questions and you start coming up with answers, you start 182 00:09:14,960 --> 00:09:19,120 Speaker 1: to see kind of the cynical background that's fueling things 183 00:09:19,160 --> 00:09:21,840 Speaker 1: within the tech industry. I'm reminded, when I first started 184 00:09:22,200 --> 00:09:25,440 Speaker 1: getting into tech podcasting, it was in two thousand and eight. 185 00:09:25,559 --> 00:09:28,320 Speaker 1: Two thousand and eight was around the time where all 186 00:09:28,400 --> 00:09:31,560 Speaker 1: the different television manufacturing companies were all pushing three D 187 00:09:31,920 --> 00:09:35,760 Speaker 1: televisions, and you start to ask the question, why, why 188 00:09:35,760 --> 00:09:37,920 Speaker 1: are you pushing three D TV? And if you start 189 00:09:37,960 --> 00:09:41,440 Speaker 1: coming up with answers, well, one, you've got the various 190 00:09:41,480 --> 00:09:44,480 Speaker 1: studios out there that are really keen on the idea 191 00:09:44,600 --> 00:09:47,280 Speaker 1: of creating a format that is very difficult to pirate, 192 00:09:48,080 --> 00:09:50,960 Speaker 1: and so therefore they have an incentive to push the 193 00:09:51,000 --> 00:09:53,360 Speaker 1: three D technology. And then you've got the manufacturers, who 194 00:09:53,440 --> 00:09:56,320 Speaker 1: need to come up with ways to differentiate this 195 00:09:56,440 --> 00:09:59,559 Speaker 1: year's models from last year's models. It's not just good 196 00:09:59,640 --> 00:10:02,120 Speaker 1: enough to come out with another TV that does what last 197 00:10:02,200 --> 00:10:06,280 Speaker 1: year's TV does, and you can't just constantly push resolution 198 00:10:06,480 --> 00:10:09,000 Speaker 1: year over year. So there were all these very cynical 199 00:10:09,040 --> 00:10:12,640 Speaker 1: reasons for it, and ultimately the response from consumers was, 200 00:10:12,679 --> 00:10:13,960 Speaker 1: I just don't want this. 201 00:10:14,679 --> 00:10:17,280 Speaker 2: But that's a fascinating one, though, because I remember that. 202 00:10:17,280 --> 00:10:18,679 Speaker 2: Two thousand and eight was actually when I moved to 203 00:10:18,760 --> 00:10:21,920 Speaker 2: America and I moved into the tech industry. And the 204 00:10:22,040 --> 00:10:25,000 Speaker 2: other thing that you're completely right about is, you'll notice 205 00:10:25,040 --> 00:10:28,760 Speaker 2: you didn't mention the filmmakers. The filmmakers were just like, 206 00:10:28,840 --> 00:10:31,840 Speaker 2: I don't want, I don't want to do this. This 207 00:10:31,880 --> 00:10:34,280 Speaker 2: doesn't sound like it will make the movie better, unless 208 00:10:34,280 --> 00:10:37,120 Speaker 2: I'm making Shrek. Shrek was one of the few, and the 209 00:10:37,160 --> 00:10:40,120 Speaker 2: Minions movies, I think, much later, obviously. Yeah, but that 210 00:10:40,280 --> 00:10:42,400 Speaker 2: was really one of those times where, I don't think 211 00:10:42,400 --> 00:10:44,120 Speaker 2: I saw it at the time quite as clearly as 212 00:10:44,160 --> 00:10:46,440 Speaker 2: you did, but you're right, that was one of those times.
213 00:10:46,440 --> 00:10:49,160 Speaker 2: So it's like, this is something being done to consumers 214 00:10:49,440 --> 00:10:50,400 Speaker 2: rather than for them. 215 00:10:50,640 --> 00:10:53,000 Speaker 1: Yeah, they were trying to pull a Steve Jobs, who 216 00:10:53,080 --> 00:10:56,160 Speaker 1: was obviously brilliant at convincing people they needed something that 217 00:10:56,160 --> 00:10:59,439 Speaker 1: they probably didn't. Evil guy though, listen, we're not going to talk 218 00:10:59,440 --> 00:11:02,600 Speaker 1: about getting fired on this podcast, because that was a thing 219 00:11:02,640 --> 00:11:05,959 Speaker 1: at Apple, I know. But yeah, we probably all know people 220 00:11:05,960 --> 00:11:08,080 Speaker 1: who had jobs there at one point, back when you 221 00:11:08,120 --> 00:11:10,240 Speaker 1: would walk in and Steve Jobs would fire you on 222 00:11:10,280 --> 00:11:14,200 Speaker 1: the spot for some minor thing. Anyway, what he was 223 00:11:14,240 --> 00:11:17,400 Speaker 1: really good at was convincing people that they wanted a product, 224 00:11:17,440 --> 00:11:19,839 Speaker 1: or that a product that he had on offer was 225 00:11:19,880 --> 00:11:21,559 Speaker 1: going to solve a problem they didn't even know they 226 00:11:21,600 --> 00:11:24,960 Speaker 1: had before. And that's partly why he was able to create 227 00:11:25,160 --> 00:11:27,240 Speaker 1: entirely new markets. I don't want to discount all the 228 00:11:27,280 --> 00:11:29,679 Speaker 1: other people who also put in a huge amount of work, 229 00:11:29,960 --> 00:11:31,840 Speaker 1: you know, Jony Ive and all that kind of stuff. 230 00:11:32,040 --> 00:11:34,840 Speaker 1: But the industry was trying to copy what he was 231 00:11:34,880 --> 00:11:37,520 Speaker 1: able to do, and they just weren't able to achieve it. 232 00:11:37,559 --> 00:11:40,680 Speaker 1: They were not able to convince the market that three 233 00:11:40,720 --> 00:11:44,000 Speaker 1: D television was really a valuable thing. There was the 234 00:11:44,160 --> 00:11:47,000 Speaker 1: issue that there wasn't that much content in the beginning 235 00:11:47,120 --> 00:11:48,480 Speaker 1: to really grab people. 236 00:11:48,760 --> 00:11:51,680 Speaker 2: They still aren't. Apple, with the Vision Pro, is still 237 00:11:51,679 --> 00:11:53,679 Speaker 2: trying to convince you to get three D content. 238 00:11:54,600 --> 00:11:56,880 Speaker 1: Yeah, well, I mean, when you look at some of 239 00:11:56,880 --> 00:11:58,520 Speaker 1: the ones that came out, like, there were a few 240 00:11:58,679 --> 00:12:02,960 Speaker 1: obvious standouts that showed spectacle and how spectacle could be effective. 241 00:12:02,960 --> 00:12:05,959 Speaker 1: I mean, Avatar, it's impossible to argue, like, it 242 00:12:06,360 --> 00:12:09,000 Speaker 1: was the top grossing film of all time, and in 243 00:12:09,040 --> 00:12:12,640 Speaker 1: part that was because of the spectacle of this three dimensional 244 00:12:12,960 --> 00:12:17,079 Speaker 1: filmmaking approach.
But then you had things like the Hobbit series, 245 00:12:17,120 --> 00:12:21,840 Speaker 1: which certainly did not capture imaginations the same way Avatar did, 246 00:12:21,880 --> 00:12:24,560 Speaker 1: despite the fact that it was three D and forty 247 00:12:24,600 --> 00:12:27,320 Speaker 1: eight frames per second filmmaking, pushing the envelope, saying 248 00:12:27,360 --> 00:12:29,000 Speaker 1: this is the future of film. And people said, yeah, 249 00:12:29,040 --> 00:12:32,000 Speaker 1: can we make it not that? Because I don't like 250 00:12:32,600 --> 00:12:35,120 Speaker 1: the Hobbits looking like they're acting in a Mexican soap opera. 251 00:12:35,200 --> 00:12:36,560 Speaker 1: It's just not appealing to me. 252 00:12:36,760 --> 00:12:40,040 Speaker 2: But also, just, no one wanted it. No one was 253 00:12:40,080 --> 00:12:42,160 Speaker 2: sitting there being like, I wish this movie was more 254 00:12:42,160 --> 00:12:42,920 Speaker 2: in front of me. 255 00:12:43,120 --> 00:12:45,320 Speaker 1: Right, or that it was three times longer than it 256 00:12:45,320 --> 00:12:49,080 Speaker 1: needed to be. Yeah, oh god, yeah. 257 00:12:48,720 --> 00:12:49,200 Speaker 3: We won't go there. 258 00:12:49,559 --> 00:12:51,280 Speaker 1: So, I also have a Lord of the Rings tattoo, 259 00:12:51,360 --> 00:12:54,160 Speaker 1: so I'm not even gonna go into my disdain for 260 00:12:54,240 --> 00:12:57,120 Speaker 1: the Hobbit films. But that, to me, is an example 261 00:12:57,400 --> 00:12:59,400 Speaker 1: of sort of the things that are going on behind 262 00:12:59,400 --> 00:13:01,840 Speaker 1: the scenes that you may not necessarily be aware of 263 00:13:01,880 --> 00:13:04,160 Speaker 1: as a consumer, or just, you know, part of the 264 00:13:04,160 --> 00:13:07,640 Speaker 1: mainstream public. And that's why I find podcasts like Better 265 00:13:07,679 --> 00:13:10,480 Speaker 1: Offline really valuable, because what you're doing is you're asking 266 00:13:10,520 --> 00:13:13,320 Speaker 1: the questions and peeling back the layers and saying, not 267 00:13:13,559 --> 00:13:17,720 Speaker 1: only does this thing not make sense, but you need 268 00:13:17,720 --> 00:13:21,200 Speaker 1: to understand the motivations behind why it's a thing in 269 00:13:21,240 --> 00:13:24,280 Speaker 1: the first place, and why you should question it and 270 00:13:24,800 --> 00:13:27,600 Speaker 1: why you might feel uneasy toward it. Like, again, 271 00:13:27,640 --> 00:13:30,320 Speaker 1: I think of NFTs as being a great example of 272 00:13:31,240 --> 00:13:34,720 Speaker 1: the thing where everyone who was already invested in the 273 00:13:34,760 --> 00:13:38,600 Speaker 1: crypto world was desperate to convince you this is a 274 00:13:38,679 --> 00:13:41,439 Speaker 1: valuable thing, enough so that companies were jumping in without 275 00:13:41,440 --> 00:13:44,160 Speaker 1: really understanding what NFTs were. I think most 276 00:13:44,160 --> 00:13:47,960 Speaker 1: companies still don't understand what NFTs are, and everyone was 277 00:13:47,960 --> 00:13:50,280 Speaker 1: getting that feeling of, am I actually missing out?
Is 278 00:13:50,280 --> 00:13:53,200 Speaker 1: this going to be a case of not investing early 279 00:13:53,240 --> 00:13:55,520 Speaker 1: on and I'm going to miss out on a huge opportunity? 280 00:13:55,720 --> 00:13:58,439 Speaker 1: And of course, we saw the bottom drop out of 281 00:13:58,520 --> 00:14:01,360 Speaker 1: NFTs at the end of that era, and it hasn't 282 00:14:01,360 --> 00:14:06,040 Speaker 1: recovered since, which suggests that the people who were skeptical 283 00:14:06,320 --> 00:14:10,000 Speaker 1: and expressing caution were right all along. That, to me, 284 00:14:10,120 --> 00:14:12,440 Speaker 1: is where the value for a show like Better Offline 285 00:14:12,480 --> 00:14:15,120 Speaker 1: really comes in, because it teaches you to ask those 286 00:14:15,200 --> 00:14:17,760 Speaker 1: questions yourself, even if you haven't done an episode about 287 00:14:17,760 --> 00:14:19,040 Speaker 1: that specific topic yet. 288 00:14:19,240 --> 00:14:21,800 Speaker 2: And a lot of it is as well that people 289 00:14:21,800 --> 00:14:23,800 Speaker 2: are a lot smarter than they think they are with 290 00:14:23,880 --> 00:14:26,160 Speaker 2: this stuff. And I think the tech industry has done, 291 00:14:26,680 --> 00:14:29,480 Speaker 2: and the marketers especially have done, a really good job 292 00:14:29,800 --> 00:14:34,160 Speaker 2: making people feel that they won't understand, that there are algorithms 293 00:14:34,200 --> 00:14:36,400 Speaker 2: and powers at play that you just simply could not 294 00:14:36,440 --> 00:14:39,400 Speaker 2: understand. You don't understand NFTs, and that's why you 295 00:14:39,440 --> 00:14:42,440 Speaker 2: will have fun staying poor. Or crypto, the same deal, 296 00:14:42,680 --> 00:14:45,840 Speaker 2: when ultimately with crypto they claimed to be making decentralized 297 00:14:45,840 --> 00:14:49,080 Speaker 2: software and they weren't. The software does not work. None 298 00:14:49,120 --> 00:14:53,520 Speaker 2: of it is good. Axie Infinity is allegedly like a Pokemon clone. 299 00:14:53,520 --> 00:14:56,400 Speaker 2: It costs like one thousand dollars just to start, and 300 00:14:56,560 --> 00:14:59,280 Speaker 2: by the way, it's a terrible game. It's not good. 301 00:15:00,080 --> 00:15:02,880 Speaker 2: It made a lot of guys rich, though. And that's the thing. 302 00:15:03,000 --> 00:15:07,280 Speaker 2: When people understand the incentives and the actual basic things, 303 00:15:07,520 --> 00:15:10,240 Speaker 2: I think a lot of people can understand tech way more. 304 00:15:10,280 --> 00:15:11,800 Speaker 2: And you know this, you've been doing this podcast for 305 00:15:11,840 --> 00:15:14,560 Speaker 2: a long time. People are way more aware of this stuff. 306 00:15:15,040 --> 00:15:19,240 Speaker 2: But there is almost a deliberate obfuscation at times, and 307 00:15:19,280 --> 00:15:21,920 Speaker 2: I find it reprehensible. I find it very annoying. And 308 00:15:21,960 --> 00:15:24,760 Speaker 2: the reason the metaverse episode got me so angry was 309 00:15:25,120 --> 00:15:28,880 Speaker 2: I remember everyone trying to explain to me, hey Ed, 310 00:15:29,120 --> 00:15:31,240 Speaker 2: virtual worlds are the future, and I looked them dead 311 00:15:31,280 --> 00:15:33,360 Speaker 2: in the eye and said, I was literally like the 312 00:15:33,400 --> 00:15:38,240 Speaker 2: second guy in England who wrote about massively multiplayer online RPGs.
313 00:15:38,680 --> 00:15:40,800 Speaker 2: I know these things, and I've known them for a 314 00:15:40,840 --> 00:15:43,840 Speaker 2: long time. I played Meridian fifty nine, and I know this. 315 00:15:44,280 --> 00:15:46,920 Speaker 2: I played Ultima Online. I know this stuff. What you 316 00:15:46,960 --> 00:15:48,840 Speaker 2: were describing is the stuff that I was playing on 317 00:15:48,880 --> 00:15:52,560 Speaker 2: a PCMCIA card. So please tell me how this is different. 318 00:15:52,680 --> 00:15:55,080 Speaker 2: And they couldn't. It was a big lie, and I 319 00:15:55,120 --> 00:15:57,840 Speaker 2: think everyone saw it at the time, but the media 320 00:15:57,880 --> 00:16:00,160 Speaker 2: fell for it. There were so many, there were Wall 321 00:16:00,200 --> 00:16:03,160 Speaker 2: Street Journal, New York Times, Time Magazine, all of these 322 00:16:03,160 --> 00:16:06,440 Speaker 2: places saying the metaverse is here, the metaverse is here, everyone, 323 00:16:06,440 --> 00:16:07,400 Speaker 2: and it wasn't. 324 00:16:07,400 --> 00:16:10,240 Speaker 1: Yeah, and it was reminding me, if you go back 325 00:16:10,240 --> 00:16:13,880 Speaker 1: a decade, of discussions of the semantic web. The semantic web. 326 00:16:13,920 --> 00:16:16,680 Speaker 2: Oh god, I haven't heard about the semantic web in forever. 327 00:16:16,840 --> 00:16:19,680 Speaker 2: I know. But it's essentially, I mean, it has a 328 00:16:19,720 --> 00:16:22,760 Speaker 2: lot of the same groundwork as the metaverse does. 329 00:16:22,880 --> 00:16:26,160 Speaker 1: Right, it doesn't necessarily incorporate a virtual world, but then 330 00:16:26,280 --> 00:16:29,680 Speaker 1: arguably neither does the metaverse, because the word doesn't mean anything. 331 00:16:30,040 --> 00:16:33,200 Speaker 1: So, like, the metaverse could involve a virtual world, or 332 00:16:33,240 --> 00:16:35,960 Speaker 1: you could be talking about something entirely different. It all 333 00:16:35,960 --> 00:16:36,960 Speaker 1: depends upon how... 334 00:16:36,840 --> 00:16:40,200 Speaker 2: I saw Microsoft Teams referring to itself as a metaverse. 335 00:16:40,280 --> 00:16:43,400 Speaker 3: I, when I saw that, I was like, nope, that's enough. It's 336 00:16:43,240 --> 00:16:47,440 Speaker 1: not just frustrating, it's destructive. When you see enterprises jump 337 00:16:47,560 --> 00:16:51,160 Speaker 1: in on these trending ideas that are not fully baked, 338 00:16:51,880 --> 00:16:58,080 Speaker 1: and convincing people, whether it's employees or partners or investors, 339 00:16:58,200 --> 00:17:01,520 Speaker 1: that this is the future, when clearly no one has 340 00:17:01,640 --> 00:17:04,919 Speaker 1: a really firm grip on what they're even talking about, 341 00:17:05,400 --> 00:17:08,560 Speaker 1: and it's such a massive waste of resources. And when 342 00:17:08,560 --> 00:17:10,600 Speaker 1: you sit there, when you see tech working well, 343 00:17:10,800 --> 00:17:13,600 Speaker 1: when tech is doing its job and actually making things 344 00:17:13,800 --> 00:17:16,679 Speaker 1: easier or better, and you think, what 345 00:17:16,800 --> 00:17:19,439 Speaker 1: if we took all that wasted time and effort and 346 00:17:19,480 --> 00:17:23,480 Speaker 1: we had actually dedicated it to enterprises that could benefit people, 347 00:17:23,560 --> 00:17:26,080 Speaker 1: where would we be now? Like, those are the questions 348 00:17:26,080 --> 00:17:29,800 Speaker 1: that get me angry. I still rant about it, and.
349 00:17:29,760 --> 00:17:31,880 Speaker 2: A lot of it comes down to something I've called 350 00:17:31,880 --> 00:17:35,960 Speaker 2: the rot economy, where we are in a world where 351 00:17:36,400 --> 00:17:39,280 Speaker 2: companies are evaluated in the public markets, and at times 352 00:17:39,359 --> 00:17:42,919 Speaker 2: the private markets, based on growth. Not based on what 353 00:17:43,000 --> 00:17:45,920 Speaker 2: they produce, whether they have happy customers, whether they are 354 00:17:45,920 --> 00:17:48,400 Speaker 2: making a sustainable income, whether they could do the same 355 00:17:48,440 --> 00:17:50,320 Speaker 2: thing for another three years and be fine. No, 356 00:17:50,520 --> 00:17:53,320 Speaker 2: you must grow ten, twenty percent a year, which is 357 00:17:53,359 --> 00:17:57,160 Speaker 2: why Microsoft jumped on the metaverse wagon. Disney, same deal, 358 00:17:57,480 --> 00:17:59,920 Speaker 2: and then dropped it the moment things looked bad. And hey, 359 00:18:00,040 --> 00:18:03,960 Speaker 2: you know who also dropped the metaverse? Meta. They also 360 00:18:04,000 --> 00:18:05,800 Speaker 2: moved onward. They went from, in twenty twenty two, 361 00:18:05,880 --> 00:18:08,159 Speaker 2: talking about the metaverse being the most important thing, to 362 00:18:08,600 --> 00:18:11,960 Speaker 2: twenty twenty three being the year of efficiency, and 363 00:18:12,160 --> 00:18:15,760 Speaker 2: AI being where Boz and Zuck were now focusing themselves. 364 00:18:16,000 --> 00:18:18,159 Speaker 2: And he'll still claim that the metaverse is around, but 365 00:18:18,200 --> 00:18:21,520 Speaker 2: that is just the symptom of an economy that 366 00:18:21,600 --> 00:18:25,840 Speaker 2: doesn't build technology for people, but builds technology to sell 367 00:18:26,040 --> 00:18:29,040 Speaker 2: to investors. And that's where we're going right now. And 368 00:18:29,040 --> 00:18:31,520 Speaker 2: that's the scary thing. Tech can be great, and a 369 00:18:31,520 --> 00:18:33,320 Speaker 2: big thing on Better Offline is I'm trying 370 00:18:33,960 --> 00:18:37,919 Speaker 2: not to be just endlessly cynical. I'm angry, and 371 00:18:37,960 --> 00:18:41,679 Speaker 2: you should be angry. But I don't hate tech. I 372 00:18:41,760 --> 00:18:45,880 Speaker 2: hate some people in tech. I find Mark Zuckerberg truly disgusting. 373 00:18:45,960 --> 00:18:49,480 Speaker 2: Same with Steve Jobs, same with Elon Musk. All of 374 00:18:49,880 --> 00:18:52,120 Speaker 2: these people are not working for people. They are working 375 00:18:52,119 --> 00:18:55,119 Speaker 2: for investors and themselves. And Steve Jobs was an evil 376 00:18:55,160 --> 00:18:57,120 Speaker 2: cretin, listen to Behind the Bastards if you want 377 00:18:57,119 --> 00:18:59,440 Speaker 2: to know more. But at the very least he made 378 00:18:59,480 --> 00:19:02,560 Speaker 2: stuff for people. He had an inherent sense of, like, 379 00:19:02,760 --> 00:19:06,720 Speaker 2: this feels good in the hand, this UI, UX situation is satisfying, 380 00:19:07,040 --> 00:19:08,840 Speaker 2: they can now do this, and that will make me 381 00:19:08,960 --> 00:19:11,800 Speaker 2: lots of money. He was good at that. Tim Cook, less 382 00:19:11,840 --> 00:19:13,919 Speaker 2: so, but he's better than a lot of 383 00:19:13,920 --> 00:19:16,520 Speaker 2: the rest of them.
But Sundar Pichai, he doesn't give 384 00:19:16,560 --> 00:19:19,080 Speaker 2: a rat's ass about any of this. He wants to 385 00:19:19,080 --> 00:19:21,080 Speaker 2: see Google grow. It was two hundred and twenty three 386 00:19:21,119 --> 00:19:24,199 Speaker 2: million in twenty twenty two, two hundred and eighty in 387 00:19:24,240 --> 00:19:27,439 Speaker 2: twenty twenty, a year off of the pandemic, millions of 388 00:19:27,440 --> 00:19:29,800 Speaker 2: people dying, and he becomes one of the richest guys. 389 00:19:29,960 --> 00:19:33,240 Speaker 2: Really, really sad, especially because Google Search has got so bad. 390 00:19:34,920 --> 00:19:36,680 Speaker 2: And I think the big difference is it no longer 391 00:19:36,680 --> 00:19:39,520 Speaker 2: feels like a fair trade. We're no longer in a situation 392 00:19:39,640 --> 00:19:42,199 Speaker 2: where they're making cool stuff and we're okay with them 393 00:19:42,240 --> 00:19:45,240 Speaker 2: getting rich doing so, because they're making bad stuff and 394 00:19:45,320 --> 00:19:46,320 Speaker 2: getting richer. 395 00:19:46,440 --> 00:19:49,679 Speaker 1: And getting richer. Ed and I will be back to 396 00:19:49,840 --> 00:19:52,200 Speaker 1: rant a lot more, but first, let's take a quick 397 00:19:52,200 --> 00:20:04,000 Speaker 1: break to thank our sponsors. Google in particular is a 398 00:20:04,000 --> 00:20:07,960 Speaker 1: tricky topic, simply because even in its good years, it 399 00:20:08,040 --> 00:20:11,520 Speaker 1: was a company that seemed like it would throw anything 400 00:20:11,560 --> 00:20:13,800 Speaker 1: at the wall and hope that things stick. And even 401 00:20:13,840 --> 00:20:17,200 Speaker 1: if things stuck, you weren't guaranteed to have that product 402 00:20:17,800 --> 00:20:19,960 Speaker 1: last for more than a few years before Google would 403 00:20:19,960 --> 00:20:23,280 Speaker 1: either discontinue it or, you know, cannibalize it for parts 404 00:20:23,280 --> 00:20:26,280 Speaker 1: and put it into other existing Google features. Like, I 405 00:20:26,359 --> 00:20:30,880 Speaker 1: talked recently about Google Wave, a product that was, oh, 406 00:20:30,920 --> 00:20:33,520 Speaker 1: Google Wave was good for maybe two people, and it was 407 00:20:33,600 --> 00:20:35,600 Speaker 1: me and my co-host, because... 408 00:20:35,400 --> 00:20:37,640 Speaker 3: Because we were so cool. But it was useless. 409 00:20:37,720 --> 00:20:39,719 Speaker 1: It was, it was useless, except we would use it 410 00:20:39,760 --> 00:20:42,679 Speaker 1: to literally build a run of show for a live streaming 411 00:20:42,720 --> 00:20:44,919 Speaker 1: show we did once a week, and for that it 412 00:20:45,000 --> 00:20:47,320 Speaker 1: was perfect. And I said, I literally can't think of 413 00:20:47,359 --> 00:20:49,960 Speaker 1: another use case for this particular tool. 414 00:20:50,160 --> 00:20:50,480 Speaker 3: The thing is... 415 00:20:50,600 --> 00:20:54,920 Speaker 2: Imagine if they'd have built Discord instead, which is basically mIRC, 416 00:20:55,240 --> 00:20:58,440 Speaker 2: which is a twenty, thirty year old piece of free software. 417 00:20:59,640 --> 00:21:03,280 Speaker 1: Yeah. Yeah. So even in Google's, you know, heyday, when 418 00:21:03,320 --> 00:21:06,320 Speaker 1: the Don't Be Evil was still at least nodded to, 419 00:21:06,640 --> 00:21:09,359 Speaker 1: it was a tricky place, but at least you felt 420 00:21:09,400 --> 00:21:11,920 Speaker 1: like they were sincere.
I felt that it was a 421 00:21:11,960 --> 00:21:15,480 Speaker 1: lot of engineers building tools for other engineers, not necessarily 422 00:21:15,520 --> 00:21:17,679 Speaker 1: for the general public. If you weren't an engineer, it 423 00:21:17,720 --> 00:21:20,680 Speaker 1: wasn't nearly user friendly enough. But I felt like 424 00:21:20,720 --> 00:21:24,000 Speaker 1: they believed in what they did. I don't necessarily feel 425 00:21:24,080 --> 00:21:27,400 Speaker 1: that today, largely for the same reasons you've mentioned. 426 00:21:27,640 --> 00:21:30,920 Speaker 2: Well, there's a big thing that happened in twenty nineteen, 427 00:21:31,000 --> 00:21:34,600 Speaker 2: twenty twenty. Ben Gomes, who was the head of Google 428 00:21:34,640 --> 00:21:37,360 Speaker 2: Search at the time. Actually, 429 00:21:37,359 --> 00:21:39,640 Speaker 2: there was an event at Google called a Code Yellow, where 430 00:21:39,680 --> 00:21:41,920 Speaker 2: basically they realized that the search had a problem, and 431 00:21:42,040 --> 00:21:45,639 Speaker 2: he put out an alert warning saying the Google Ads 432 00:21:45,920 --> 00:21:48,359 Speaker 2: side is getting too close to search, they keep trying 433 00:21:48,359 --> 00:21:52,240 Speaker 2: to mess with it. Twenty twenty, Prabhakar Raghavan takes over. 434 00:21:52,760 --> 00:21:57,520 Speaker 2: His former job? Head of Ads at Google. That's what happened, folks. 435 00:21:57,680 --> 00:22:01,000 Speaker 1: It's not a bug, it's a feature. Yeah, it's 436 00:22:01,040 --> 00:22:02,040 Speaker 1: not a problem, it's a goal. 437 00:22:02,320 --> 00:22:04,320 Speaker 2: And what was funny was, at the time Prabhakar 438 00:22:04,440 --> 00:22:06,480 Speaker 2: was like, oh, yeah, yeah, Gomes is doing a great job 439 00:22:06,520 --> 00:22:06,960 Speaker 2: with search. 440 00:22:07,040 --> 00:22:08,879 Speaker 3: Great job, man. 441 00:22:09,240 --> 00:22:13,239 Speaker 2: Now Google Search sucks so bad. Yeah. And 442 00:22:13,280 --> 00:22:15,240 Speaker 2: that's the thing. That's the emails I get about the 443 00:22:15,280 --> 00:22:17,440 Speaker 2: Rot Economy episode, where it's like, thank you for saying 444 00:22:17,480 --> 00:22:20,399 Speaker 2: that, I've been confused this whole time. These things feel 445 00:22:20,520 --> 00:22:22,840 Speaker 2: very obvious to those of us who are extremely online, 446 00:22:23,160 --> 00:22:25,800 Speaker 2: who are doused in this every day. But I don't 447 00:22:25,840 --> 00:22:29,119 Speaker 2: think most people realize, because there's so much happening, and 448 00:22:29,240 --> 00:22:32,080 Speaker 2: to your point earlier, Google News is not particularly useful. 449 00:22:32,680 --> 00:22:36,639 Speaker 2: News is not something that's very well spread these days. 450 00:22:36,680 --> 00:22:40,000 Speaker 1: Sure, yeah. I mean, well, I'm not gonna open up 451 00:22:40,040 --> 00:22:41,639 Speaker 1: the can of worms, because I totally could. But this 452 00:22:41,760 --> 00:22:43,920 Speaker 1: is, it's making me want to start talking about things 453 00:22:44,000 --> 00:22:47,440 Speaker 1: like recommendation algorithms and the evils they present. But we've 454 00:22:47,480 --> 00:22:49,679 Speaker 1: done, I've done tons of episodes, I've done a 455 00:22:49,680 --> 00:22:52,280 Speaker 1: lot on that.
Yeah, I don't need to retread that ground, 456 00:22:52,359 --> 00:22:54,960 Speaker 1: but there are big problems that I think need to 457 00:22:54,960 --> 00:22:57,680 Speaker 1: be recognized and addressed. I don't know what the solutions 458 00:22:57,720 --> 00:23:00,640 Speaker 1: are, because when I look at the issue of, you're 459 00:23:00,680 --> 00:23:04,160 Speaker 1: running a company specifically to keep shareholders happy, right, rather 460 00:23:04,200 --> 00:23:07,919 Speaker 1: than to see any other corporate mission through, that, to me, 461 00:23:08,080 --> 00:23:11,160 Speaker 1: has its roots deep in, again, the nineteen eighties, 462 00:23:11,200 --> 00:23:13,119 Speaker 1: when I grew up, and that's where that 463 00:23:13,200 --> 00:23:16,320 Speaker 1: kind of thing really started to take hold, thanks to, like, 464 00:23:16,359 --> 00:23:20,280 Speaker 1: people like Jack Welch and stuff in areas outside of tech. 465 00:23:20,560 --> 00:23:22,879 Speaker 1: But the tech industry seems to have really taken that 466 00:23:22,960 --> 00:23:23,879 Speaker 1: ball and run with it. 467 00:23:23,960 --> 00:23:27,560 Speaker 2: They're rewarded for it by the markets, of course. It's 468 00:23:27,720 --> 00:23:28,480 Speaker 2: very profitable. 469 00:23:28,600 --> 00:23:31,280 Speaker 1: It's what makes me have to ask questions too, that 470 00:23:32,760 --> 00:23:35,600 Speaker 1: even when I'm hearing things that on the surface sound 471 00:23:36,400 --> 00:23:40,359 Speaker 1: beneficial or benign, I start asking questions and then it 472 00:23:40,480 --> 00:23:43,600 Speaker 1: turns out there's a darker side to it. So, for example, 473 00:23:43,880 --> 00:23:48,960 Speaker 1: OpenAI and the push for regulations in AI. Like, 474 00:23:49,040 --> 00:23:51,240 Speaker 1: that sounds great, you know, the idea of, oh, we 475 00:23:51,359 --> 00:23:56,160 Speaker 1: have a dominant company in generative AI that's advocating for 476 00:23:56,280 --> 00:24:01,480 Speaker 1: the purposes of creating regulations, rules for AI. That sounds 477 00:24:01,480 --> 00:24:04,320 Speaker 1: like that's good. Like, that seems like that's within the 478 00:24:04,440 --> 00:24:08,359 Speaker 1: actual mission statement of the original OpenAI, which was 479 00:24:08,680 --> 00:24:15,200 Speaker 1: presumably created to make responsible artificial intelligence implementations and avoid 480 00:24:15,280 --> 00:24:17,040 Speaker 1: all the pitfalls that we worry about. 481 00:24:17,080 --> 00:24:20,359 Speaker 2: Whether it's, you know... except you were basically letting a 482 00:24:20,440 --> 00:24:22,840 Speaker 2: murderer at that point dictate his own sentence. 483 00:24:23,000 --> 00:24:26,080 Speaker 1: Yes, you're letting the person who would most benefit 484 00:24:26,119 --> 00:24:29,760 Speaker 1: from making the rules make the rules, right, which could 485 00:24:29,880 --> 00:24:32,199 Speaker 1: very well mean that, oh, let's make rules so that 486 00:24:32,400 --> 00:24:36,120 Speaker 1: it suppresses anyone else who's getting into the AI game, 487 00:24:36,200 --> 00:24:38,840 Speaker 1: giving us an even greater advantage in the market than 488 00:24:38,840 --> 00:24:43,240 Speaker 1: we already have, while impacting us the bare minimum amount. 489 00:24:43,560 --> 00:24:46,560 Speaker 1: And it starts sounding like you're getting into fringe theories, 490 00:24:46,560 --> 00:24:49,800 Speaker 1: but that's really what's going on.
It's really, it's really 491 00:24:49,960 --> 00:24:52,119 Speaker 1: the truth. You're talking about people who have 492 00:24:52,200 --> 00:24:54,360 Speaker 1: such a huge stake in the game. And I think 493 00:24:54,359 --> 00:24:57,000 Speaker 1: that kind of applies to pretty much everything we've talked about. 494 00:24:57,040 --> 00:24:59,840 Speaker 1: They have such a huge stake, that's the full 495 00:25:00,080 --> 00:25:02,240 Speaker 1: incentive for them to do this. It may sound on 496 00:25:02,280 --> 00:25:05,920 Speaker 1: the surface to be like this very kind of gracious approach, 497 00:25:06,400 --> 00:25:08,679 Speaker 1: but in reality it's very self serving. 498 00:25:09,119 --> 00:25:12,240 Speaker 2: Well, it's kind of like the entire OpenAI nonprofit 499 00:25:12,320 --> 00:25:16,199 Speaker 2: side that, when Sam Altman was ousted, was proven to 500 00:25:16,200 --> 00:25:19,639 Speaker 2: be completely ineffectual. People should be really angry about that. 501 00:25:19,800 --> 00:25:22,439 Speaker 2: What happened there was Sam Altman was kicked out for 502 00:25:22,520 --> 00:25:23,800 Speaker 2: reasons we still do not know. 503 00:25:24,160 --> 00:25:25,159 Speaker 3: We really do not know. 504 00:25:25,240 --> 00:25:27,800 Speaker 2: That's never come out. However, what happened was pressure from 505 00:25:27,880 --> 00:25:32,280 Speaker 2: venture capitalists and Microsoft, a three trillion dollar tech firm, 506 00:25:32,760 --> 00:25:35,000 Speaker 2: was used to put him back in. And the pressure worked. 507 00:25:35,280 --> 00:25:37,800 Speaker 2: That is one of the darker things I've 508 00:25:37,800 --> 00:25:40,640 Speaker 2: seen happen in the tech industry. And I saw journalists 509 00:25:40,680 --> 00:25:43,960 Speaker 2: saying this is beautiful. That scared me. That scared me 510 00:25:44,040 --> 00:25:48,240 Speaker 2: more than anything I've seen in this industry. 511 00:25:48,080 --> 00:25:49,800 Speaker 1: And the other people who were on the board were 512 00:25:49,840 --> 00:25:51,960 Speaker 1: members of the co-founding group that came up with 513 00:25:52,080 --> 00:25:54,000 Speaker 1: the idea of OpenAI in the first place. 514 00:25:53,760 --> 00:25:57,960 Speaker 2: And the people on the board were replaced by Larry Summers. Yeah, 515 00:25:58,200 --> 00:26:01,719 Speaker 2: you want some Gen X cynicism? Yeah, Larry bloody Summers. 516 00:26:02,040 --> 00:26:05,199 Speaker 2: Adam D'Angelo, the co-founder of Quora, he's still on 517 00:26:05,240 --> 00:26:07,960 Speaker 2: the board. Quora is currently full of OpenAI stuff 518 00:26:08,080 --> 00:26:11,639 Speaker 2: ruining the platform. People don't realize this is the stuff 519 00:26:11,680 --> 00:26:15,840 Speaker 2: that matters. These moments are the things to be angry about.
520 00:26:15,720 --> 00:26:19,040 Speaker 1: Right, and when people like the members of the former 521 00:26:19,080 --> 00:26:23,120 Speaker 1: board are trying to take action in order to get 522 00:26:23,160 --> 00:26:26,199 Speaker 1: back on the track that was originally intended when they 523 00:26:26,240 --> 00:26:28,800 Speaker 1: were first forming OpenAI, you know, what they get 524 00:26:28,840 --> 00:26:32,880 Speaker 1: for their efforts is this huge backlash that was 525 00:26:32,960 --> 00:26:36,800 Speaker 1: exacerbated by the media coverage of it, and then they're forced 526 00:26:36,840 --> 00:26:40,720 Speaker 1: to not only reinstate Altman, but to step down as well. 527 00:26:41,000 --> 00:26:44,040 Speaker 1: Right, like, you had co-founders having stepped 528 00:26:43,720 --> 00:26:47,639 Speaker 2: down and been replaced with Larry goddamn Summers and one of 529 00:26:47,680 --> 00:26:51,480 Speaker 2: the Salesforce guys. Just, like, replace them, just replace 530 00:26:51,520 --> 00:26:54,360 Speaker 2: them with Darth Vader, just go, just go the whole way, 531 00:26:54,480 --> 00:26:57,840 Speaker 2: replace them with David Petraeus. Why not 532 00:26:58,359 --> 00:27:02,200 Speaker 2: just go the whole damn way. I'm sure they considered 533 00:27:02,200 --> 00:27:05,439 Speaker 2: Condoleezza Rice, so they now have something in common 534 00:27:05,480 --> 00:27:06,640 Speaker 2: with the Cleveland Browns. 535 00:27:07,960 --> 00:27:10,480 Speaker 1: This is totally off topic, but I'm reminded of the time... 536 00:27:11,040 --> 00:27:14,520 Speaker 1: I'm a huge fan of the Disney parks because I 537 00:27:14,560 --> 00:27:17,199 Speaker 1: went to them as a kid. So even as an adult, 538 00:27:17,200 --> 00:27:20,560 Speaker 1: while I recognize the shortcomings of the Disney Company, the 539 00:27:20,600 --> 00:27:23,160 Speaker 1: parks have a special place in nostalgia in my heart. 540 00:27:23,480 --> 00:27:26,480 Speaker 1: It's not that different from today, honestly, but this happened, 541 00:27:26,680 --> 00:27:28,400 Speaker 1: you know, back in the nineties. There was a time 542 00:27:28,440 --> 00:27:33,080 Speaker 1: period where Disney had replaced the executive level of leadership 543 00:27:33,119 --> 00:27:37,520 Speaker 1: for the Disney parks with retail leaders, so people from, 544 00:27:37,560 --> 00:27:40,560 Speaker 1: like, the Banana Republic and Gap, and that was a time 545 00:27:40,640 --> 00:27:43,080 Speaker 1: where they started to do things like gut all the 546 00:27:43,119 --> 00:27:45,800 Speaker 1: stuff that gave character to the Disney parks and replace 547 00:27:45,840 --> 00:27:48,280 Speaker 1: them with stores, and to take out some of the 548 00:27:48,960 --> 00:27:52,000 Speaker 1: other attractions and to put in, like, quick service food 549 00:27:52,280 --> 00:27:55,800 Speaker 1: carts, because those had much higher profit margins, right. Like, 550 00:27:55,840 --> 00:27:58,240 Speaker 1: they were running it like they were running a retail business. 551 00:27:58,440 --> 00:28:01,560 Speaker 1: And it started to take away a lot of the 552 00:28:01,680 --> 00:28:04,840 Speaker 1: charm that had been there in the parks.
And to me, 553 00:28:05,080 --> 00:28:07,600 Speaker 1: like, I recognize a lot of that same sort of 554 00:28:07,600 --> 00:28:10,520 Speaker 1: stuff going on in the tech world in general. Like, 555 00:28:10,560 --> 00:28:13,880 Speaker 1: even with tools that maybe once upon a time you liked, 556 00:28:14,240 --> 00:28:15,760 Speaker 1: a lot of the sheen has worn off. 557 00:28:16,280 --> 00:28:18,760 Speaker 2: Well, I mean, Google is more profitable than ever. Yeah, 558 00:28:18,800 --> 00:28:21,400 Speaker 2: same with Meta. Like, these companies are so much worse, 559 00:28:21,400 --> 00:28:23,760 Speaker 2: the products are so much worse, but they're more profitable 560 00:28:23,800 --> 00:28:24,120 Speaker 2: than ever. 561 00:28:24,320 --> 00:28:26,840 Speaker 1: Yeah. When you're thinking about Google, I also think about YouTube. 562 00:28:27,040 --> 00:28:29,960 Speaker 1: I think about how incredibly successful YouTube is, and I 563 00:28:30,000 --> 00:28:33,560 Speaker 1: think about all the creators on YouTube and how at 564 00:28:33,560 --> 00:28:36,520 Speaker 1: the mercy of YouTube they are. Like, if there's any 565 00:28:36,720 --> 00:28:40,120 Speaker 1: change in YouTube at all, it has a disproportionate effect 566 00:28:40,120 --> 00:28:42,360 Speaker 1: on the people who are creating the content there. And 567 00:28:42,400 --> 00:28:45,440 Speaker 1: you could have someone who has really good content and 568 00:28:46,480 --> 00:28:50,440 Speaker 1: genuinely has a good voice and something to say and 569 00:28:50,760 --> 00:28:54,560 Speaker 1: good production values, and they could be absolutely devastated by 570 00:28:54,600 --> 00:28:59,160 Speaker 1: a change in YouTube's algorithms, or be completely laid waste 571 00:28:59,320 --> 00:29:04,080 Speaker 1: when there's unsubstantiated copyright claims put against them. Like, the 572 00:29:04,640 --> 00:29:08,440 Speaker 1: platform can be weaponized, and so it has been weaponized. 573 00:29:08,640 --> 00:29:09,200 Speaker 3: I look at that. 574 00:29:09,280 --> 00:29:11,520 Speaker 1: And meanwhile, you have at the company level a company 575 00:29:11,520 --> 00:29:15,360 Speaker 1: that continues to just benefit financially in larger amounts year 576 00:29:15,360 --> 00:29:16,120 Speaker 1: over year over year. 577 00:29:16,320 --> 00:29:19,320 Speaker 2: Yes, and by making the product more annoying to use 578 00:29:19,640 --> 00:29:22,080 Speaker 2: so that people spend more time on it. I mean, 579 00:29:22,200 --> 00:29:25,239 Speaker 2: it's frustrating as well, because the fair question is, what 580 00:29:25,280 --> 00:29:28,480 Speaker 2: do we do here? And the answer genuinely is things 581 00:29:28,560 --> 00:29:31,640 Speaker 2: like 404 Media, an independent group of former Vice, 582 00:29:31,800 --> 00:29:36,320 Speaker 2: some Motherboard people, Jason Koebler, Emanuel Maiberg, Samantha Cole, fantastic. Well, 583 00:29:37,120 --> 00:29:39,880 Speaker 2: they did a really great story about a research thing that 584 00:29:40,000 --> 00:29:43,720 Speaker 2: showed that Google results were getting worse. I believe that 585 00:29:43,760 --> 00:29:47,160 Speaker 2: awareness of these issues will put pressure on these companies. 586 00:29:48,440 --> 00:29:51,040 Speaker 2: I know, I'm just running a podcast, really, but 587 00:29:51,280 --> 00:29:53,840 Speaker 2: I believe that more people being aware of these things 588 00:29:53,960 --> 00:29:56,960 Speaker 2: means that more things will change.
These companies have gotten 589 00:29:57,000 --> 00:30:00,400 Speaker 2: away with it because they've obfuscated the value proposition and 590 00:30:00,600 --> 00:30:03,880 Speaker 2: the way they're making their money. Now that that's changing, 591 00:30:04,280 --> 00:30:08,040 Speaker 2: I actually believe that they may stop pulling these things. 592 00:30:08,120 --> 00:30:08,920 Speaker 2: It will take time. 593 00:30:09,480 --> 00:30:14,480 Speaker 1: I have some optimism, largely because I have seen activism 594 00:30:14,520 --> 00:30:18,400 Speaker 1: and organization and unionizing starting to grow, especially in the 595 00:30:18,400 --> 00:30:21,080 Speaker 1: tech industry over the last couple of years, which was 596 00:30:21,160 --> 00:30:23,080 Speaker 1: unheard of a few years ago. 597 00:30:23,960 --> 00:30:26,120 Speaker 2: Unions will also break the back of these companies, which 598 00:30:26,120 --> 00:30:27,120 Speaker 2: is why they're so scared of them. 599 00:30:27,200 --> 00:30:31,440 Speaker 1: Yes, and it's why they're challenging the National Labor Relations 600 00:30:31,480 --> 00:30:33,240 Speaker 1: Board and whether or not it's constitutional. 601 00:30:33,600 --> 00:30:35,120 Speaker 3: I know, right? So disgusted. 602 00:30:35,040 --> 00:30:37,280 Speaker 1: I'm not going to go into it because I'll start screaming 603 00:30:37,280 --> 00:30:42,000 Speaker 1: into the microphone. Support your unions, guys, please do. But yeah, 604 00:30:42,160 --> 00:30:45,040 Speaker 1: like seeing that has given me a lot of hope, 605 00:30:45,120 --> 00:30:48,280 Speaker 1: because it tells me that we have finally reached a 606 00:30:48,320 --> 00:30:52,040 Speaker 1: point where you can't just sit back and not be 607 00:30:52,160 --> 00:30:56,560 Speaker 1: active and not support your fellow workers; you have to actually 608 00:30:56,640 --> 00:30:59,640 Speaker 1: call out these companies when they are doing things that 609 00:30:59,680 --> 00:31:04,200 Speaker 1: are hard for everyone other than the executives and the shareholders. 610 00:31:04,280 --> 00:31:07,840 Speaker 1: And I think that is promising. It's not a guarantee 611 00:31:07,840 --> 00:31:10,200 Speaker 1: that we're going to turn the corner and enter into, 612 00:31:10,440 --> 00:31:14,000 Speaker 1: you know, the idyllic version of the future that Gene 613 00:31:14,200 --> 00:31:16,720 Speaker 1: Roddenberry had when he made Star Trek. I don't think 614 00:31:16,800 --> 00:31:21,120 Speaker 1: that we're, like, on that doorstep. But I find solace 615 00:31:21,600 --> 00:31:24,960 Speaker 1: in seeing that movement grow in various sectors of the 616 00:31:25,000 --> 00:31:27,880 Speaker 1: tech industry. And my hope is to continue to see 617 00:31:27,960 --> 00:31:31,920 Speaker 1: that grow and to continue seeing people educate themselves, as 618 00:31:31,960 --> 00:31:36,400 Speaker 1: you've said, and to advocate for those changes, because ultimately 619 00:31:36,440 --> 00:31:39,880 Speaker 1: it does benefit the largest number of people. The people 620 00:31:39,920 --> 00:31:42,920 Speaker 1: it doesn't benefit would be, again, the executive team, the 621 00:31:42,960 --> 00:31:46,239 Speaker 1: C-suites, and the shareholders. But honestly, I think they 622 00:31:46,240 --> 00:31:48,600 Speaker 1: can afford it. It will be fine, I think. 623 00:31:48,640 --> 00:31:51,720 Speaker 2: Yeah, you'll make fifty million dollars a year.
624 00:31:51,760 --> 00:31:53,600 Speaker 3: Hell, you'll probably make one hundred million dollars. 625 00:31:53,720 --> 00:31:56,800 Speaker 2: You'll be fine. You'll be fine. Make good stuff again. 626 00:31:56,920 --> 00:31:59,040 Speaker 1: Yeah, and in the long term, you'll actually be better off. 627 00:32:00,480 --> 00:32:04,320 Speaker 1: I'm not done being angry at the tech industry just yet, 628 00:32:04,800 --> 00:32:06,720 Speaker 1: but I do have to take a break to thank 629 00:32:06,760 --> 00:32:19,880 Speaker 1: our sponsors. So there's an online media company that I 630 00:32:19,920 --> 00:32:22,120 Speaker 1: was a fan of for a very long time called 631 00:32:22,200 --> 00:32:26,800 Speaker 1: Rooster Teeth, and yeah, Warner Brothers Discovery announced yesterday they're 632 00:32:26,880 --> 00:32:29,680 Speaker 1: shutting it down. So after twenty one years, almost twenty 633 00:32:29,720 --> 00:32:32,800 Speaker 1: one years, the company Rooster Teeth is getting shut down, 634 00:32:32,800 --> 00:32:34,400 Speaker 1: and more than one hundred and fifty people are losing 635 00:32:34,440 --> 00:32:36,400 Speaker 1: their jobs as a result of that. I plan on 636 00:32:36,440 --> 00:32:39,640 Speaker 1: doing a full episode kind of giving a history of 637 00:32:39,640 --> 00:32:43,080 Speaker 1: that company and its impact on Internet culture. But I 638 00:32:43,120 --> 00:32:45,040 Speaker 1: look at things like that and I think that's why 639 00:32:45,080 --> 00:32:49,520 Speaker 1: it's really important to organize, to have unions, to have 640 00:32:49,600 --> 00:32:53,200 Speaker 1: the ability to advocate, because the problems of Rooster Teeth 641 00:32:53,200 --> 00:32:55,160 Speaker 1: are more than just Warner Brothers making a decision, a 642 00:32:55,240 --> 00:32:58,360 Speaker 1: unilateral decision, to shut them down. It's not just that David 643 00:32:58,440 --> 00:33:02,120 Speaker 1: Zaslav wants to save money. It's also that there were 644 00:33:02,120 --> 00:33:04,640 Speaker 1: a lot of bad decisions made over the last decade 645 00:33:04,680 --> 00:33:07,719 Speaker 1: that have hurt that company in the long run. And 646 00:33:08,040 --> 00:33:11,520 Speaker 1: if you have things like unions, you can help mitigate that. 647 00:33:11,600 --> 00:33:14,160 Speaker 1: If you have people who passionately believe in what they 648 00:33:14,200 --> 00:33:17,480 Speaker 1: are doing, and it's not just how can we make 649 00:33:17,560 --> 00:33:20,480 Speaker 1: number go up, I think that you see a much 650 00:33:20,600 --> 00:33:25,440 Speaker 1: healthier ecosystem for all involved. And I'm so tired of 651 00:33:26,040 --> 00:33:31,239 Speaker 1: watching companies sacrifice things to get short-term gains and 652 00:33:31,280 --> 00:33:32,880 Speaker 1: then suffer in the long term. 653 00:33:32,960 --> 00:33:36,520 Speaker 2: It's exhausting, and that is the growth-at-all-costs 654 00:33:36,520 --> 00:33:39,680 Speaker 2: economy. It is a frustrating business. 655 00:33:40,240 --> 00:33:43,360 Speaker 2: And the only way to fight is awareness, because we 656 00:33:43,480 --> 00:33:46,160 Speaker 2: cannot test people. But like you said, labor organization will 657 00:33:46,160 --> 00:33:48,760 Speaker 2: do it too. Yeah, a worker-owned business, like 658 00:33:48,800 --> 00:33:51,760 Speaker 2: Defector, like 404 Media, like Aftermath. They 659 00:33:51,760 --> 00:33:53,600 Speaker 2: are going to fight this.
They're going to fight the 660 00:33:53,600 --> 00:33:57,719 Speaker 2: people like Cory Haik, who ran Vice into the ground. 661 00:33:58,200 --> 00:34:01,360 Speaker 2: These people have names, yeah, and they should be said 662 00:34:01,400 --> 00:34:03,840 Speaker 2: out loud on podcasts and in the media, the people 663 00:34:03,920 --> 00:34:07,600 Speaker 2: that are responsible. David Zaslav, Warner Brothers Discovery. He is 664 00:34:07,640 --> 00:34:10,800 Speaker 2: responsible for so many great things not happening, like Coyote 665 00:34:10,920 --> 00:34:14,120 Speaker 2: vs. Acme, for example. Yeah, and that woman. 666 00:34:14,080 --> 00:34:16,680 Speaker 3: Like he's just literally killing media. 667 00:34:16,800 --> 00:34:19,640 Speaker 1: So full disclosure, the company I worked for when I 668 00:34:19,680 --> 00:34:22,680 Speaker 1: started this podcast was HowStuffWorks dot com, which at one 669 00:34:22,719 --> 00:34:25,960 Speaker 1: point was part of Discovery Communications. So, as I've said 670 00:34:26,000 --> 00:34:28,160 Speaker 1: on this podcast a few times, I am well aware 671 00:34:28,239 --> 00:34:30,960 Speaker 1: of how David Zaslav operates, because I got to see 672 00:34:30,960 --> 00:34:33,640 Speaker 1: it up close and personal. And I can't say that 673 00:34:33,719 --> 00:34:35,920 Speaker 1: anything that's happened over the last two years has 674 00:34:35,960 --> 00:34:38,839 Speaker 1: surprised me. I've felt discouraged. I mean, as a fan 675 00:34:38,960 --> 00:34:41,160 Speaker 1: of those properties, I really would have liked the chance 676 00:34:41,200 --> 00:34:43,360 Speaker 1: to see them. And it's a shame that, for a 677 00:34:43,440 --> 00:34:46,200 Speaker 1: tax write-off, it means that they can never see 678 00:34:46,200 --> 00:34:48,560 Speaker 1: the light of day. It's not just, oh, it's on 679 00:34:48,640 --> 00:34:51,319 Speaker 1: the shelf. No, it's going to be, like, obliterated from 680 00:34:51,400 --> 00:34:54,520 Speaker 1: the earth for no reason. Yeah, tax write-off, that's it. 681 00:34:54,600 --> 00:34:58,120 Speaker 1: But also, do you need that tax write-off? Whether 682 00:34:58,160 --> 00:34:59,759 Speaker 1: they need it or not, like when you see someone 683 00:34:59,760 --> 00:35:04,880 Speaker 1: who's in control of a company that makes content seemingly 684 00:35:04,960 --> 00:35:09,400 Speaker 1: have no passion for content at all, it's just heartbreaking. 685 00:35:09,520 --> 00:35:11,319 Speaker 2: And this is the problem behind a lot of the 686 00:35:11,360 --> 00:35:14,200 Speaker 2: problems of capitalism right now: the people running things 687 00:35:14,239 --> 00:35:16,840 Speaker 2: are not making things for people. Yeah, they are making 688 00:35:16,920 --> 00:35:19,440 Speaker 2: things for the markets. They're making things for algorithms. They're 689 00:35:19,480 --> 00:35:22,359 Speaker 2: making things because they saw something on TikTok. They 690 00:35:22,400 --> 00:35:25,520 Speaker 2: want to appeal to an algorithm or a market. And 691 00:35:25,560 --> 00:35:28,840 Speaker 2: it's disgraceful, because this kind of works until it doesn't, 692 00:35:28,920 --> 00:35:32,040 Speaker 2: and when it stops working, it's calamitous. Yeah, look at 693 00:35:32,080 --> 00:35:36,040 Speaker 2: what happened to Disney with streaming. The massive overinvestment 694 00:35:36,080 --> 00:35:39,279 Speaker 2: in streaming. Things like that are a result of this disconnection.
695 00:35:39,640 --> 00:35:42,839 Speaker 1: And to your point, it ends up affecting, like when 696 00:35:42,880 --> 00:35:45,000 Speaker 1: I was talking about Rooster Teeth, it ends up affecting 697 00:35:45,080 --> 00:35:47,520 Speaker 1: so many people who end up without a job at 698 00:35:47,600 --> 00:35:50,560 Speaker 1: the end of it when it ultimately collapses in on itself. 699 00:35:50,640 --> 00:35:52,839 Speaker 1: We saw the same thing when the pandemic hit and 700 00:35:52,880 --> 00:35:57,280 Speaker 1: you had all these tech companies invest in their infrastructure, 701 00:35:57,680 --> 00:36:01,160 Speaker 1: which made sense at the time, seemingly, but then once 702 00:36:01,200 --> 00:36:04,799 Speaker 1: the pandemic starts to recede, they realize, oh, we now 703 00:36:04,840 --> 00:36:09,400 Speaker 1: have this huge investment in infrastructure that's no longer necessary. 704 00:36:09,440 --> 00:36:11,520 Speaker 1: Now we view it as a cost as 705 00:36:11,520 --> 00:36:15,160 Speaker 1: opposed to an asset, so let's downsize. It's just such a 706 00:36:16,960 --> 00:36:20,719 Speaker 1: massive waste, and so many talented people end up being 707 00:36:20,800 --> 00:36:21,440 Speaker 1: jerked around. 708 00:36:21,760 --> 00:36:24,280 Speaker 2: The CEO should be fired. Yeah, but then. 709 00:36:24,800 --> 00:36:28,040 Speaker 1: Well, and I've heard plenty of people argue, when 710 00:36:28,080 --> 00:36:32,120 Speaker 1: they talk about AI, not generative AI, but AI potentially 711 00:36:32,160 --> 00:36:34,120 Speaker 1: replacing jobs, like, why aren't we looking at the C 712 00:36:34,200 --> 00:36:35,920 Speaker 1: suite for that? Because that seems to me. 713 00:36:36,080 --> 00:36:38,680 Speaker 2: I wrote a Business Insider piece on this very subject, 714 00:36:39,520 --> 00:36:42,480 Speaker 2: and people were very mad at it. People did not 715 00:36:42,719 --> 00:36:46,480 Speaker 2: like that idea, most of them CEOs. Most of them 716 00:36:46,480 --> 00:36:48,319 Speaker 2: were like, hey, whoa, whoa, I've got a sweet gig. 717 00:36:48,440 --> 00:36:48,680 Speaker 3: Yeah. 718 00:36:48,920 --> 00:36:50,799 Speaker 1: Yeah, I could see why they would be mad, for 719 00:36:50,840 --> 00:36:53,120 Speaker 1: the same reason why I get mad when I hear 720 00:36:53,760 --> 00:36:57,680 Speaker 1: a CEO suggest that eight thousand jobs could be held 721 00:36:57,719 --> 00:36:59,560 Speaker 1: off from being filled by humans because we can just 722 00:36:59,560 --> 00:37:01,040 Speaker 1: fill them with AI in the future. 723 00:37:01,600 --> 00:37:05,000 Speaker 2: Here's a funny one for you. So Reddit. Reddit 724 00:37:05,080 --> 00:37:07,800 Speaker 2: laid off, what, ninety people just before their IPO. 725 00:37:08,200 --> 00:37:12,040 Speaker 2: How much did Steve Huffman make in twenty twenty three? How 726 00:37:12,120 --> 00:37:18,920 Speaker 2: much? Over one hundred and ninety million dollars. Actually, let me, 727 00:37:18,960 --> 00:37:20,440 Speaker 2: I want to confirm that number. 728 00:37:21,200 --> 00:37:22,560 Speaker 1: That's a, that's a lot of cheddar. 729 00:37:23,320 --> 00:37:25,400 Speaker 2: Well, it was a combination, like all these things, between 730 00:37:25,440 --> 00:37:29,880 Speaker 2: stock options and all that. Yeah, his total compensation exceeded 731 00:37:29,880 --> 00:37:34,640 Speaker 2: one hundred and ninety-three million dollars in twenty twenty three.
Yeah, 732 00:37:34,760 --> 00:37:38,759 Speaker 2: and Reddit is getting worse, and he has 733 00:37:38,800 --> 00:37:43,160 Speaker 2: not paid a single person who actually dedicated themselves to 734 00:37:43,160 --> 00:37:43,800 Speaker 2: building Reddit. 735 00:37:44,400 --> 00:37:44,960 Speaker 3: That's the thing. 736 00:37:45,320 --> 00:37:47,760 Speaker 2: Yeah, these people better hope that there is not actually 737 00:37:47,800 --> 00:37:51,880 Speaker 2: a revolution. Their faces are all over the internet. 738 00:37:51,960 --> 00:37:54,160 Speaker 1: I mean, you know, it's, it's funny, because every year 739 00:37:54,200 --> 00:37:56,480 Speaker 1: I feel like we're just a little step closer. Well, 740 00:37:56,520 --> 00:37:58,319 Speaker 1: this has been a great conversation. 741 00:37:58,840 --> 00:37:59,279 Speaker 3: Thank you. 742 00:37:59,400 --> 00:38:01,920 Speaker 1: Yeah, I'm ready now to go out and start swinging. 743 00:38:02,160 --> 00:38:03,440 Speaker 3: So hell yeah. 744 00:38:03,520 --> 00:38:06,240 Speaker 1: I want to again remind all my listeners: Better Offline 745 00:38:06,320 --> 00:38:08,680 Speaker 1: is the podcast. It's live now. You can find it 746 00:38:08,719 --> 00:38:11,799 Speaker 1: wherever you get your podcasts. You absolutely should check it 747 00:38:11,800 --> 00:38:12,760 Speaker 1: out and take a listen. 748 00:38:13,280 --> 00:38:15,720 Speaker 3: Better Offline dot com, and all 749 00:38:15,600 --> 00:38:17,759 Speaker 2: the links, all the links go there. 750 00:38:18,320 --> 00:38:22,040 Speaker 1: Check out the newsletter too. Again, being informed is your 751 00:38:22,080 --> 00:38:24,880 Speaker 1: best weapon when it comes to these sorts of things, 752 00:38:24,960 --> 00:38:27,279 Speaker 1: and, you know, otherwise you're just going to be left 753 00:38:27,320 --> 00:38:31,080 Speaker 1: asking, why is nothing as good as I remember it being? 754 00:38:32,760 --> 00:38:35,239 Speaker 1: Like, there are reasons why I'm not on, well, what 755 00:38:35,360 --> 00:38:37,920 Speaker 1: is now X but used to be Twitter. Like, for 756 00:38:37,960 --> 00:38:40,040 Speaker 1: a while, it was great. I loved it. It was 757 00:38:40,040 --> 00:38:40,879 Speaker 1: a fantastic one. 758 00:38:40,920 --> 00:38:43,759 Speaker 2: I have made my business, made friends, fallen in love 759 00:38:43,800 --> 00:38:46,160 Speaker 2: on there, and now, now it's going to the dogs. 760 00:38:46,400 --> 00:38:50,799 Speaker 1: Yeah, I remember getting opportunities to do things that I 761 00:38:50,920 --> 00:38:53,160 Speaker 1: never would have done had I not been on a 762 00:38:53,200 --> 00:38:57,680 Speaker 1: platform like that. But yeah, it just got so useless 763 00:38:57,800 --> 00:39:01,680 Speaker 1: so quickly that I couldn't justify staying there anymore. I 764 00:39:01,760 --> 00:39:03,640 Speaker 1: have friends who still use it as a way of 765 00:39:03,760 --> 00:39:08,000 Speaker 1: promoting stuff, but I just found it far too depressing, 766 00:39:08,120 --> 00:39:11,560 Speaker 1: honestly, to be active on the platform, and for my 767 00:39:11,640 --> 00:39:13,680 Speaker 1: own mental health, I stepped back. I was like, you 768 00:39:13,680 --> 00:39:15,640 Speaker 1: know what, I did build an audience, but honestly, I 769 00:39:15,680 --> 00:39:17,480 Speaker 1: don't even know how many of those people are still 770 00:39:17,520 --> 00:39:19,719 Speaker 1: active on this platform, because that's how bad it got.
771 00:39:19,840 --> 00:39:23,640 Speaker 1: So definitely check out all those links, and Ed, thank 772 00:39:23,680 --> 00:39:25,440 Speaker 1: you so much for agreeing to be on the show. 773 00:39:25,960 --> 00:39:27,719 Speaker 2: Thank you for having me, really appreciate it. 774 00:39:33,360 --> 00:39:38,000 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 775 00:39:38,320 --> 00:39:42,040 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 776 00:39:42,080 --> 00:39:46,720 Speaker 1: to your favorite shows.