1 00:00:02,600 --> 00:00:17,320 Speaker 1: Bloomberg Audio Studios, Podcasts, Radio, News. 2 00:00:18,960 --> 00:00:22,160 Speaker 2: Hello and welcome to another episode of The Odd Lots Podcast. 3 00:00:22,200 --> 00:00:24,439 Speaker 3: I'm Joe Weisenthal and I'm Tracy Alloway. 4 00:00:24,520 --> 00:00:26,720 Speaker 2: Tracy, you don't use Twitter like quite as much as 5 00:00:26,760 --> 00:00:26,960 Speaker 2: I do. 6 00:00:27,040 --> 00:00:29,760 Speaker 3: Still, no, I kind of. I don't know what happened. 7 00:00:29,800 --> 00:00:31,800 Speaker 3: I guess I just went off Twitter. You know what 8 00:00:32,040 --> 00:00:34,520 Speaker 3: platform I'm on quite a lot, and it doesn't help 9 00:00:34,560 --> 00:00:36,640 Speaker 3: me in any way in terms of Odd Lots, in 10 00:00:36,680 --> 00:00:38,680 Speaker 3: my career. But I really like Reddit. 11 00:00:38,479 --> 00:00:40,960 Speaker 2: No, I know, I see you. Actually, Tracy doesn't like 12 00:00:41,000 --> 00:00:44,559 Speaker 2: that I sometimes notice what she's browsing, but I respect 13 00:00:44,560 --> 00:00:45,120 Speaker 2: that you're— 14 00:00:45,479 --> 00:00:48,320 Speaker 3: Joe is constantly looking over my shoulder. And whenever I 15 00:00:48,400 --> 00:00:51,760 Speaker 3: start like shopping online for five minutes, you start talking 16 00:00:51,760 --> 00:00:53,200 Speaker 3: about the furniture that I'm buying. 17 00:00:53,280 --> 00:00:56,080 Speaker 2: Well, it's really nice. It's really — I'm always — it's because 18 00:00:56,160 --> 00:00:58,880 Speaker 2: it looks really good. But I still am completely addicted 19 00:00:58,920 --> 00:01:03,720 Speaker 2: to Twitter. And I don't really know why. Because why? 20 00:01:04,720 --> 00:01:06,840 Speaker 2: I don't know. This has been like a question for therapy 21 00:01:06,959 --> 00:01:09,080 Speaker 2: for a long time, because like I know, it's gotten 22 00:01:09,120 --> 00:01:11,280 Speaker 2: really bad and I don't even — I don't know.
I 23 00:01:11,360 --> 00:01:13,360 Speaker 2: like to be there. I like to post words and 24 00:01:13,720 --> 00:01:16,560 Speaker 2: the short form is a good medium for me. It 25 00:01:16,640 --> 00:01:20,040 Speaker 2: feels like sometimes I see interesting things. We still source 26 00:01:20,080 --> 00:01:22,400 Speaker 2: a lot of guests from Twitter, that's where, to be honest, 27 00:01:22,400 --> 00:01:25,160 Speaker 2: we still discover like experts. They're still there, but 28 00:01:25,280 --> 00:01:27,800 Speaker 2: like it's subjectively gotten way worse and I don't even 29 00:01:27,840 --> 00:01:29,160 Speaker 2: think it's just an Elon Musk thing. 30 00:01:29,200 --> 00:01:31,560 Speaker 3: By the way, Well, here's the thing, Like this is 31 00:01:31,600 --> 00:01:34,559 Speaker 3: the key difference in my mind between Reddit right now 32 00:01:34,560 --> 00:01:37,480 Speaker 3: and a bunch of other social media platforms. Twitter is 33 00:01:37,520 --> 00:01:40,600 Speaker 3: now run by this like insane algo right that just 34 00:01:40,760 --> 00:01:44,520 Speaker 3: pushes stuff at you and sometimes it's really random. TikTok 35 00:01:44,680 --> 00:01:47,320 Speaker 3: is the same thing. Reddit, at least you can still 36 00:01:47,440 --> 00:01:51,360 Speaker 3: curate your feeds relatively easily. But even with Reddit, obviously 37 00:01:51,400 --> 00:01:54,960 Speaker 3: since the IPO, there's pressure to boost revenues and we're 38 00:01:55,000 --> 00:01:58,240 Speaker 3: seeing like more algo driven stuff, a lot more AI 39 00:01:58,520 --> 00:02:01,480 Speaker 3: generated content, which has always been a problem, but I 40 00:02:01,520 --> 00:02:03,600 Speaker 3: feel like it might be starting to get worse. 41 00:02:03,840 --> 00:02:07,640 Speaker 2: You know what platform is really good? What? Discord. Oh?
Yes, 42 00:02:08,320 --> 00:02:11,720 Speaker 2: in fact, listeners should check out discord dot gg slash 43 00:02:11,760 --> 00:02:14,200 Speaker 2: odd lots because twenty four seven in there there 44 00:02:14,240 --> 00:02:18,440 Speaker 2: is a well curated world of fellow listeners who are 45 00:02:18,480 --> 00:02:20,600 Speaker 2: interested in all the same things, talking in a very 46 00:02:20,840 --> 00:02:27,680 Speaker 2: generally polite and adult way about important news and finance, markets, economics, tech, automotive, semiconductors, 47 00:02:27,720 --> 00:02:30,079 Speaker 2: all those things that we like. And actually today we're 48 00:02:30,080 --> 00:02:32,400 Speaker 2: gonna chat with — he's not just a Discord member, but 49 00:02:32,480 --> 00:02:34,720 Speaker 2: sometimes he chats in there and he has opinions on 50 00:02:34,760 --> 00:02:35,280 Speaker 2: the Internet. 51 00:02:35,360 --> 00:02:36,120 Speaker 3: That sounds great. 52 00:02:36,240 --> 00:02:38,200 Speaker 2: The perfect guest, the perfect guest. We have the perfect 53 00:02:38,200 --> 00:02:40,120 Speaker 2: guest to talk about why so much of the Internet 54 00:02:40,160 --> 00:02:43,320 Speaker 2: has essentially turned to complete garbage filled with slop, AI 55 00:02:43,400 --> 00:02:47,079 Speaker 2: generated nonsense, completely made up news, stuff like that. We're 56 00:02:47,120 --> 00:02:50,880 Speaker 2: going to be speaking with Max Read, Odd Lots Discord lurker, 57 00:02:50,919 --> 00:02:54,400 Speaker 2: but more importantly, the author and sole proprietor of the 58 00:02:54,440 --> 00:02:57,200 Speaker 2: excellent Read Max Substack, one of the few Substacks that 59 00:02:57,240 --> 00:02:59,560 Speaker 2: I actually pay money for. So Max, thank you for 60 00:02:59,639 --> 00:03:00,440 Speaker 2: coming on Odd Lots. 61 00:03:00,600 --> 00:03:01,639 Speaker 4: I'm so excited to be here.
62 00:03:01,760 --> 00:03:03,359 Speaker 2: Thank you for being a part of the Discord. I 63 00:03:03,400 --> 00:03:05,480 Speaker 2: do find that you don't post much, but we appreciate 64 00:03:05,520 --> 00:03:05,959 Speaker 2: that you lurk. 65 00:03:06,080 --> 00:03:07,960 Speaker 4: I mean, I'm trying to put all of my words 66 00:03:08,000 --> 00:03:11,880 Speaker 4: in a monetizable format. Yeah, I have the newsletter. Fair enough. 67 00:03:12,200 --> 00:03:14,720 Speaker 3: So one thing I'm always curious about when people are 68 00:03:14,760 --> 00:03:18,519 Speaker 3: like internet or culture correspondents, how did you get into 69 00:03:18,560 --> 00:03:19,440 Speaker 3: that realm? 70 00:03:20,000 --> 00:03:22,600 Speaker 4: So I got my start in journalism at Gawker, the media 71 00:03:22,639 --> 00:03:26,400 Speaker 4: gossip website that no longer exists, and I started there 72 00:03:26,400 --> 00:03:29,040 Speaker 4: in twenty ten, and I would say I worked my 73 00:03:29,120 --> 00:03:31,120 Speaker 4: way up, except the way things worked at Gawker, the 74 00:03:31,120 --> 00:03:33,079 Speaker 4: people at the top just got fired and whoever was 75 00:03:33,160 --> 00:03:36,760 Speaker 4: left continued to advance. So I was there during the 76 00:03:36,840 --> 00:03:39,560 Speaker 4: sort of twenty eleven, twenty twelve era, which people who 77 00:03:39,600 --> 00:03:42,080 Speaker 4: were in media will remember as the moment when Facebook, 78 00:03:42,200 --> 00:03:45,559 Speaker 4: so to speak, turned on the floodgates or opened the floodgates, 79 00:03:45,880 --> 00:03:48,560 Speaker 4: and all of a sudden, we went from sort of 80 00:03:48,640 --> 00:03:52,800 Speaker 4: jockeying for traffic from Google or even homepage traffic. People 81 00:03:52,800 --> 00:03:54,800 Speaker 4: would visit Gawker dot com all the time.
I used 82 00:03:54,800 --> 00:03:56,880 Speaker 4: to do that. Yeah, I mean Gawker was old enough 83 00:03:56,920 --> 00:03:59,440 Speaker 4: to be a homepage site. And then Facebook said, hey, 84 00:03:59,480 --> 00:04:04,280 Speaker 4: we've got hundreds of millions, billions of users, let's direct 85 00:04:04,280 --> 00:04:06,160 Speaker 4: some of that at the media. And then, for various 86 00:04:06,200 --> 00:04:09,560 Speaker 4: occult reasons, we started to get fifteen million page views 87 00:04:09,600 --> 00:04:12,280 Speaker 4: on a blog, totally unheard of. For reasons we couldn't 88 00:04:12,320 --> 00:04:14,840 Speaker 4: figure out, we couldn't understand, and because I was in 89 00:04:14,880 --> 00:04:17,160 Speaker 4: management at this point, I became kind of obsessed with this, 90 00:04:17,520 --> 00:04:22,479 Speaker 4: and I found myself thinking constantly about how culture and 91 00:04:22,560 --> 00:04:26,080 Speaker 4: media and businesses based around those things were changing because 92 00:04:26,120 --> 00:04:28,400 Speaker 4: of the way platforms were able to direct these huge 93 00:04:28,400 --> 00:04:31,799 Speaker 4: audiences to places. So, I mean, it's essentially it's because 94 00:04:31,839 --> 00:04:34,040 Speaker 4: I went crazy when Facebook decided to change the way 95 00:04:34,080 --> 00:04:35,840 Speaker 4: media worked, and I've never become sane since. 96 00:04:37,440 --> 00:04:39,840 Speaker 2: Do you think every once in a while, like I 97 00:04:39,880 --> 00:04:42,840 Speaker 2: don't know, like Gawker — there's probably some version of 98 00:04:42,880 --> 00:04:45,040 Speaker 2: Gawker that exists out there on like its fifth relaunching. 99 00:04:45,080 --> 00:04:47,200 Speaker 2: I'm not actually sure if that title exists, but it'll 100 00:04:47,240 --> 00:04:50,239 Speaker 2: be launched. If it's not currently existing, someone will relaunch 101 00:04:50,279 --> 00:04:52,240 Speaker 2: it at some point.
But even setting that aside, 102 00:04:52,720 --> 00:04:55,000 Speaker 2: there's always every once in a while, like some new thing. 103 00:04:55,040 --> 00:04:56,800 Speaker 2: It's like we're going to bring back the good old 104 00:04:56,880 --> 00:04:59,560 Speaker 2: days of like a blog and a homepage and it's 105 00:04:59,560 --> 00:04:59,960 Speaker 2: like a team. 106 00:05:00,560 --> 00:05:02,840 Speaker 3: I'm pretty sure we've said that too. 107 00:05:03,440 --> 00:05:05,800 Speaker 2: Is that just impossible in twenty twenty four? 108 00:05:05,520 --> 00:05:08,640 Speaker 4: Yeah, I mean, I think that Substack is probably 109 00:05:08,640 --> 00:05:13,640 Speaker 4: the closest platform to facilitate the creation of a blog 110 00:05:13,800 --> 00:05:16,880 Speaker 4: like object, but because it's email distribution above all, you 111 00:05:16,880 --> 00:05:19,200 Speaker 4: actually can sort of force your way into people's field 112 00:05:19,200 --> 00:05:22,120 Speaker 4: of vision instead of requiring them to go to a homepage. 113 00:05:22,120 --> 00:05:23,960 Speaker 4: I mean, we were talking about this already, but the 114 00:05:24,000 --> 00:05:26,680 Speaker 4: existence of the homepage, as a thing that people would 115 00:05:26,760 --> 00:05:29,360 Speaker 4: check day in and day out, was, as 116 00:05:29,400 --> 00:05:31,200 Speaker 4: far as I'm concerned, really important to the way 117 00:05:31,240 --> 00:05:33,480 Speaker 4: a blog exists, how it feels, the tone, the kind 118 00:05:33,520 --> 00:05:36,559 Speaker 4: of structure of it.
And now that most people's first 119 00:05:36,760 --> 00:05:40,000 Speaker 4: visit is a platform like Twitter or Instagram or one 120 00:05:40,040 --> 00:05:42,400 Speaker 4: of the others, it's a lot harder to like cultivate 121 00:05:42,440 --> 00:05:45,600 Speaker 4: the audience necessary to type in, you know, Joe Weisenthal 122 00:05:45,640 --> 00:05:47,760 Speaker 4: dot com or whatever your blog homepage would be. 123 00:05:48,279 --> 00:05:51,000 Speaker 3: Could you talk a bit more about what happened when 124 00:05:51,200 --> 00:05:55,400 Speaker 3: like Facebook and the other platforms started dominating the media 125 00:05:55,520 --> 00:05:58,600 Speaker 3: sphere and generating those millions of page views, Like, how 126 00:05:58,600 --> 00:05:59,720 Speaker 3: did that change the actual— 127 00:05:59,400 --> 00:06:03,159 Speaker 4: Yeah, I'll use Gawker as an example. So 128 00:06:03,279 --> 00:06:06,560 Speaker 4: Gawker began, as people remember, as a pretty mean — let's 129 00:06:06,600 --> 00:06:08,080 Speaker 4: call it a mean gossip site, not as mean as I 130 00:06:08,080 --> 00:06:11,080 Speaker 4: think people remember, but it was sort of sophisticated, witty, 131 00:06:11,320 --> 00:06:15,800 Speaker 4: cutting, and about a kind of narrow cast of media 132 00:06:15,920 --> 00:06:19,760 Speaker 4: and political and entertainment figures. And this was sort of 133 00:06:19,760 --> 00:06:22,040 Speaker 4: starting to change and open up as it became clear that, 134 00:06:22,360 --> 00:06:25,080 Speaker 4: you know, search engine optimization or SEO would allow you 135 00:06:25,120 --> 00:06:27,920 Speaker 4: to get a bigger audience if you wrote about celebrities 136 00:06:27,920 --> 00:06:30,400 Speaker 4: that people were googling, for example.
But Facebook opened it 137 00:06:30,520 --> 00:06:32,800 Speaker 4: up in an even bigger and, as far as I was concerned, 138 00:06:32,800 --> 00:06:35,280 Speaker 4: sort of stranger way, because what you were going for 139 00:06:35,440 --> 00:06:38,200 Speaker 4: was not like what are people searching for or even 140 00:06:38,200 --> 00:06:41,400 Speaker 4: necessarily sort of what people are interested in. It's what 141 00:06:41,440 --> 00:06:44,040 Speaker 4: will people share and what will people click on if 142 00:06:44,040 --> 00:06:47,600 Speaker 4: they come across it in their Facebook feed. So in general, 143 00:06:47,720 --> 00:06:49,640 Speaker 4: especially at the early stages there, what you were getting 144 00:06:49,720 --> 00:06:52,120 Speaker 4: was a lot less of the kind of like insidery, 145 00:06:52,160 --> 00:06:54,920 Speaker 4: what's Rupert Murdoch up to? And a lot more — I mean, 146 00:06:55,000 --> 00:06:57,280 Speaker 4: cat videos doesn't even quite begin to cover it. It's 147 00:06:57,320 --> 00:06:59,840 Speaker 4: sort of sentimental content, inspirational. 148 00:07:00,480 --> 00:07:03,440 Speaker 2: I always think of like those BuzzFeed lists like twenty 149 00:07:03,480 --> 00:07:05,800 Speaker 2: three things only people who live in Upstate New York 150 00:07:05,800 --> 00:07:09,200 Speaker 2: would know, and they're so powerful because you just share 151 00:07:09,240 --> 00:07:11,120 Speaker 2: it if you're from Upstate New York with everyone you 152 00:07:11,160 --> 00:07:13,640 Speaker 2: know from Upstate New York, and it just sort of changes, 153 00:07:13,720 --> 00:07:16,960 Speaker 2: like what that category of content is that people care about, 154 00:07:16,960 --> 00:07:17,960 Speaker 2: that affinity content.
155 00:07:18,040 --> 00:07:19,880 Speaker 4: Yeah, absolutely, I mean there was a huge amount of 156 00:07:19,920 --> 00:07:23,640 Speaker 4: affinity, you know, micro targeted identity content, things that people 157 00:07:23,640 --> 00:07:26,200 Speaker 4: would share, things that people would sort of like throw 158 00:07:26,200 --> 00:07:28,240 Speaker 4: out to all their friends. I mean, the 159 00:07:28,280 --> 00:07:30,720 Speaker 4: Internet between say twenty twelve and twenty fourteen or so 160 00:07:30,840 --> 00:07:33,200 Speaker 4: was really dominated by this kind of thing. And if 161 00:07:33,240 --> 00:07:36,000 Speaker 4: you were there, you remember BuzzFeed as being, you know — 162 00:07:36,160 --> 00:07:38,160 Speaker 4: Jonah Peretti would talk about it as being the next 163 00:07:38,160 --> 00:07:40,360 Speaker 4: Disney or something, because it had that kind of momentum 164 00:07:40,400 --> 00:07:43,160 Speaker 4: behind it, the sense that it was so big. Underneath 165 00:07:43,160 --> 00:07:44,480 Speaker 4: all of that, it turns out what was actually big 166 00:07:44,560 --> 00:07:46,000 Speaker 4: was Facebook, not BuzzFeed. 167 00:07:46,600 --> 00:07:48,840 Speaker 2: I feel like there's also this — and there was a 168 00:07:48,880 --> 00:07:51,520 Speaker 2: famous — I think it was actually The Awl, which you 169 00:07:51,560 --> 00:07:53,640 Speaker 2: were part of, but I think it was published there, 170 00:07:53,640 --> 00:07:57,800 Speaker 2: maybe somewhere else. Like the psychological damage that people who 171 00:07:57,800 --> 00:08:01,000 Speaker 2: weren't at BuzzFeed were undergoing at the time, because everyone 172 00:08:01,080 --> 00:08:04,200 Speaker 2: just saw BuzzFeed as like this huge internet juggernaut, 173 00:08:04,200 --> 00:08:06,280 Speaker 2: and if you weren't there, like you weren't cool and 174 00:08:06,320 --> 00:08:08,360 Speaker 2: you weren't part of it. Like, I think it's actually crazy.
175 00:08:08,400 --> 00:08:11,200 Speaker 2: The degree to which people who weren't there kind of got 176 00:08:11,240 --> 00:08:13,280 Speaker 2: broken by that. 177 00:08:12,560 --> 00:08:16,880 Speaker 3: Broken by BuzzFeed. Okay, so we've been reminiscing about 178 00:08:16,960 --> 00:08:20,440 Speaker 3: the good old Internet of like twenty thirteen. Can we 179 00:08:20,440 --> 00:08:21,680 Speaker 3: talk about what's happening now? 180 00:08:22,240 --> 00:08:24,800 Speaker 4: Yeah. So I think we're sort of undergoing a process 181 00:08:24,840 --> 00:08:27,600 Speaker 4: that I think of as tiktokization of all these platforms. 182 00:08:27,720 --> 00:08:30,720 Speaker 4: I mean, you know, like carcinization, the idea that like 183 00:08:30,920 --> 00:08:34,760 Speaker 4: creatures that live deep deep deep underwater tend to evolve 184 00:08:34,800 --> 00:08:38,200 Speaker 4: into crab like shapes even though they're genetically quite distinct. 185 00:08:38,240 --> 00:08:40,200 Speaker 2: That goes viral on Twitter every once in a while. Yeah, exactly. 186 00:08:40,200 --> 00:08:44,080 Speaker 4: The evolutionary pressures basically turned them all into like hard shelled, 187 00:08:44,240 --> 00:08:47,640 Speaker 4: you know, multi limbed, claw bearing animals. And I think 188 00:08:47,679 --> 00:08:51,040 Speaker 4: there's a similar process happening online where every platform is 189 00:08:51,040 --> 00:08:53,800 Speaker 4: turning itself into TikTok. I mean, carcinization is like a 190 00:08:53,800 --> 00:08:55,559 Speaker 4: fun metaphor. It might not be quite right. It might 191 00:08:55,600 --> 00:08:57,559 Speaker 4: just be people copying what they see as the most 192 00:08:57,600 --> 00:09:01,120 Speaker 4: successful and biggest platform.
But when I say tiktokization, what 193 00:09:01,160 --> 00:09:03,880 Speaker 4: I mean is a focus on content from strangers and 194 00:09:03,920 --> 00:09:06,720 Speaker 4: people you don't know, rather than your sort of social networks. 195 00:09:07,080 --> 00:09:10,640 Speaker 4: A tendency toward, like, a sort of scrollable video feed that 196 00:09:10,679 --> 00:09:13,920 Speaker 4: you just bounce through really quickly. And you know, like 197 00:09:14,000 --> 00:09:15,960 Speaker 4: when I say content from strangers, this is all based 198 00:09:15,960 --> 00:09:19,079 Speaker 4: around the FYP, like a really, you know, heavily weighted 199 00:09:19,080 --> 00:09:21,160 Speaker 4: algorithm that's supposed to give you exactly what you're looking 200 00:09:22,080 --> 00:09:24,319 Speaker 4: for. It's just the For You page on TikTok. 201 00:09:24,360 --> 00:09:26,760 Speaker 4: Or if you're on Twitter or Instagram — right, right. So if 202 00:09:26,760 --> 00:09:28,319 Speaker 4: you're on Twitter, there's a For You 203 00:09:28,480 --> 00:09:31,200 Speaker 4: feed which is effectively the same deal. It sends you 204 00:09:31,440 --> 00:09:34,200 Speaker 4: what's going viral on the platform. If you scroll past 205 00:09:34,200 --> 00:09:36,559 Speaker 4: a college football tweet, you will suddenly see twenty five 206 00:09:36,600 --> 00:09:38,920 Speaker 4: college football tweets in your feed.
And then the other 207 00:09:38,920 --> 00:09:40,560 Speaker 4: big thing, which I think is really important to this, 208 00:09:40,760 --> 00:09:43,200 Speaker 4: is that the platforms that are heading in the TikTok 209 00:09:43,240 --> 00:09:46,760 Speaker 4: direction tend to pay posters directly for engagement, which we 210 00:09:46,840 --> 00:09:49,679 Speaker 4: know about from Twitter, especially the blue check program and 211 00:09:50,040 --> 00:09:52,160 Speaker 4: the sort of general, like, Twitter program where they will 212 00:09:52,360 --> 00:09:54,200 Speaker 4: pay you out if you have particularly engaging posts. 213 00:09:54,200 --> 00:09:57,679 Speaker 2: Would you show your FYP to your friends and family? 214 00:09:57,840 --> 00:10:01,080 Speaker 2: My Twitter FYP or my— Anyone, anyone. 215 00:10:01,320 --> 00:10:03,200 Speaker 4: I think I would. I mean, I find it — it 216 00:10:03,320 --> 00:10:04,920 Speaker 4: reveals me to be a much — I brought up college 217 00:10:04,920 --> 00:10:06,760 Speaker 4: football because it turns out I'm a much bigger sports 218 00:10:06,800 --> 00:10:09,280 Speaker 4: fan than I realized I was. Like, all of my FYPs 219 00:10:08,880 --> 00:10:11,160 Speaker 3: Are like ninety percent that, or the algo has just kind of pegged you 220 00:10:11,320 --> 00:10:12,440 Speaker 3: as like a white male. 221 00:10:12,559 --> 00:10:15,680 Speaker 2: See, I recently searched like a health related thing, and 222 00:10:15,720 --> 00:10:19,680 Speaker 2: then my FYP was like disgusting after. I'm not 223 00:10:19,679 --> 00:10:20,319 Speaker 2: going to talk about it. 224 00:10:20,320 --> 00:10:22,800 Speaker 3: This is one of the problems with algos.
225 00:10:22,880 --> 00:10:25,280 Speaker 3: It's like when you first sign up to a platform, 226 00:10:25,360 --> 00:10:27,480 Speaker 3: I find there is so much pressure on the first 227 00:10:27,480 --> 00:10:29,560 Speaker 3: thing you click, because that's basically going to show up 228 00:10:29,600 --> 00:10:31,959 Speaker 3: in your algorithm forever, it feels like. 229 00:10:32,240 --> 00:10:34,319 Speaker 4: Yeah, I mean, and I find myself very consciously, if 230 00:10:34,360 --> 00:10:36,679 Speaker 4: I have accidentally gotten into a place I really don't 231 00:10:36,720 --> 00:10:39,400 Speaker 4: want to be — you know, politics on Twitter in particular 232 00:10:39,520 --> 00:10:41,240 Speaker 4: is a bad situation for this. If you if you 233 00:10:41,240 --> 00:10:42,840 Speaker 4: go down the wrong rabbit hole, the next thing you know, 234 00:10:42,840 --> 00:10:45,160 Speaker 4: your FYP is just filled with guys you never want 235 00:10:45,200 --> 00:10:46,959 Speaker 4: to hear from and don't ever want to see again. 236 00:10:47,160 --> 00:10:49,600 Speaker 4: And so I will really consciously go and find something 237 00:10:49,600 --> 00:10:52,120 Speaker 4: else to like scroll through, click on, you know, like, 238 00:10:52,280 --> 00:10:54,920 Speaker 4: weight it over so that I can cleanse my FYP of 239 00:10:54,960 --> 00:10:55,439 Speaker 4: all this stuff. 240 00:10:55,600 --> 00:10:59,000 Speaker 3: Yeah, so one thing I don't get about this recent development — 241 00:10:59,080 --> 00:11:01,240 Speaker 3: And by the way, I don't — so I'm not on TikTok. 242 00:11:01,360 --> 00:11:04,840 Speaker 3: I rely on samro to curate the best of TikTok 243 00:11:04,920 --> 00:11:08,520 Speaker 3: and post it on Instagram for me.
But like, why 244 00:11:08,600 --> 00:11:11,880 Speaker 3: the emphasis on random content from algos? Because when I 245 00:11:11,880 --> 00:11:14,960 Speaker 3: think back to the essence of a social network, the 246 00:11:15,040 --> 00:11:17,840 Speaker 3: idea was that you had an actual social network of 247 00:11:17,880 --> 00:11:19,920 Speaker 3: people that you know that you want to hear from. 248 00:11:20,240 --> 00:11:22,719 Speaker 4: I mean, I think what the big platforms found was 249 00:11:22,760 --> 00:11:25,360 Speaker 4: that people actually didn't really want to hear from their 250 00:11:25,400 --> 00:11:28,280 Speaker 4: friends and family. I mean, we know this as a phenomenon. 251 00:11:28,320 --> 00:11:30,520 Speaker 4: I think we can all acknowledge it being a phenomenon 252 00:11:30,559 --> 00:11:33,240 Speaker 4: on both Facebook and Twitter, where you had this idea 253 00:11:33,360 --> 00:11:36,760 Speaker 4: of context collapse, where you would be posting something to 254 00:11:37,000 --> 00:11:40,720 Speaker 4: an imagined audience of your friends, say, but your aunts 255 00:11:40,760 --> 00:11:43,319 Speaker 4: and uncles and everybody else is also on the same platform, 256 00:11:43,400 --> 00:11:45,600 Speaker 4: and all of a sudden, the things you feel comfortable 257 00:11:45,640 --> 00:11:47,040 Speaker 4: saying in front of your friends and in front of 258 00:11:47,080 --> 00:11:49,600 Speaker 4: your family might be different, and they might get in 259 00:11:49,640 --> 00:11:51,760 Speaker 4: fights in the comments, and it might be awkward and strange. 260 00:11:52,040 --> 00:11:55,920 Speaker 4: And I think that Facebook kind of recognized this. Twitter 261 00:11:55,960 --> 00:11:57,319 Speaker 4: is a slightly different thing because, I think, it 262 00:11:57,360 --> 00:11:59,400 Speaker 4: was pretty rare to sort of follow your family on Twitter.
263 00:11:59,440 --> 00:12:01,640 Speaker 4: But you know, there's a reason for that, and I 264 00:12:01,679 --> 00:12:04,240 Speaker 4: think TikTok coming through and showing that actually there was 265 00:12:04,280 --> 00:12:09,000 Speaker 4: a huge opportunity in these sort of fully programmatic, algorithmic, 266 00:12:09,280 --> 00:12:12,840 Speaker 4: non social networking feeds sort of pushed Facebook in particular, 267 00:12:12,920 --> 00:12:15,800 Speaker 4: because remember Meta owns Facebook and Instagram, and it sort 268 00:12:15,840 --> 00:12:18,520 Speaker 4: of controls a huge amount of what platforms are doing, 269 00:12:18,600 --> 00:12:20,560 Speaker 4: and at a certain moment decided that they were going to 270 00:12:20,559 --> 00:12:23,240 Speaker 4: start pushing a very similar kind of structure. 271 00:12:38,120 --> 00:12:40,760 Speaker 2: So can we talk about Twitter or X for a second, 272 00:12:40,840 --> 00:12:42,800 Speaker 2: because that's where I started and I'm still to this 273 00:12:42,920 --> 00:12:46,040 Speaker 2: day on Twitter like a lot, although actually I think 274 00:12:46,040 --> 00:12:48,120 Speaker 2: I post a little bit less than I used to, 275 00:12:48,240 --> 00:12:49,280 Speaker 2: especially on weekends. 276 00:12:49,320 --> 00:12:50,480 Speaker 3: Are you monetizing yet? 277 00:12:50,559 --> 00:12:53,000 Speaker 2: I am not. I do not think that — I don't know. 278 00:12:53,040 --> 00:12:55,120 Speaker 2: I just did my ethics training here at Bloomberg and 279 00:12:55,160 --> 00:12:57,880 Speaker 2: it didn't say anything about that specifically, but I'm just 280 00:12:57,920 --> 00:13:00,480 Speaker 2: probably not going to touch that.
You know, there was 281 00:13:00,520 --> 00:13:03,040 Speaker 2: a point, I would say, I don't know, seven or 282 00:13:03,040 --> 00:13:05,720 Speaker 2: eight years ago, maybe a little longer, where — it never 283 00:13:05,800 --> 00:13:08,760 Speaker 2: felt that — like, Twitter just never had the same size 284 00:13:08,800 --> 00:13:11,360 Speaker 2: really as some of the other big social networks, particularly 285 00:13:11,360 --> 00:13:14,640 Speaker 2: Instagram or Facebook. But I always felt like all reporters 286 00:13:14,760 --> 00:13:17,000 Speaker 2: are on there. You know, other people have said this: 287 00:13:17,080 --> 00:13:19,840 Speaker 2: Twitter is like the assignment desk for much of the Internet. 288 00:13:20,120 --> 00:13:22,880 Speaker 2: And I remember, like at one point, I don't know, watching 289 00:13:22,880 --> 00:13:26,240 Speaker 2: the Summer Olympics one year, and basically the news 290 00:13:26,360 --> 00:13:29,040 Speaker 2: was like, and then this basketball player tweeted this from 291 00:13:29,120 --> 00:13:31,240 Speaker 2: the city, and it's like, Oh, the future of TV 292 00:13:31,400 --> 00:13:34,920 Speaker 2: news is actually people reading tweets on air. Is Twitter 293 00:13:35,040 --> 00:13:38,520 Speaker 2: right now — like setting aside whether it's profitable or good 294 00:13:38,600 --> 00:13:39,760 Speaker 2: or anything — is it important? 295 00:13:40,440 --> 00:13:40,760 Speaker 1: Yes? 296 00:13:41,760 --> 00:13:43,440 Speaker 4: I mean I think it's important in a kind of 297 00:13:43,960 --> 00:13:46,760 Speaker 4: disconnected way, not in the same way that you're talking about, 298 00:13:46,800 --> 00:13:49,000 Speaker 4: where it functioned as a kind of assignment desk.
My 299 00:13:49,000 --> 00:13:51,079 Speaker 4: friend John Herrman, who's the tech writer for New York 300 00:13:51,120 --> 00:13:53,120 Speaker 4: mag and is a great thinker about these things, used 301 00:13:53,120 --> 00:13:55,880 Speaker 4: to call Twitter the context for media, like it was 302 00:13:55,920 --> 00:13:58,199 Speaker 4: where you'd go to understand sort of what was going on. 303 00:13:58,559 --> 00:14:01,840 Speaker 4: And I don't think it functions like that anymore. But 304 00:14:02,080 --> 00:14:04,439 Speaker 4: because of the fact that it's owned by Elon Musk, 305 00:14:04,480 --> 00:14:06,680 Speaker 4: I think you could argue, oh, Elon bought Twitter in 306 00:14:06,800 --> 00:14:09,480 Speaker 4: order to make himself important to politics. But in a 307 00:14:09,480 --> 00:14:12,800 Speaker 4: funny way, I think Elon, as an incredibly influential and 308 00:14:13,080 --> 00:14:16,080 Speaker 4: like hugely well resourced figure, owning the social network means 309 00:14:16,120 --> 00:14:18,600 Speaker 4: that what happens on it is important insofar 310 00:14:18,640 --> 00:14:21,440 Speaker 4: as it reflects an idea about like where politics is headed. 311 00:14:21,720 --> 00:14:24,480 Speaker 4: But like the way you're talking about reading people's tweets 312 00:14:24,480 --> 00:14:26,520 Speaker 4: out loud, it still happens, sure, but it's just not — 313 00:14:26,640 --> 00:14:27,560 Speaker 4: it's not the same thing. 314 00:14:27,960 --> 00:14:30,120 Speaker 3: So one thing I wanted to bring up is, I 315 00:14:30,120 --> 00:14:33,080 Speaker 3: guess, breaking news on Twitter, because that seems to have 316 00:14:33,160 --> 00:14:36,120 Speaker 3: changed quite a bit.
And I do remember again the 317 00:14:36,160 --> 00:14:38,120 Speaker 3: good old days when you would go on Twitter when 318 00:14:38,160 --> 00:14:40,720 Speaker 3: something is happening, you would search and you would get 319 00:14:40,760 --> 00:14:43,800 Speaker 3: like a lot of different opinions, maybe little anecdotes on 320 00:14:43,880 --> 00:14:48,040 Speaker 3: the ground, reportage, things like that. What happens with breaking 321 00:14:48,080 --> 00:14:48,640 Speaker 3: news now? 322 00:14:49,120 --> 00:14:52,760 Speaker 4: Well, I mean, talking about the FYP — because it's not chronological, 323 00:14:52,800 --> 00:14:55,080 Speaker 4: because it's really kind of anti chronological. You're going to 324 00:14:55,120 --> 00:14:56,760 Speaker 4: see stuff — if you sign on to Twitter and you 325 00:14:56,800 --> 00:14:59,040 Speaker 4: look at not your following list or any of the lists, 326 00:14:59,040 --> 00:15:00,760 Speaker 4: if you just look at the default feed, which is 327 00:15:00,800 --> 00:15:04,040 Speaker 4: your FYP, you're going to get tweets from twelve hours ago, 328 00:15:04,200 --> 00:15:06,920 Speaker 4: from twenty hours ago, from five minutes ago, all jammed 329 00:15:07,000 --> 00:15:08,680 Speaker 4: up against one another. You're going to get a huge 330 00:15:08,720 --> 00:15:11,680 Speaker 4: number of people who are putting themselves out there, not 331 00:15:11,960 --> 00:15:16,840 Speaker 4: for any kind of, let's say, noble informational value, yeah, 332 00:15:16,880 --> 00:15:20,600 Speaker 4: but specifically to get followers, to get engagement payouts. 333 00:15:20,920 --> 00:15:25,040 Speaker 4: So you're incentivizing hoaxes. You're sort of diminishing the value 334 00:15:25,080 --> 00:15:27,800 Speaker 4: of the feed itself, and so what you get — I mean, 335 00:15:27,840 --> 00:15:29,840 Speaker 4: we saw this with the most recent hurricanes.
This is 336 00:15:29,840 --> 00:15:31,360 Speaker 4: a, you know, I wrote about this for my newsletter, 337 00:15:31,360 --> 00:15:33,760 Speaker 4: that this was a thing where Twitter used to be 338 00:15:33,880 --> 00:15:36,840 Speaker 4: the place you would go check for an ongoing news 339 00:15:36,840 --> 00:15:39,960 Speaker 4: event like that, and consistently it was. For all the 340 00:15:39,960 --> 00:15:41,560 Speaker 4: things you can say that were bad about Twitter, it 341 00:15:41,600 --> 00:15:44,360 Speaker 4: was a great place to find people like local reporters 342 00:15:44,400 --> 00:15:46,240 Speaker 4: in the area or even people who were like living 343 00:15:46,320 --> 00:15:48,800 Speaker 4: through a disaster event like that. You would find images 344 00:15:48,840 --> 00:15:51,280 Speaker 4: and video, you'd have one hundred different meteorologists who were 345 00:15:51,280 --> 00:15:54,400 Speaker 4: able to help you contextualize all these things. And I 346 00:15:54,440 --> 00:15:55,960 Speaker 4: wasn't able to find any of that, really, I mean, 347 00:15:55,960 --> 00:15:57,440 Speaker 4: it was there to some extent. I don't want to 348 00:15:57,480 --> 00:16:01,240 Speaker 4: pretend that there weren't meteorologists, say, being helpful, or local reporters, 349 00:16:01,240 --> 00:16:03,920 Speaker 4: but it was just drowned in this flood of influencers 350 00:16:03,960 --> 00:16:06,280 Speaker 4: and hoaxers and people just trying to get attention for 351 00:16:06,360 --> 00:16:08,720 Speaker 4: various reasons, in a way that felt sort of impossible 352 00:16:08,760 --> 00:16:10,920 Speaker 4: to learn anything real or true about what was happening. 353 00:16:11,040 --> 00:16:11,080 Speaker 1: No. 354 00:16:11,240 --> 00:16:14,960 Speaker 2: I found this true.
In fact, on the most recent hurricane, 355 00:16:15,360 --> 00:16:18,400 Speaker 2: the one that was heading right near Tampa Bay, I 356 00:16:18,520 --> 00:16:22,280 Speaker 2: realized that I had been on Twitter all morning, then 357 00:16:22,400 --> 00:16:24,640 Speaker 2: like through the night, and I didn't actually know 358 00:16:24,960 --> 00:16:26,960 Speaker 2: what had happened. And then the only place that I 359 00:16:27,000 --> 00:16:30,120 Speaker 2: found the sort of the preliminary assessments was Bloomberg dot com, 360 00:16:30,440 --> 00:16:32,480 Speaker 2: and I thought that was like really striking, like I 361 00:16:32,560 --> 00:16:34,920 Speaker 2: just didn't get any value. But you know the other thing, 362 00:16:35,120 --> 00:16:38,600 Speaker 2: and you mentioned hoaxes, and I guess, like in twenty 363 00:16:38,640 --> 00:16:41,680 Speaker 2: sixteen people started calling it fake news, but like the 364 00:16:41,800 --> 00:16:45,480 Speaker 2: degree of like fiction or fantasy, you know, you'd like 365 00:16:45,480 --> 00:16:48,760 Speaker 2: see these tweets that would go viral for like FEMA 366 00:16:48,840 --> 00:16:52,120 Speaker 2: is on the ground aiming snipers at anyone who's trying, 367 00:16:52,160 --> 00:16:54,600 Speaker 2: and they've been told that they're not allowed to return 368 00:16:54,680 --> 00:16:57,400 Speaker 2: to their house and their private property is now the 369 00:16:57,400 --> 00:16:59,880 Speaker 2: property of the federal government and all that, like truly 370 00:17:00,160 --> 00:17:03,840 Speaker 2: like beyond fiction type stuff, and then it'll have like 371 00:17:03,880 --> 00:17:07,600 Speaker 2: fifteen thousand retweets.
And I'm curious if you have a 372 00:17:07,640 --> 00:17:10,560 Speaker 2: theory for that, because to me it feels like part 373 00:17:10,600 --> 00:17:13,280 Speaker 2: of what's going on is there's the engagement bait, there's maybe 374 00:17:13,320 --> 00:17:15,800 Speaker 2: the desire for blue checks to get payouts. But I 375 00:17:15,840 --> 00:17:20,159 Speaker 2: also like worry that basically sort of real facts that 376 00:17:20,240 --> 00:17:23,960 Speaker 2: exist in the world are just becoming like a category 377 00:17:23,960 --> 00:17:25,800 Speaker 2: of facts. And then there's others, but these are just 378 00:17:25,800 --> 00:17:28,280 Speaker 2: as good and these are just as interesting. To me, it 379 00:17:28,119 --> 00:17:32,680 Speaker 3: Feels like the algo like can only measure engagement. Basically, 380 00:17:32,720 --> 00:17:36,120 Speaker 3: it can't measure quality of the tweet, right, So if 381 00:17:36,160 --> 00:17:38,960 Speaker 3: someone is doing engagement bait, if they're just making people 382 00:17:39,000 --> 00:17:41,760 Speaker 3: angry or they're just tweeting out something that's wrong and 383 00:17:41,760 --> 00:17:44,199 Speaker 3: then a bunch of people come on and start like 384 00:17:44,280 --> 00:17:46,800 Speaker 3: pointing out that it's wrong, it still counts as engagement. 385 00:17:47,080 --> 00:17:48,560 Speaker 2: And I guess what I would also say is like 386 00:17:48,600 --> 00:17:51,040 Speaker 2: I'm not even sure people care that it's fact. Like 387 00:17:51,040 --> 00:17:53,040 Speaker 2: the idea, like there may have been one point where, like, 388 00:17:53,359 --> 00:17:55,959 Speaker 2: you know, I don't believe traditional media. I believe these 389 00:17:56,000 --> 00:17:59,200 Speaker 2: alternative sources for various reasons.
Some of them actually might 390 00:17:59,240 --> 00:18:01,960 Speaker 2: be legitimate in my view, but actually the idea that like 391 00:18:02,280 --> 00:18:04,720 Speaker 2: this is an actual fact to be believed, I'm not 392 00:18:04,840 --> 00:18:06,160 Speaker 2: convinced is that important to people. 393 00:18:06,160 --> 00:18:07,600 Speaker 4: I mean, I think there's an attitude a lot of 394 00:18:07,600 --> 00:18:09,639 Speaker 4: people have on Twitter. I mean, there's two things I'll say. 395 00:18:09,680 --> 00:18:10,840 Speaker 4: One is that I think there's an attitude a lot 396 00:18:10,880 --> 00:18:13,080 Speaker 4: of people have on Twitter, on social media, but especially 397 00:18:13,119 --> 00:18:16,399 Speaker 4: on Twitter, that you're not actually engaged in describing the 398 00:18:16,440 --> 00:18:19,399 Speaker 4: world as it is, in communicating information to other people. You're 399 00:18:19,400 --> 00:18:23,200 Speaker 4: engaged in a fight or a war, not between political parties, 400 00:18:23,200 --> 00:18:25,800 Speaker 4: I suppose, but between sort of political factions, and therefore 401 00:18:26,080 --> 00:18:27,960 Speaker 4: it doesn't have to have any relationship to truth. You're 402 00:18:27,960 --> 00:18:31,160 Speaker 4: just out there to own the other guys, to rally 403 00:18:31,160 --> 00:18:33,520 Speaker 4: people to your side, to do whatever. And I think 404 00:18:33,560 --> 00:18:36,639 Speaker 4: that the changes to the sort of algorithm and to 405 00:18:36,680 --> 00:18:40,800 Speaker 4: the structure of the platform have exacerbated this tendency because again, Twitter, 406 00:18:40,960 --> 00:18:43,200 Speaker 4: over its first decade or so, sort of realized that 407 00:18:43,240 --> 00:18:46,359 Speaker 4: the edge it had over other platforms was news.
It 408 00:18:46,359 --> 00:18:49,520 Speaker 4: always thought of itself as a social network structured around news, 409 00:18:49,720 --> 00:18:52,520 Speaker 4: and this was like, ultimately, not a great way to 410 00:18:52,520 --> 00:18:54,760 Speaker 4: structure themselves in terms of like revenue and profits and 411 00:18:54,800 --> 00:18:56,680 Speaker 4: growth and whatever else. But it was pretty good for 412 00:18:56,720 --> 00:18:58,399 Speaker 4: getting a lot of journalists to pay attention and to 413 00:18:58,440 --> 00:19:01,959 Speaker 4: share honestly and earnestly, and to sort of assess 414 00:19:01,960 --> 00:19:04,160 Speaker 4: the truth and fact value. And now it's like more 415 00:19:04,200 --> 00:19:06,240 Speaker 4: like a political, like, among other things, a sort of 416 00:19:06,240 --> 00:19:10,040 Speaker 4: political arena for people with not great relationships with the truth 417 00:19:10,080 --> 00:19:11,720 Speaker 4: to sort of like fight with each other. 418 00:19:11,840 --> 00:19:15,399 Speaker 2: On this point real quickly, like I'm old too, so 419 00:19:15,440 --> 00:19:17,639 Speaker 2: I don't, I'm not on TikTok. Would you 420 00:19:17,640 --> 00:19:20,480 Speaker 2: say the other platforms have emerged in the same way, 421 00:19:20,680 --> 00:19:24,080 Speaker 2: where people view them as an arena for fighting, yeah, 422 00:19:24,400 --> 00:19:25,800 Speaker 2: or for pushing a faction? 423 00:19:26,200 --> 00:19:28,040 Speaker 4: Yeah, to some extent. I mean, it's 424 00:19:28,080 --> 00:19:29,679 Speaker 4: not quite the same way as it is on Twitter, 425 00:19:29,680 --> 00:19:31,760 Speaker 4: which has, because of its news background, a sort of 426 00:19:31,800 --> 00:19:34,879 Speaker 4: political background.
And Twitter still has to some extent a 427 00:19:34,960 --> 00:19:36,920 Speaker 4: kind of, I think because of its size and its 428 00:19:36,920 --> 00:19:39,919 Speaker 4: structure, a kind of central sense of what people are 429 00:19:39,920 --> 00:19:41,840 Speaker 4: talking about and how to join it, and TikTok can 430 00:19:41,840 --> 00:19:45,360 Speaker 4: feel much more kind of open, and obviously on Instagram 431 00:19:45,359 --> 00:19:47,200 Speaker 4: and Facebook the idea that there's like a trend, a 432 00:19:47,280 --> 00:19:49,040 Speaker 4: set of topics that people are talking about, is just 433 00:19:49,080 --> 00:19:51,080 Speaker 4: non-existent. The other thing I would say about this 434 00:19:51,240 --> 00:19:53,720 Speaker 4: is that one of the sort of industry-wide trends, 435 00:19:53,800 --> 00:19:55,679 Speaker 4: really obviously in the case of Twitter, but it's 436 00:19:55,720 --> 00:19:58,600 Speaker 4: happening at Meta and Facebook too, is a lot of 437 00:19:58,600 --> 00:20:00,960 Speaker 4: people on content moderation teams are getting laid off, people 438 00:20:01,240 --> 00:20:04,040 Speaker 4: whose job is to try and sort of limit un 439 00:20:04,160 --> 00:20:07,000 Speaker 4: true information, to make judgment calls about what is true 440 00:20:07,000 --> 00:20:10,280 Speaker 4: and what isn't true. I think Elon's kind of desire 441 00:20:10,359 --> 00:20:12,480 Speaker 4: to blow that up on Twitter, because he seemed to 442 00:20:12,480 --> 00:20:15,080 Speaker 4: believe that it was overly biased and incorrect and all 443 00:20:15,080 --> 00:20:18,280 Speaker 4: these things, has opened up space for Meta, for example, 444 00:20:18,320 --> 00:20:20,760 Speaker 4: to lay off a ton of its own content moderators.
445 00:20:20,920 --> 00:20:22,840 Speaker 4: And so I think that, you know, for all that 446 00:20:23,200 --> 00:20:26,560 Speaker 4: people love to complain about Facebook's inconsistent moderation and 447 00:20:26,600 --> 00:20:28,960 Speaker 4: the bad moderation, all this, there were people doing real 448 00:20:29,040 --> 00:20:32,119 Speaker 4: jobs at trying to make the platforms like enjoyable to 449 00:20:32,280 --> 00:20:34,280 Speaker 4: use and not just filled with garbage all the time. 450 00:20:34,520 --> 00:20:36,760 Speaker 4: Those people often don't work at those companies anymore, or 451 00:20:36,760 --> 00:20:39,560 Speaker 4: they're overwhelmed, and there's not a huge amount of will, 452 00:20:39,600 --> 00:20:41,840 Speaker 4: from what I understand, to sort of bring that back. 453 00:20:41,840 --> 00:20:45,960 Speaker 3: Just going back to people fighting on platforms, I 454 00:20:46,000 --> 00:20:48,359 Speaker 3: will say there is a lot of low level drama 455 00:20:48,400 --> 00:20:50,880 Speaker 3: on TikTok that I kind of enjoy. And I don't 456 00:20:50,880 --> 00:20:53,920 Speaker 3: watch TikTok directly, but I do watch the YouTube summaries 457 00:20:54,040 --> 00:20:58,280 Speaker 3: of like Internet scandals, like the Bridgerton party that went 458 00:20:58,359 --> 00:21:01,080 Speaker 3: wrong and things like that. I do enjoy those. 459 00:21:01,280 --> 00:21:03,720 Speaker 4: Yeah, it's less political, but definitely, because TikTok has the 460 00:21:03,800 --> 00:21:06,760 Speaker 4: reply feature where you stitch yourself into another person's video, 461 00:21:06,800 --> 00:21:09,320 Speaker 4: there's a lot of responses and trying to coast off 462 00:21:09,320 --> 00:21:12,080 Speaker 4: of the viral success of somebody else or something. So 463 00:21:12,280 --> 00:21:14,520 Speaker 4: you're absolutely right, that's definitely a huge part of it. 464 00:21:15,080 --> 00:21:16,639 Speaker 3: Are we going to be replaced by AI?
465 00:21:17,920 --> 00:21:19,800 Speaker 4: I don't think so, but I think a lot of 466 00:21:19,800 --> 00:21:21,960 Speaker 4: posts are going to be replaced by AI. We're already 467 00:21:21,960 --> 00:21:25,360 Speaker 4: seeing a lot of stuff on Facebook, on Twitter, even 468 00:21:25,400 --> 00:21:28,439 Speaker 4: on TikTok and Instagram that is basically AI generated that 469 00:21:28,480 --> 00:21:30,080 Speaker 4: seems to have no real, really. 470 00:21:30,080 --> 00:21:33,200 Speaker 3: Oh yeah, the Facebook images, those are crazy. 471 00:21:33,359 --> 00:21:33,560 Speaker 2: Yeah. 472 00:21:33,600 --> 00:21:36,000 Speaker 4: The weird Facebook AI slop. If you haven't seen these 473 00:21:36,000 --> 00:21:39,040 Speaker 4: on Facebook, there's a great Twitter account called Weird Facebook 474 00:21:39,040 --> 00:21:41,240 Speaker 4: AI Slop, which I recently wrote a piece about for 475 00:21:41,280 --> 00:21:43,640 Speaker 4: New York magazine. This is a case where 476 00:21:43,920 --> 00:21:46,240 Speaker 4: Facebook has started paying out sort of creator bonuses to 477 00:21:46,280 --> 00:21:49,000 Speaker 4: people who get a lot of engagement, and from what 478 00:21:49,040 --> 00:21:51,600 Speaker 4: I can tell, it's not necessarily worth it for like 479 00:21:51,640 --> 00:21:54,359 Speaker 4: an American who has an opportunity to make American minimum 480 00:21:54,400 --> 00:21:55,960 Speaker 4: wage to try this out as a full time job. 481 00:21:56,240 --> 00:21:57,560 Speaker 4: But you have a lot of people all over the 482 00:21:57,560 --> 00:22:00,840 Speaker 4: world for whom the payouts actually are pretty meaningful, and 483 00:22:00,880 --> 00:22:04,600 Speaker 4: who can figure out ways to juice engagement on Facebook 484 00:22:04,640 --> 00:22:06,960 Speaker 4: groups or on the posts they're making in order to 485 00:22:07,040 --> 00:22:09,560 Speaker 4: get payouts.
So I talked to a guy in Kenya, 486 00:22:09,600 --> 00:22:13,320 Speaker 4: for example, who had started a bunch of pages. He 487 00:22:13,359 --> 00:22:17,320 Speaker 4: told me that his most engaging topics were Jesus, pets 488 00:22:17,320 --> 00:22:20,600 Speaker 4: and animals, the US military, and Manchester United. Those were like his 489 00:22:20,720 --> 00:22:23,240 Speaker 4: four big engagement subjects. And he goes in and he 490 00:22:23,359 --> 00:22:26,240 Speaker 4: just makes these AI images, these sort of rococo, 491 00:22:26,560 --> 00:22:30,520 Speaker 4: sentimental Jesus images, and he posts them and tries to 492 00:22:30,560 --> 00:22:32,639 Speaker 4: get, you know, hundreds or thousands of people to like 493 00:22:32,720 --> 00:22:35,040 Speaker 4: and share, and he can make, you know, five hundred 494 00:22:35,320 --> 00:22:37,280 Speaker 4: or more bucks a month off of this, which is 495 00:22:37,280 --> 00:22:39,320 Speaker 4: pretty close to, just above, minimum wage in a lot 496 00:22:39,359 --> 00:22:42,560 Speaker 4: of places in Kenya, and this stuff is kind of 497 00:22:42,560 --> 00:22:45,359 Speaker 4: taking over Facebook. Facebook will tell you it's not, and 498 00:22:45,400 --> 00:22:47,200 Speaker 4: I believe them that in some sense it's not, because 499 00:22:47,200 --> 00:22:50,280 Speaker 4: it's a drop in what is a huge ocean of content. 500 00:22:50,720 --> 00:22:52,720 Speaker 4: But I've talked to a lot of people who found 501 00:22:52,760 --> 00:22:55,399 Speaker 4: that if they click or scroll or check out an 502 00:22:55,440 --> 00:22:58,120 Speaker 4: AI generated image, that in some way and in some sense, 503 00:22:58,160 --> 00:23:00,520 Speaker 4: the Facebook algorithm is really pushing this stuff on people. 504 00:23:00,960 --> 00:23:03,919 Speaker 4: And I mean, I find it like fascinating. It's like 505 00:23:04,040 --> 00:23:05,520 Speaker 4: it's like a car wreck.
You know, you go and 506 00:23:05,560 --> 00:23:06,760 Speaker 4: you can't not look at these things. But it's not 507 00:23:06,760 --> 00:23:09,040 Speaker 4: what I want out of a social network really, you know. 508 00:23:09,400 --> 00:23:12,159 Speaker 2: Yeah, there's tons and tons of slop. I also like 509 00:23:12,240 --> 00:23:16,240 Speaker 2: noticed on both Twitter and LinkedIn replies that I'm like 510 00:23:16,359 --> 00:23:18,640 Speaker 2: sure are AI. Like, I'll like post like a new 511 00:23:18,680 --> 00:23:23,159 Speaker 2: episode about mortgage rates, and then I'll get a reply that, no, 512 00:23:23,280 --> 00:23:25,800 Speaker 2: I wish it were that. It's more like, it's more like, 513 00:23:26,040 --> 00:23:28,520 Speaker 2: thank you for keeping us updated on mortgage rates. Staying 514 00:23:28,560 --> 00:23:31,159 Speaker 2: informed on mortgage rates is really important. And it's like 515 00:23:31,280 --> 00:23:33,880 Speaker 2: kind of at that line where maybe like someone could 516 00:23:33,880 --> 00:23:35,560 Speaker 2: have said that, but I'm not sure. And it's making 517 00:23:35,600 --> 00:23:35,880 Speaker 2: me like. 518 00:23:35,920 --> 00:23:38,560 Speaker 3: Somewhere, there's a poor guy out there who like said 519 00:23:38,600 --> 00:23:42,480 Speaker 3: that in complete seriousness trying to be part of the conversation. 520 00:23:42,640 --> 00:23:45,879 Speaker 2: No, but I'm like not sure anymore, because that's very easy. Like, 521 00:23:45,960 --> 00:23:48,439 Speaker 2: you know, the free version of ChatGPT could come 522 00:23:48,480 --> 00:23:50,880 Speaker 2: up with that quite easily. But going back to whether 523 00:23:50,920 --> 00:23:54,760 Speaker 2: we'll be replaced by AI, like I listened to some 524 00:23:54,880 --> 00:23:57,520 Speaker 2: of the, I think it's called the Google Notebook LM, 525 00:23:57,600 --> 00:24:00,480 Speaker 2: like fake podcast and stuff.
Here's my take on some 526 00:24:00,520 --> 00:24:03,080 Speaker 2: of this stuff. Like, none of it is interesting. Like 527 00:24:03,119 --> 00:24:06,919 Speaker 2: I've never seen like an actually like interesting AI output 528 00:24:07,280 --> 00:24:10,359 Speaker 2: really that could replace a, like, an interesting writer. But 529 00:24:11,119 --> 00:24:13,040 Speaker 2: I didn't hate it. Like, if you like, you know, 530 00:24:13,119 --> 00:24:15,639 Speaker 2: I listened to this podcast that was about some DOE 531 00:24:15,920 --> 00:24:19,159 Speaker 2: nuclear report, and it's like I did not think it 532 00:24:19,200 --> 00:24:21,320 Speaker 2: was a terrible way to consume that content, like if 533 00:24:21,359 --> 00:24:23,160 Speaker 2: we're on the subway or something like that. Like it's 534 00:24:23,200 --> 00:24:25,800 Speaker 2: not that interesting or that good, but it actually like 535 00:24:25,840 --> 00:24:29,000 Speaker 2: sort of summarized it in an audio way. Yeah, it 536 00:24:29,040 --> 00:24:29,719 Speaker 2: was not terrible. 537 00:24:29,880 --> 00:24:32,760 Speaker 4: Yeah, I mean those podcasts are 538 00:24:32,760 --> 00:24:34,639 Speaker 4: probably the most recent thing out of AI that 539 00:24:34,680 --> 00:24:37,240 Speaker 4: I've felt like are cool, that are sort of like, wow, 540 00:24:37,280 --> 00:24:40,080 Speaker 4: this is really cool and like surprising and interesting and 541 00:24:40,119 --> 00:24:42,439 Speaker 4: like I could see this being useful in some ways.
542 00:24:42,520 --> 00:24:45,040 Speaker 4: I mean, I exported a bunch of my group chats, 543 00:24:45,040 --> 00:24:47,119 Speaker 4: like as text, and put them in 544 00:24:47,119 --> 00:24:50,040 Speaker 4: Notebook LM, and then made podcasts about my group chats 545 00:24:50,080 --> 00:24:52,080 Speaker 4: that I had shared with my group chats, which was 546 00:24:52,480 --> 00:24:56,560 Speaker 4: not productive or useful in an economic sense, you know, but 547 00:24:57,000 --> 00:24:59,240 Speaker 4: I loved doing it, you know. I think that there's 548 00:24:59,240 --> 00:25:00,680 Speaker 4: a lot of use for that kind of thing. 549 00:25:01,119 --> 00:25:03,159 Speaker 4: The other thing I'd say is that, you know, I 550 00:25:03,200 --> 00:25:05,280 Speaker 4: think there's a lot of ways that people have talked 551 00:25:05,280 --> 00:25:07,600 Speaker 4: about the relationship of AI to the platforms that we're 552 00:25:07,640 --> 00:25:10,680 Speaker 4: talking about here. And one thing writing and reporting about 553 00:25:10,720 --> 00:25:12,560 Speaker 4: this sort of wave of slop made me realize is 554 00:25:12,600 --> 00:25:16,120 Speaker 4: how much these two technologies are quite complementary to each other. 555 00:25:16,400 --> 00:25:19,960 Speaker 4: That in fact, like on Facebook, you have this effectively 556 00:25:20,000 --> 00:25:23,360 Speaker 4: infinite market for content of basically any kind as long 557 00:25:23,400 --> 00:25:26,240 Speaker 4: as somebody will look at it, and in LLMs you 558 00:25:26,320 --> 00:25:30,639 Speaker 4: have an effectively infinite content generator.
So if I was 559 00:25:31,200 --> 00:25:33,920 Speaker 4: in charge of the Facebook platform or whatever, I would 560 00:25:33,960 --> 00:25:35,960 Speaker 4: have a really hard time figuring out how to make 561 00:25:35,960 --> 00:25:38,399 Speaker 4: a decision about what, you know, like what stuff I 562 00:25:38,440 --> 00:25:40,600 Speaker 4: allow and what stuff I don't allow, because obviously there's 563 00:25:40,640 --> 00:25:42,800 Speaker 4: plenty of ways that a podcast about a DOE energy 564 00:25:42,800 --> 00:25:44,399 Speaker 4: report is, like, I mean, that's not going to go 565 00:25:44,480 --> 00:25:46,879 Speaker 4: viral on Facebook, let's be honest. But there's interesting 566 00:25:46,960 --> 00:25:48,440 Speaker 4: versions of this that you don't want to tell people 567 00:25:48,480 --> 00:25:51,159 Speaker 4: they can't post. But the problem is the threshold for 568 00:25:51,240 --> 00:25:54,200 Speaker 4: creating this stuff is so minimal, and ultimately the value 569 00:25:54,240 --> 00:25:56,240 Speaker 4: add for the ninety nine percent of what you're 570 00:25:56,280 --> 00:25:59,480 Speaker 4: creating with AI stuff is just non-existent. It's hard 571 00:25:59,480 --> 00:26:01,640 Speaker 4: to imagine that the noise isn't gonna drown out 572 00:26:01,640 --> 00:26:02,240 Speaker 4: the signal. 573 00:26:02,000 --> 00:26:20,199 Speaker 3: At some point. There is the dead Internet theory, so 574 00:26:20,280 --> 00:26:22,959 Speaker 3: the idea that eventually it's just going to be bots 575 00:26:23,119 --> 00:26:26,240 Speaker 3: talking to each other and people just aren't going to 576 00:26:26,320 --> 00:26:30,240 Speaker 3: be that engaged in terms of content creation anymore. Does it 577 00:26:30,320 --> 00:26:34,280 Speaker 3: matter who is producing content on the Internet?
Like, assuming 578 00:26:34,359 --> 00:26:37,280 Speaker 3: that we're all addicted to it and we're still going 579 00:26:37,320 --> 00:26:40,760 Speaker 3: to keep looking at it, why does it matter? 580 00:26:41,480 --> 00:26:45,040 Speaker 4: I have a strong feeling that it matters in some 581 00:26:45,119 --> 00:26:48,359 Speaker 4: fundamental way to us that the things we see and 582 00:26:48,480 --> 00:26:52,560 Speaker 4: engage about are human created. I'm trying to think about 583 00:26:52,560 --> 00:26:54,320 Speaker 4: how to articulate this. Maybe put it this way. It 584 00:26:54,320 --> 00:26:56,760 Speaker 4: matters less that the sort of content we consume, like 585 00:26:56,800 --> 00:26:59,639 Speaker 4: the TV shows or the novels or whatever, are created 586 00:26:59,680 --> 00:27:01,920 Speaker 4: by AI than it does that there are other humans 587 00:27:02,160 --> 00:27:04,199 Speaker 4: to talk about that stuff with. Like, I think so 588 00:27:04,320 --> 00:27:08,800 Speaker 4: much of how we consume culture is about our relationship 589 00:27:08,800 --> 00:27:11,040 Speaker 4: to other people and the ability to kind of consume 590 00:27:11,080 --> 00:27:13,600 Speaker 4: it in tandem with other people that we can talk 591 00:27:13,640 --> 00:27:16,840 Speaker 4: about it with. I'm skeptical that we will get the 592 00:27:16,880 --> 00:27:21,560 Speaker 4: same level of satisfaction or like wholesome like happiness out 593 00:27:21,640 --> 00:27:25,240 Speaker 4: of like talking about AI generated content with AI bots, 594 00:27:25,400 --> 00:27:27,920 Speaker 4: waiting for the next AI generated episode of a TV show. 595 00:27:28,119 --> 00:27:29,359 Speaker 4: But it remains to be seen. 596 00:27:29,640 --> 00:27:31,960 Speaker 3: Does it matter to advertisers at all?
Like, if the 597 00:27:32,000 --> 00:27:35,560 Speaker 3: currency of the Internet is eyeballs and they're still getting eyeballs, 598 00:27:35,600 --> 00:27:36,359 Speaker 3: do you think they care? 599 00:27:36,800 --> 00:27:39,040 Speaker 4: Right now? It doesn't seem like they do. I mean, look, 600 00:27:39,080 --> 00:27:41,560 Speaker 4: I think, for me, what is optimistic 601 00:27:41,800 --> 00:27:44,000 Speaker 4: is the idea that, you know, on a 602 00:27:44,080 --> 00:27:46,639 Speaker 4: medium term time horizon, there is a recognition on the 603 00:27:46,640 --> 00:27:49,640 Speaker 4: part of advertisers that these are not like high quality 604 00:27:50,040 --> 00:27:52,000 Speaker 4: views that they're getting on their ads. That if you're 605 00:27:52,119 --> 00:27:54,560 Speaker 4: like scrolling through a bunch of slop, like a lot 606 00:27:54,600 --> 00:27:55,880 Speaker 4: of people are going to have left, that the people 607 00:27:55,880 --> 00:27:57,440 Speaker 4: who are scrolling through it are not looking for the 608 00:27:57,440 --> 00:28:00,280 Speaker 4: stuff they're doing. But right now it doesn't seem like it. 609 00:28:00,280 --> 00:28:02,000 Speaker 4: I mean, they're happy to get their clicks, and you know, 610 00:28:02,040 --> 00:28:04,160 Speaker 4: so much of what Facebook is about is like really 611 00:28:04,200 --> 00:28:07,359 Speaker 4: specific micro targeting for small to medium sized businesses, where 612 00:28:07,520 --> 00:28:09,520 Speaker 4: it doesn't matter if it's all AI slop running, as 613 00:28:09,560 --> 00:28:11,080 Speaker 4: long as you're getting the people who might go to 614 00:28:11,119 --> 00:28:12,520 Speaker 4: your restaurant or whatever it is.
615 00:28:12,840 --> 00:28:16,119 Speaker 2: I mean, everyone knows about the terrible economics for a 616 00:28:16,119 --> 00:28:19,560 Speaker 2: lot of legacy media businesses, the entities that in theory 617 00:28:19,640 --> 00:28:23,360 Speaker 2: try to create factual, real things, et cetera. But there 618 00:28:23,400 --> 00:28:26,600 Speaker 2: are like some things. I mean, you're a sole proprietor 619 00:28:26,800 --> 00:28:32,040 Speaker 2: of a profitable newsletter that people pay for. And there 620 00:28:32,119 --> 00:28:36,600 Speaker 2: are Reddit pages and discord groups that are sort of 621 00:28:36,680 --> 00:28:39,720 Speaker 2: run by a dictatorship, in which, no, unlike, right, like, no, 622 00:28:39,840 --> 00:28:42,040 Speaker 2: there are rules, and you will be banned at whim, 623 00:28:42,080 --> 00:28:44,840 Speaker 2: and if it was a mistake, sorry. But on net, 624 00:28:44,920 --> 00:28:46,600 Speaker 2: like, this is a good thing, to have a very 625 00:28:46,640 --> 00:28:49,240 Speaker 2: heavy hand at it. Is the future of all media 626 00:28:49,400 --> 00:28:53,640 Speaker 2: just, like, narrow subscriptions to actual humans you trust, or 627 00:28:54,080 --> 00:28:57,000 Speaker 2: narrow communities run by humans you trust? 628 00:28:57,320 --> 00:29:00,080 Speaker 4: I think we're basically, we're hollowing out the middle. So 629 00:29:00,160 --> 00:29:01,800 Speaker 4: you have on the one side, I mean, this is 630 00:29:01,800 --> 00:29:03,360 Speaker 4: what it seems like to me right now. It's on 631 00:29:03,360 --> 00:29:05,760 Speaker 4: the one side you have the Times, probably the Wall 632 00:29:05,760 --> 00:29:09,640 Speaker 4: Street Journal, Bloomberg, who can remain afloat as really sort 633 00:29:09,640 --> 00:29:13,160 Speaker 4: of enormous national outlets.
I mean, the Times already employs 634 00:29:13,240 --> 00:29:15,720 Speaker 4: like seven percent of everybody, of all journalists, or something 635 00:29:15,760 --> 00:29:17,239 Speaker 4: like that. I may be making that up, but I'm 636 00:29:17,240 --> 00:29:19,200 Speaker 4: pretty sure I'm not. It's some shocking number of the 637 00:29:19,560 --> 00:29:21,560 Speaker 4: currently employed journalists in the US are employed by the 638 00:29:21,600 --> 00:29:23,320 Speaker 4: New York Times. And then on the other side you 639 00:29:23,360 --> 00:29:26,160 Speaker 4: have Substacks and streamers and the sort of Internet 640 00:29:26,160 --> 00:29:29,640 Speaker 4: personality model of media, which is, for better or for worse, 641 00:29:29,680 --> 00:29:31,960 Speaker 4: where my career has ended up. But this sort of 642 00:29:31,960 --> 00:29:35,280 Speaker 4: big middle of what used to be like a really 643 00:29:35,320 --> 00:29:40,040 Speaker 4: diverse set of magazines, you know, weeklies, even like local 644 00:29:40,120 --> 00:29:41,240 Speaker 4: daily newspapers, you know, it. 645 00:29:41,200 --> 00:29:44,400 Speaker 2: Still exists, as local news networks. Like you'll go to 646 00:29:44,480 --> 00:29:47,080 Speaker 2: like a town or something, like we'll be traveling, 647 00:29:47,120 --> 00:29:49,960 Speaker 2: and there still are like the local NBC affiliate or whatever, 648 00:29:50,000 --> 00:29:51,200 Speaker 2: which is always a surprise to me. 649 00:29:51,240 --> 00:29:53,560 Speaker 4: They're also like surprisingly powerful too. Like the fact that 650 00:29:53,600 --> 00:29:56,200 Speaker 4: Sinclair owns them and is like quite pro Trump has 651 00:29:56,200 --> 00:29:58,440 Speaker 4: always been a little bit interesting to me. But yeah, 652 00:29:58,440 --> 00:30:01,000 Speaker 4: they do still exist.
I mean I wonder like what 653 00:30:01,040 --> 00:30:03,160 Speaker 4: the time horizon is for those, but it seems like 654 00:30:03,200 --> 00:30:05,880 Speaker 4: that seems to be a relatively successful business model 655 00:30:05,920 --> 00:30:06,240 Speaker 4: for now. 656 00:30:06,680 --> 00:30:10,320 Speaker 3: On the personality driven media, one thing I always think 657 00:30:10,520 --> 00:30:13,640 Speaker 3: doesn't get enough attention is that, like, there is a tension 658 00:30:13,760 --> 00:30:17,200 Speaker 3: between telling reporters that they need to develop their own 659 00:30:17,240 --> 00:30:20,880 Speaker 3: brand and audience and do all that. And Joe and 660 00:30:20,920 --> 00:30:23,840 Speaker 3: I have been told that basically our entire careers and 661 00:30:23,880 --> 00:30:28,200 Speaker 3: have tried to do that. But then the legacy newsroom 662 00:30:28,760 --> 00:30:31,160 Speaker 3: doesn't really know how to handle that. Like they don't 663 00:30:31,160 --> 00:30:34,640 Speaker 3: seem to have like a strategy for, if you develop 664 00:30:34,640 --> 00:30:36,440 Speaker 3: your own audience, what do they actually do with it? 665 00:30:36,760 --> 00:30:39,000 Speaker 4: Yeah, I mean we've seen a million examples of this 666 00:30:39,040 --> 00:30:41,800 Speaker 4: over the last few years. Taylor Lorenz, the Internet reporter, 667 00:30:41,840 --> 00:30:44,480 Speaker 4: recently left the Washington Post and sort of essentially said 668 00:30:44,480 --> 00:30:47,120 Speaker 4: this, that she'd built a brand for herself and she 669 00:30:47,280 --> 00:30:50,400 Speaker 4: was running up against the limits of what the Washington 670 00:30:50,440 --> 00:30:52,640 Speaker 4: Post was willing to let her do or allow. And 671 00:30:52,680 --> 00:30:54,640 Speaker 4: I think that a lot of the tension that we've 672 00:30:54,680 --> 00:30:56,800 Speaker 4: seen in newsrooms.
I'm sure a lot of people have 673 00:30:56,840 --> 00:30:59,640 Speaker 4: read about tension inside The Times, for example, you know, 674 00:30:59,680 --> 00:31:03,360 Speaker 4: since twenty twenty or so, between rank and file journalists 675 00:31:03,360 --> 00:31:06,800 Speaker 4: and management. I think that this sort of star economy 676 00:31:06,840 --> 00:31:09,280 Speaker 4: is a bigger driver of that tension than 677 00:31:09,320 --> 00:31:12,880 Speaker 4: people maybe recognize or understand. Both in the sense that, 678 00:31:13,000 --> 00:31:14,960 Speaker 4: you know, if you are a working journalist and you're 679 00:31:15,000 --> 00:31:17,760 Speaker 4: watching people get a lot of leeway and a lot 680 00:31:17,760 --> 00:31:20,200 Speaker 4: of rope from management because they happen to be big stars, 681 00:31:20,240 --> 00:31:22,640 Speaker 4: you might start to feel frustrated. And also because you 682 00:31:22,680 --> 00:31:25,840 Speaker 4: have people who have huge followings on Twitter or Instagram, say, 683 00:31:25,880 --> 00:31:27,520 Speaker 4: who know that they have an audience, know that they 684 00:31:27,560 --> 00:31:30,560 Speaker 4: have some leverage with their bosses because they're bringing people 685 00:31:30,640 --> 00:31:32,600 Speaker 4: to the Times, and maybe they should just walk away, 686 00:31:32,880 --> 00:31:34,400 Speaker 4: you know. Again, like I said, I wouldn't want to 687 00:31:34,400 --> 00:31:35,840 Speaker 4: be in charge of Facebook. I'm not sure I'd want 688 00:31:35,840 --> 00:31:38,520 Speaker 4: to be in charge of a big traditional media organization either.
689 00:31:38,640 --> 00:31:40,520 Speaker 2: I remember, like a couple of years ago at some 690 00:31:40,560 --> 00:31:43,280 Speaker 2: point when people were like convinced that Twitter servers were 691 00:31:43,280 --> 00:31:46,200 Speaker 2: about to crash, and I remember like people talking about, 692 00:31:46,240 --> 00:31:47,880 Speaker 2: you know, they would try to find some other site, 693 00:31:47,920 --> 00:31:50,120 Speaker 2: and a few people used the word lifeboat, which I 694 00:31:50,120 --> 00:31:53,960 Speaker 2: thought was really interesting because it implies to my mind 695 00:31:54,160 --> 00:31:57,240 Speaker 2: that a journalist's social media profile is the thing that 696 00:31:57,280 --> 00:32:00,840 Speaker 2: they hold on to when their organization sinks. So Twitter is 697 00:32:00,840 --> 00:32:04,080 Speaker 2: a lifeboat. If you're at like some newspaper, Twitter is 698 00:32:04,080 --> 00:32:06,920 Speaker 2: your lifeboat because your newspaper might go down, but you 699 00:32:06,960 --> 00:32:09,400 Speaker 2: still exist as an entity and that could take you 700 00:32:09,480 --> 00:32:11,960 Speaker 2: to the next safe like port of call. And so 701 00:32:12,240 --> 00:32:14,520 Speaker 2: like this idea of like I need a new lifeboat 702 00:32:14,560 --> 00:32:16,240 Speaker 2: because Twitter might not be there anymore. I think it 703 00:32:16,280 --> 00:32:19,880 Speaker 2: was like a very revealing word to describe how a 704 00:32:19,920 --> 00:32:22,160 Speaker 2: lot of these star journalists view the platform. 705 00:32:22,280 --> 00:32:24,080 Speaker 4: Yeah, I mean you need an audience, and like, I 706 00:32:24,080 --> 00:32:26,160 Speaker 4: mean Substack is now my lifeboat and I have 707 00:32:26,200 --> 00:32:27,880 Speaker 4: a port.
I mean one thing that made Substack very 708 00:32:27,920 --> 00:32:29,760 Speaker 4: attractive for a lot of journalists is they allow 709 00:32:29,760 --> 00:32:32,120 Speaker 4: you to port your email lists, so you've got, you know, 710 00:32:32,160 --> 00:32:33,760 Speaker 4: you lose your Twitter account as I have, but I 711 00:32:33,800 --> 00:32:35,960 Speaker 4: now have forty email lists that I own that I 712 00:32:35,960 --> 00:32:37,280 Speaker 4: can take wherever. That's my lifeboat. 713 00:32:38,480 --> 00:32:41,000 Speaker 2: Max Read, thank you so much for coming on Odd Lots. 714 00:32:41,120 --> 00:32:41,600 Speaker 2: That was awesome. 715 00:32:41,640 --> 00:32:42,520 Speaker 4: Thank you so much for having me. 716 00:32:42,560 --> 00:32:57,200 Speaker 2: Had a great time. Tracy, I love talking to him. 717 00:32:57,240 --> 00:32:59,000 Speaker 2: I like, you know, it's always fun every once in 718 00:32:59,000 --> 00:33:01,640 Speaker 3: a while, basically just gossiping about our own industry. 719 00:33:01,760 --> 00:33:03,320 Speaker 2: You know, I don't like to indulge in it too much. 720 00:33:03,360 --> 00:33:06,760 Speaker 2: I actually find a lot of media navel gazing to 721 00:33:06,760 --> 00:33:10,360 Speaker 2: be sort of boring, to be honest. But sometimes it's 722 00:33:10,400 --> 00:33:12,600 Speaker 2: sort of fun to look back at how things have 723 00:33:12,680 --> 00:33:14,280 Speaker 2: changed in the last, I don't know, fifteen years. 724 00:33:14,320 --> 00:33:15,920 Speaker 3: You know, I got a really good idea from that 725 00:33:16,720 --> 00:33:19,640 Speaker 3: that won't be useful in my personal life, but maybe 726 00:33:19,640 --> 00:33:22,280 Speaker 3: it will be useful to others.
I think when you 727 00:33:22,320 --> 00:33:23,920 Speaker 3: go on a date with someone, when you go on 728 00:33:23,960 --> 00:33:27,360 Speaker 3: a first date, people should show their For You page 729 00:33:27,760 --> 00:33:30,800 Speaker 3: on various social media platforms, and you would instantly get 730 00:33:30,840 --> 00:33:33,000 Speaker 3: a sense of who that person is, or at least 731 00:33:33,240 --> 00:33:34,600 Speaker 3: who the algorithm thinks they are. 732 00:33:34,640 --> 00:33:37,280 Speaker 2: Could you do a For You page based dating app 733 00:33:37,360 --> 00:33:39,440 Speaker 2: where it's just, all you see is the person's For 734 00:33:39,480 --> 00:33:42,160 Speaker 2: You page? I like this. That would be a 735 00:33:42,160 --> 00:33:43,360 Speaker 2: good one. Should we make that startup? 736 00:33:43,440 --> 00:33:43,640 Speaker 3: Yeah? 737 00:33:43,680 --> 00:33:45,240 Speaker 2: I think that's a good idea. You know what, I 738 00:33:45,280 --> 00:33:48,600 Speaker 2: really liked the word carcinization. Oh yeah. And I think 739 00:33:48,640 --> 00:33:51,560 Speaker 2: it's a really powerful idea, the idea that like all 740 00:33:51,720 --> 00:33:54,880 Speaker 2: apps trend towards looking and functioning the same way, and 741 00:33:54,920 --> 00:33:57,440 Speaker 2: currently it's all towards TikTok. The other thing, which we 742 00:33:57,480 --> 00:34:00,920 Speaker 2: didn't get into, but I think is also an example 743 00:34:00,960 --> 00:34:04,400 Speaker 2: of carcinization too, is like everything sort of being 744 00:34:04,440 --> 00:34:07,800 Speaker 2: either a TikTok or a chatbot, like text. Like I've 745 00:34:07,840 --> 00:34:11,080 Speaker 2: like opened windows, like I'll open an app and it 746 00:34:11,200 --> 00:34:13,239 Speaker 2: will take me a second to realize, wait, where am 747 00:34:13,280 --> 00:34:16,359 Speaker 2: I right now? Am I in ChatGPT 748 00:34:16,400 --> 00:34:19,880 Speaker 2: right now?
Like this sort of dominant mode 749 00:34:19,880 --> 00:34:22,600 Speaker 2: of interfacing with the Internet that's all like chat related. 750 00:34:22,760 --> 00:34:25,240 Speaker 3: Yeah, I guess that's true. I mean, it is true 751 00:34:25,800 --> 00:34:28,800 Speaker 3: on this point that even Reddit, which like I started 752 00:34:28,800 --> 00:34:32,759 Speaker 3: out fawning over, is starting to like experiment with more 753 00:34:32,840 --> 00:34:37,160 Speaker 3: algo pushed content. So it will suggest subreddits for you now, 754 00:34:37,239 --> 00:34:40,719 Speaker 3: but luckily you can still ignore them for the time being. 755 00:34:40,760 --> 00:34:45,000 Speaker 3: But again, like as pressure to generate revenue actually ramps up, 756 00:34:45,400 --> 00:34:48,319 Speaker 3: it seems like the direction of travel is all roads lead 757 00:34:48,400 --> 00:34:49,000 Speaker 3: to TikTok. 758 00:34:49,120 --> 00:34:52,640 Speaker 2: Basically, yeah. Spotify too, and like you know, there's like 759 00:34:52,680 --> 00:34:55,480 Speaker 2: stories on Spotify. It looks very similar. You get pushed 760 00:34:55,520 --> 00:34:58,560 Speaker 2: all this different stuff, people that you're not listening to. 761 00:34:59,360 --> 00:35:02,200 Speaker 2: I think the answer is to just subscribe to Odd 762 00:35:02,280 --> 00:35:04,280 Speaker 2: Lots and just hang out in the Odd Lots Discord 763 00:35:04,640 --> 00:35:06,720 Speaker 2: and then just not worry about any of this other stuff. 764 00:35:06,719 --> 00:35:09,040 Speaker 3: That's right. It's a safe space and you can still 765 00:35:09,120 --> 00:35:11,839 Speaker 3: curate what you want to talk about. All right, 766 00:35:11,880 --> 00:35:12,600 Speaker 3: shall we leave it there? 767 00:35:12,680 --> 00:35:13,439 Speaker 2: Let's leave it there. 768 00:35:13,719 --> 00:35:16,640 Speaker 3: This has been another episode of the Odd Lots podcast.
I'm 769 00:35:16,680 --> 00:35:19,719 Speaker 3: Tracy Alloway. You can follow me at Tracy Alloway. 770 00:35:19,520 --> 00:35:22,239 Speaker 2: And I'm Joe Wisenthal. You can follow me at The Stalwart. 771 00:35:22,560 --> 00:35:25,600 Speaker 2: Check out Max Read's Substack. It's called Read Max. Find that 772 00:35:25,680 --> 00:35:29,759 Speaker 2: at maxread dot substack dot com. Follow our producers Carmen Rodriguez 773 00:35:29,800 --> 00:35:32,960 Speaker 2: at Carman Arman, Dashiell Bennett at Dashbot, and Kel Brooks 774 00:35:32,960 --> 00:35:36,279 Speaker 2: at Kelbrooks. Thank you to our producer Moses Ondam. For more 775 00:35:36,320 --> 00:35:39,360 Speaker 2: Odd Lots content, go to Bloomberg dot com slash odd lots. We 776 00:35:39,440 --> 00:35:42,520 Speaker 2: have transcripts, a blog, and a daily newsletter that you 777 00:35:42,560 --> 00:35:45,480 Speaker 2: can subscribe to and get in your inbox, and you 778 00:35:45,520 --> 00:35:48,399 Speaker 2: can chat twenty four seven in the nicest, most 779 00:35:48,440 --> 00:35:52,120 Speaker 2: moderated environment. You can find our Discord at Discord dot gg 780 00:35:52,239 --> 00:35:53,799 Speaker 2: slash odd lots. 781 00:35:53,320 --> 00:35:55,879 Speaker 3: And if you enjoy Odd Lots, if you like it when 782 00:35:55,920 --> 00:35:59,560 Speaker 3: we occasionally indulge ourselves by talking about the media, then 783 00:35:59,640 --> 00:36:03,240 Speaker 3: please leave us a positive review on your favorite podcast platform. 784 00:36:03,680 --> 00:36:06,520 Speaker 3: And remember, if you are a Bloomberg subscriber, you can 785 00:36:06,560 --> 00:36:09,560 Speaker 3: listen to all of our episodes absolutely ad free. You 786 00:36:09,600 --> 00:36:12,000 Speaker 3: will also get that daily newsletter. 787 00:36:12,400 --> 00:36:13,239 Speaker 1: All you have to do
788 00:36:13,239 --> 00:36:16,279 Speaker 3: To listen to ad free episodes is find the Bloomberg 789 00:36:16,360 --> 00:36:20,280 Speaker 3: channel on Apple Podcasts and follow the instructions there. Thanks 790 00:36:20,280 --> 00:36:20,840 Speaker 3: for listening.