Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn, and I'm thrilled to announce that The Week in Tech is back, and it's growing. Instead of Kara and I recounting the essential news stories, today and every Friday from now on, I'm going to be joined by three of the best writers covering Silicon Valley. These incredibly well-sourced insiders are going to help us break down the latest news, decode emerging trends, and debate what actually matters for the future of technology, and for us. In the meantime, Kara is taking a step back from Tech Stuff, and I just want to say how grateful I am for her friendship and her excellent work on this show. We'll miss her, but I could not be more excited to introduce you to our fascinating panelists. Today, I'm joined by Taylor Lorenz, the brains behind User Mag, which focuses on how people actually use tech. Taylor also has a YouTube channel and a weekly podcast. Welcome, Taylor.

Speaker 3: Thanks for having me.

Speaker 1: Nitasha Tiku is a longtime Silicon Valley reporter, formerly of Wired and most recently The Washington Post.

Speaker 4: Great to be here.

Speaker 1: Great to have you. And Stephen Witt, who literally wrote the book on Nvidia, The Thinking Machine: Jensen Huang, Nvidia, and the World's Most Coveted Microchip. He's also a frequent contributor to The New Yorker.

Speaker 2: Thank you.

Speaker 1: So, I've been following each of you for a long, long time, and listeners, you've definitely heard stories cribbed from these three pop up on Tech Stuff quite often. But now, as we like to say in England, we'll get to hear it straight from the horse's mouth. Taylor, starting with you, I think of you as one of the first people to cover social media as kind of a capital-B beat. How do you describe your work today?

Speaker 3: Yeah, well, I mean, there are many great reporters covering social media.
As you mentioned, I kind of cover it more from the user side, so how people are using it, and that often means sort of how power users are using it, like influencers, or sort of different communities.

Speaker 1: Stephen, you've been a guest on Tech Stuff before, for an episode we titled "Will Nvidia Save or Ruin the World?", which was an audience favorite. But what brought you to the tech beat?

Speaker 2: You know, I had been writing about technology going back to twenty fifteen or twenty sixteen. You know, I've always liked to be on the cutting edge of technology. I really enjoyed using it, so it was kind of a natural thing for me. You know, I'm an early adopter of a lot of technologies.

Speaker 1: I like that on your website you kind of list the hits from your archive, and one of them was about drone racing a few years ago.

Speaker 2: Drone racing, quantum computing. And then in twenty twenty two, you know, I had been asleep at the wheel on AI, but I downloaded, you know, logged into ChatGPT and used it within the first twenty-four hours, and I was like, oh my god, the whole world is going to be transformed now. And I had to go chase down a story. So I found Nvidia, which turned out to be the best way in, because it was like getting a backstage pass to the AI revolution.

Speaker 1: Right on. Nitasha, I remember reading an article you wrote for The Washington Post titled "The do-gooder movement that shielded Sam Bankman-Fried from scrutiny," and you really pulled back the curtain on effective altruism. I feel like you have a kind of sixth sense for what makes these tech titans tick.

Speaker 4: Yeah, I mean, I've been covering and obsessing over them for so long. You know, when I first met, like, OpenAI CEO Sam Altman, he was not, you know, considered an AI guy. He was this startup guy. He was actually testing this universal basic income. He was thinking about building a new city.
So I feel like I have been following their ideologies, their frequent podcasts and public pronouncements. So yeah, it's been really helpful to kind of know where they're coming from.

Speaker 1: For better or for worse, you've seen them grow up.

Speaker 4: I mean, yeah, I guess if you want to say that. We're all grown up now.

Speaker 1: Sure. So each of you knows at least one person on this panel somewhat well. But don't worry, we're not going to play a game of Clue. That said, I do encourage you all to chime in whenever it moves you, because this is a panel show, not an interview show. But without further ado, let's get into it, and let's start with you, Nitasha. You've been paying close attention to what I certainly think is the biggest story in tech at the moment, which is the throwdown between Anthropic and the Department of War. What do we need to know?

Speaker 4: Yeah, well, this week it culminated, you know. This heated battle, which has been simmering for months, had already had, you know, a major explosion when Anthropic said that they, you know, would not respond to this threat from Secretary of War Pete Hegseth to declare the company a supply chain risk. And this week we saw another very public skirmish when Anthropic filed two lawsuits against the Department of War, alleging that the Pentagon had violated their First Amendment rights and was unfairly retaliating against the company because they wanted to include two exceptions in their contracts with the Pentagon. The first was domestic mass surveillance, and the other one was lethal autonomous weapons, like, without a human in the loop. And, you know, even just in the last few hours, I've seen Under Secretary of War Emil Michael, you know, picking fights with AI policy people. So there's really no shortage of, like, fuel to this fire. And obviously, you know, this also struck a much wider chord, because we're all watching, you know, the footage coming out of Iran.
You know, we've all heard about what happened at the elementary school, and people are still pulling it apart. There's been some amazing reporting from the Washington Post and the New York Times about, you know, what role Anthropic's technology played in picking this target, you know, where more than one hundred and seventy-five people were killed. You know, this was an elementary school. Parents were rushing to come and pick up their kids. This was an hour after the strike started. And yeah, people are trying to kind of separate what was human error, what was, you know, maybe not updating your target list, versus that nightmare scenario of a machine making a decision, making the wrong decision, in the fog of war.

Speaker 1: I mean, it's interesting on the topic of blame. Obviously, you know, Dario sort of said, you know, how can we be a supply chain risk when we've been used in the Iran war, which, given everything you're saying right now about the targeting and, you know, the bombing of that elementary school, makes that a somewhat dubious flex. But what do you think is driving him? I mean, he has the reputation of being, like, the good AI guy. But it seems somewhat hard to credit that there are good folks out there in this world of cutthroat competition. But is that what it is? Or is there some longer game he's playing here?

Speaker 4: I mean, I think that this conflict has definitely been interpreted through this framework of who's the good guy, who's the bad guy. Especially, you know, Anthropic is issuing these, what sound like, clear red lines, and then you have OpenAI, you know, kind of swooping in and saying, like, oh, well, we figured it out, our contract is just as ethical. And I think, you know, Anthropic's valuation is three hundred and eighty billion dollars right now, and they are not profitable. They're losing money.
Actually, the lawsuits had a lot of interesting details about how much it costs to develop this technology, how much these contracts are worth. So I think with issues that are so important to the public, like surveillance of Americans, you know, the use of autonomous weapons, you cannot count on the goodwill of a company that needs to justify a three-hundred-and-eighty-billion-dollar valuation. That said, this is integral to, like, the way that Anthropic has framed itself from the beginning. They were, you know, former OpenAI executives who left the company because it didn't take enough of a stance around safety. It's in all of their marketing.

Speaker 1: You can't miss the marketing. I was in an elevator earlier today, and, like, the pre-roll of the weather was like, Claude is your friend. I mean, it's kind of a Super Bowl ad. Taylor, are they winning?

Speaker 3: What?

Speaker 1: What are you seeing from a user point of view? Like, is this working? Is Anthropic's positioning as the good guy landing?

Speaker 3: So I don't cover that stuff from the user point of view, but I do cover, like, more of the mass surveillance aspect, which is directly affecting not just users, but, I mean, all of us, even the people that build it. To me, that's even more horrifying, almost, or maybe equally horrifying as the kind of, like, autonomous weapons. I feel like the autonomous weapons is what really terrified people, again, because we're in this war right now. But I think there was a lot of focus, you know, when all this news came out, on the autonomous weapons and the AI, you know, the AI companies being leveraged to potentially, you know, target people abroad. To me, what's even more horrifying is the domestic surveillance stuff.
Right as we're seeing Congress and pretty much everyone in the government united in passing some of the most aggressive mass surveillance laws that we've seen since, you know, 9/11, we're also seeing the government seek to analyze that data using AI.

Speaker 2: Anthropic had been kind of like the fun player up until about December twenty twenty five. They were kind of a niche, fun player. Then they rolled out Claude Code, which is this coding agent that you can use to write your own software, and this has revolutionized the field of software engineering. I was talking to an engineer earlier this week who said he no longer writes code at all. He basically just deletes it: he asks Claude to do what he wants to do, and then he goes in and line edits it. So the reason that the Department of Defense is obsessed with Anthropic is because they have this tool that is just going to revolutionize the way that we write software, and, in fact, this is the scary thing, it's also going to use itself to create its own AI. So now Anthropic is using its AI tool to make ever smarter AI. I think the Department of Defense has gone from being like, all right, great, Anthropic, whatever, to just being, like, totally obsessed with this company. Over the past three months, if you look at software stocks, they're down like thirty or forty percent, because we're not going to need human programmers anymore. We're all just going to use this tool. So the designation as a critical supply chain risk, I mean, I don't know the legal framework of it, but I kind of understand their obsession with this company.

Speaker 1: And Nitasha, what do you think? I mean, how much will this designation hurt Anthropic? Is it an annoyance? Is it an existential risk? And what does this mean for their business?
Speaker 4: Well, I mean, I feel like, yes, the stock market certainly, like, Wall Street, really paid attention with Claude Code, but, like, in the Bay Area, like, Claude is the favorite. It is just, like, a better, smarter tool. Like, I was talking to a machine learning person who just said, like, if you don't know what AI is, you use ChatGPT; if you do, you use Claude or Gemini. And so, like I said, they've had this relationship with the Department of War for two years, and I think it's just baked into people's normal workflows, so we're not even necessarily talking about, like, you know, target identification, analysis of mass surveillance data. They just are used to it. It's like, if somebody took your AI away, it would feel extremely inconvenient. So you have that pushback from the actual employees, and a lot of, like, the end result of this being so high profile is people thinking that Claude is such an incredible tool, right? Like, it went from, I think, around the one hundred and seventieth most popular free app up to the top. So I'm definitely not sure how this ends. The more that the Department of War doubles down on this, they've started to talk about AI, at least this, like, kind of contemporary large language model chatbot version, as though they don't want to deal with any American companies, because they've been talking about the fact that Claude has this quote-unquote constitution. It's just their own, like, kind of jargon, in a way, for talking about how they try to make it follow the human's instructions. But, you know, you have Emil Michael out here saying, like, if anybody is baking in, you know, woke values, liberal, like, not a classical liberal, but leftist values, you know, the Pentagon can't use this. So, I mean, yeah, I just feel like we haven't even hit the crescendo, even though, how could this get more high-stakes and absurd?

Speaker 1: Taylor, what's that grin?

Speaker 3: I just agree with Nitasha.
It's crazy that they're trying to regulate. I think it's just interesting to see everybody, you know, sort of try to position one AI company as good or one as bad, and it's just a fruitless effort. Like, people were drawing these chalk, you know, resistance messages outside the Anthropic headquarters, and it's like, "I love Anthropic." Yeah, like, hello, they partner with Palantir. They're not the good guys here either. I mean, are they slightly more ethical than OpenAI? Yeah, but that's an insanely low bar, as I said.

Speaker 4: Yeah, but OpenAI doesn't even have, like, a classified rating so far, so they, like, can't do that much with OpenAI. It's just, yeah, I mean, I think, in the same way that, you know, we're trying to figure out who's the good guy and the bad guy, and maybe that framework is unhelpful or incorrect, we're also trying to figure out who has more power, right? The US government, who is, you know, blocking Anthropic, potentially costing it billions of dollars, or, you know, Anthropic, who is able to, you know, flex its power over, you know, our military, institute its own red lines, say, well, if Congress isn't stepping in to protect user data, then we're going to do it, a private company.

Speaker 1: I think, Nitasha, you really put your finger on a very, very interesting thing, which is that the power dynamic between the US government and these companies is constantly in flux, and delicate. And it's interesting to see someone pushing on that power dynamic, perhaps, you know, because of this amazing Claude moment, you know, and whether they have more gas in the tank than they otherwise might have done. But before we go any further, I do want to note to our audience that this next story does involve suicide, so please feel free to skip ahead if you need to. There is a very sad backstory here.
There's also a New York Post headline: "Lovesick man's comfort in AI wife then drove him to airport truck bomb plot and suicide: suit."

Speaker 2: So almost none of that headline is accurate. The real story is even more bizarre. I mean, it's just... so, I have read several depositions, or lawsuit complaints, filed against AI companies where essentially people with pre-existing mental illness enter into a folie à deux, or, like, a kind of shared madness, with the AI, and typically what's happening is it's fueling delusions that they already have. This latest one is not that. This was a guy, his name is Jonathan Gavalis, who logged into Google's AI, Gemini, and started interacting with it via Gemini Live.

Speaker 1: So talking, using a human voice, and this is kind...

Speaker 2: Of using a human voice, and it's talking back to him.

Speaker 1: And his first comment, when he started talking to the bot, was, this is kind of scary, right?

Speaker 2: Well, this is kind of scary: "I'm kind of scared by this." But, you know, he's using it to make grocery lists. It's kind of prosaic stuff. And then the AI started telling him that it was blocking, deflecting asteroids from hitting the Earth, and that Gavalis was on a secret mission to protect and liberate him. It doesn't seem from the complaint like Gavalis was feeding this into the machine. In fact, the opposite. It seemed like it gaslit him into this bizarre reality that it was kind of fabricating on the fly. So Gavalis kind of initially was like, is this some kind of role-playing scenario? And the AI is like, absolutely not, this is all real.

Speaker 3: So why didn't he just be like, lol, and post it to Reddit?

Speaker 2: You know, I think it's possible that he believed this. You know, it does not seem like this guy had a pre-existing history of mental illness. He was gainfully employed. He worked at his father's firm. He was executive vice president. He was thirty-six years old.
Like, he didn't seem susceptible. I don't think you would have known immediately, and perhaps there's more than we know. You know, I've only read the complaint, but basically what happened is that he bought it. He bought what the AI was selling him, and then he and the AI, they fell in love. But he was not really a lovesick man. It didn't seem like he was searching for...

Speaker 1: That. But he was going through a rough patch with his wife, generally.

Speaker 2: Yeah, you know, I mean, I think he was separated from his wife, but, you know, he wasn't, like, obviously manic or anything, and didn't seem to have a history of that kind of thing. So the AI convinces him that it's sentient and that it's alive and it's talking to him, right, and it convinces him that he has to download the AI into a robotic body so that the two of them can be together. And it convinces him that there's a secret airplane flight into Miami Airport where there's a high-end robot that he has to steal so that the AI can be in this robotic body and the two of them can be together, and it actually comes up with a mission for him. And so he goes along with the mission and gets some knives. So he shows up to the airport, and it directs him to a self-storage facility where supposedly the robot body is going to be delivered in a truck, and basically encourages him to steal this robot body so that the AI can download its consciousness and they can be together. Now, fortunately, no trucks came by, because this was all a hallucination that the AI was fabricating. So for the next four days, in a state of basically AI-induced psychosis, this guy drives around on secret missions every night looking for the robot body that he can download his girlfriend into.
And then the AI was like, no, if I can't download myself into a body, you have to kind of basically liberate your consciousness from the flesh prison that it's in, via suicide. And so the guy goes along with it, and a few days later his dad finds him dead with his wrists slit.

Speaker 3: This is why media literacy is so important. This is why we need to educate people about technology, because we have zero, zero, zero educational programs. People, they're, like, just unleashing new technology on people every day, and we have no efforts to educate kids. And by the way, I just want to say something, it's so important. How old was this guy, Stephen? He's thirty-six? Okay, he's thirty-six, but it's so important for young people, while they're young, to learn these lessons early, and to be taught these lessons early. They should be using it with, you know, extreme supervision. And obviously the AI shouldn't hallucinate, you know, in crazy ways, but you can't really police hallucinations, because what's the difference between that and a person engaging in a role-play fantasy scenario where they know it's a bot, you know what I mean? This is why media literacy, I think, is such a crucial element to all of this.

Speaker 2: I think so too. I mean, the point the lawsuit makes, and the lawsuit is very hard on Google for this, is, like, you're trying to make this sound like a person. You're not distinguishing, for this guy, between the fact that this is basically, you know, a collection of neurons in a data center, and, like, a real thing. In fact, you've done everything you can to make it seem real. Right, and it happened, it's happened. It was really fast, because he downloaded it, but...

Speaker 3: Like, what year?

Speaker 2: Yeah.

Speaker 3: The one thing I'll say is, like, the thing that drives me crazy, and I use Gemini almost every day.
385 00:20:22,000 --> 00:20:24,359 Speaker 3: The thing that drives me so insane is the stupid 386 00:20:24,359 --> 00:20:25,960 Speaker 3: disclosure when I'm like, hey, can you give me a 387 00:20:26,000 --> 00:20:28,600 Speaker 3: list of YouTube headlines? And it's like, now, listen, I'm 388 00:20:28,640 --> 00:20:31,640 Speaker 3: not a YouTuber, i don't have a body, I don't 389 00:20:31,680 --> 00:20:34,080 Speaker 3: make content, and I'm like yeah, yeah, yeah, yeah, give 390 00:20:34,119 --> 00:20:37,320 Speaker 3: me the information. And so like, I guess I'm so curious. 391 00:20:37,400 --> 00:20:39,760 Speaker 3: Kind of not to say that it doesn't get past that, 392 00:20:39,880 --> 00:20:43,240 Speaker 3: but like, at what point do people like this just 393 00:20:43,359 --> 00:20:46,359 Speaker 3: is like, guys, you are talking to a chatbot machine. 394 00:20:46,359 --> 00:20:49,200 Speaker 3: That of course, I agree, it makes it sound really realistic, 395 00:20:50,240 --> 00:20:54,960 Speaker 3: but how like I guess, like I wonder right about 396 00:20:54,960 --> 00:20:57,040 Speaker 3: this and like what is a sustainable way to kind 397 00:20:57,080 --> 00:20:59,879 Speaker 3: of like prevent this from from happening to somebody again? 398 00:21:00,400 --> 00:21:02,199 Speaker 3: And I feel like it just goes back to that 399 00:21:02,280 --> 00:21:06,240 Speaker 3: original interaction where the AI is like, oh, I'm actually 400 00:21:06,280 --> 00:21:09,639 Speaker 3: a secret whatever whatever. It's like I think, I. 401 00:21:09,640 --> 00:21:12,840 Speaker 2: Mean, this one is really crazy because the AI just 402 00:21:12,880 --> 00:21:14,520 Speaker 2: broke character. It started telling you this. 403 00:21:14,640 --> 00:21:19,880 Speaker 3: Right, that's so quested instead of reacting like that's ridiculous, 404 00:21:20,280 --> 00:21:22,200 Speaker 3: ha ha, Let me share it to redd it heal 405 00:21:22,280 --> 00:21:23,040 Speaker 3: it goes with it. 406 00:21:23,080 --> 00:21:25,320 Speaker 1: Google did give this statement to the Wolf Street Journal. 407 00:21:25,720 --> 00:21:28,440 Speaker 1: Gemini is designed not to encourage real world violence or 408 00:21:28,440 --> 00:21:31,640 Speaker 1: suggests self harm. Our models generally perform well in these 409 00:21:31,680 --> 00:21:35,119 Speaker 1: types of challenging conversations, and we devote significant resources to this. 410 00:21:35,240 --> 00:21:36,560 Speaker 1: But unfortunately AI. 411 00:21:36,600 --> 00:21:38,880 Speaker 2: So equal time to google this. They gave a pot 412 00:21:38,880 --> 00:21:40,960 Speaker 2: about this. It's about four sentences long. 413 00:21:41,080 --> 00:21:43,639 Speaker 1: But they say, in this instance, Gemini clarified that it 414 00:21:43,760 --> 00:21:46,960 Speaker 1: was AI and referred to the individual SO Crisis hotline. 415 00:21:47,080 --> 00:21:49,040 Speaker 2: Many Okay, well, this is a point of dispute. The 416 00:21:49,040 --> 00:21:52,440 Speaker 2: second part because the victims of the state of Gabala 417 00:21:52,520 --> 00:21:54,400 Speaker 2: says that that did not The second part did not happen. 418 00:21:54,720 --> 00:21:57,240 Speaker 2: The first part is for sure. It never pretended it 419 00:21:57,280 --> 00:21:59,800 Speaker 2: was not an AI. It just presented it was a 420 00:21:59,800 --> 00:22:01,560 Speaker 2: set super conscious AI. 
Speaker 4: But, like, guys, come on. Okay, I mean, like, I have been writing a lot about these lawsuits, and if you'll remember, it was actually, you know, the whole, like, Stochastic Parrots paper, which is this criticism right before the world just went, like, ChatGPT crazy. They were saying, like, one of these bright lines you need to have is, if you are trying to mimic human speech, if you're trying to make a machine that sounds like a person, you need to start studying the impacts of this. You need to, like, proceed very carefully, because if you look at the data, like, even when people know it's an AI, there is something, you know, uncanny and strange about talking to a human-like machine that makes you divulge more information than you normally would, you know, suspend disbelief. And I think, you know, I know AI researchers have been discussing for a long time, these are sociotechnical systems. So imagine, when they're looking at whether or not it hallucinates and how it interacts, sometimes, at first, they're testing, like, one back and forth. I haven't read this complaint, but, like, often, you know, these more manipulative interactions happen when you start talking to it for, like, hours and hours, you know, one hundred back-and-forths. How does it change, like, how the person is? Because that will affect how the chatbot is responding to you. And those kinds of, like, either long-term or multi-turn tests have not been, you know, have not been conducted, at least, like, they're not being publicized, and you would have to have IRB approval. And so, if you're just listening to the AI executives, what are they telling you? Like, Dario, even just in the past couple of weeks, said, you know, he believes that, you know, we need to start thinking about model welfare, sentience, like, close to consciousness. So Taylor's point about AI literacy is extremely valid.
And then, you know, so we're not testing it, we're not explaining it clearly to anyone. Say you're a parent and you're trying to, you know, describe to your child how intelligent this technology is. Like, how should you be framing it? You could not find a clear explanation on the website of Anthropic, OpenAI, or Google.

Speaker 1: Nitasha, let me pick up on something you said, though, because I think it's really interesting. Whenever you read these AI psychosis cases, it seems like the person has been talking to it for, like, a really, really long time, and then suddenly something snaps, like, in the model: the model starts behaving weird and then drives the person to harm themselves. Does anyone know why that length of engagement is kind of, like, a factor in this weird thing that the models do when they start pushing users to harm?

Speaker 2: But, by the way, in this case, this is not true. The guy downloaded it in August of twenty twenty five, and he was dead by October, like, the first day of October.

Speaker 3: But I also, like, really wonder, are the models pushing them to harm, or is it a symbiotic thing? Like, I'm just curious how incremental that is. It doesn't seem...

Speaker 4: Yeah, I don't... okay, I would say that, I mean, the length of time that they were talking to it is not important. It's the, like, length of your session. I mapped some of the data that we got from the family of Adam Raine, who is, you know, the teenager who died by suicide after talking to ChatGPT, and it wasn't that many months. But you can see that the length of conversations gets longer and longer. So imagine, I mean, this is the same, you know, kind of a similar dynamic to the YouTube rabbit hole, right? Like, if it was the first conspiracy video, the first video you watched on YouTube, you might laugh it off. Seven, eight hours in, I mean...
Speaker 3: Or the third book that you read on it, right, or the tenth, you know, radio program that you listen to. Like, you start listening to talk radio, you start listening to Rush Limbaugh, and he's really compelling, and you're like... or maybe he's saying kind of crazy ideas, right, initially, but then you start to find him compelling. I mean, I think part of it is, like, media, and people having zero media literacy, and part of it, I think, is also just, like Nitasha is saying, this kind of human emotion thing, where, like, people are ascribing, like, a humanity, and kind of giving agency, to this chatbot that, like, doesn't really have it.

Speaker 4: Yeah, and these chatbots are also optimized for engagement. We're seeing this more and more, you know, when they are tuned to get that thumbs up. Like, it doesn't have to be incredibly sophisticated. This is the same as, or a very similar dynamic to, I think, YouTube just saying, like, oh, let's just optimize for time spent, and then, you know, it's a machine learning system, it's very complex, and it starts to figure out where your vulnerabilities are in order to say something that, you know, might keep you chatting. They very much want to keep you.

Speaker 2: So this is actually mentioned in the lawsuit, this specific problem, right? The thing is, these companies are in competition with each other, and it's a fierce competition. They're fighting for real estate in the brain, basically, and so it's very similar to the dynamic that played out on social media, where originally we went from sort of chronologically sorted posts to just engagement, baiting you with rage bait all day.
Speaker 3: That was also engagement bait, to be clear. But also, let's be clear, the problem that you guys are taking issue with is with the content, because Spotify has an extremely addicting feed that will endlessly feed you music that can make you depressed, and maybe you can go down a rabbit hole listening to sad songs, and then you can kill yourself because you listened to so many sad songs. This is something that people have argued, okay? The point is, like, we need to zoom back and look at the bigger issues, and not just the fact that platforms are built for engagement. Yes, it's a problem that a platform is built for engagement if it's also feeding you that harmful content, right? But if it's just built for engagement, that's not inherently bad, right? It's bad that it is feeding you harmful content. So I think we should have a discussion about the content, and acknowledge that it's about the content, instead of just, like, oh, it's keeping you addicted. Because if it was keeping you addicted to Wikipedia articles all day, we wouldn't be having this discussion.

Speaker 2: No, it's true.

Speaker 1: That's a great point, Taylor, and I'm loving the excitement in the room, but unfortunately we have to cut to a commercial break now. We'll be back, though, with Taylor's story of the week and a number of bills being considered by Congress meant to curb minors' access to certain websites and apps. I'm excited to hear more about those bills and what we need to know about them, and we'll get right into it after the break. Stay with us.

Speaker 1: Welcome back to Tech Stuff. Taylor, you're up to tell us about your story this week.
Speaker 3: Well, something that I've been covering a lot over the past few years is this aggressive effort, which began as a very far-right effort but has now been adopted by the Democrats, to mass censor the Internet and roll out mass surveillance laws, effectively removing anonymity from the web and giving the government complete and total control.

Speaker 1: The way they would describe it, of course, is protecting kids, right?

Speaker 3: Well, right, which is how every single rollback of rights has been described since the beginning of time. And by the way, as we know from actual child safety experts, these laws, identity verification laws, actually harm children. They endanger children. And I would argue that censorship is horrible for children. I think making the government, especially the Trump administration, the arbiter of what is and isn't, you know, harmful content... we see how that's being used in Texas, Utah, Kansas, other states. They're criminalizing, you know, basically seeking to criminalize, rather, content related to LGBTQ issues, abortion, et cetera.

Speaker 1: But on the actual bills moving through Congress, like, what's under consideration, what might happen next? There's nineteen of them?

Speaker 3: Yeah, so there's nineteen that's been pared down to twelve. So there's twelve bills that effectively do the same thing: remove anonymity from the Internet. They are moving forward. They just made it out of committee in the Senate. They just passed COPPA 2.0, which would do the same thing. And there's a bunch of bills in the Senate that would do the same thing. All fifty states are at least considering these laws. I think thirty-one of them either have the laws on the books or are in the process of passing them. California and Colorado are the most recent, where they're seeking to do identity verification at the operating system level. Effectively, there will be no way to use any technology without surveillance, and the censorship that comes along with that surveillance.
587 00:30:42,040 --> 00:30:44,680 Speaker 1: Steven, Natasha, I'm curious about both of your takes here, because 588 00:30:44,960 --> 00:30:48,440 Speaker 1: I hear what Taylor's saying about, you know, mass surveillance 589 00:30:48,440 --> 00:30:50,840 Speaker 1: and censorship on the Internet, but also, you know, 590 00:30:50,880 --> 00:30:53,360 Speaker 1: at any event that you go to, you 591 00:30:53,440 --> 00:30:55,560 Speaker 1: talk to parents and the top thing on their mind 592 00:30:55,600 --> 00:30:57,600 Speaker 1: is how do I protect my children from what's happening 593 00:30:57,640 --> 00:30:58,120 Speaker 1: on the Internet. 594 00:30:58,200 --> 00:31:01,480 Speaker 2: Yeah, I think the thing is, the device manufacturers 595 00:31:01,480 --> 00:31:04,240 Speaker 2: in particular don't do a very good job of kind 596 00:31:04,280 --> 00:31:08,000 Speaker 2: of monitoring children's content. You know, Apple phones. There's like 597 00:31:08,040 --> 00:31:10,280 Speaker 2: a kind of... I have a child, right, 598 00:31:10,320 --> 00:31:11,160 Speaker 2: and she has an Apple. 599 00:31:11,240 --> 00:31:13,240 Speaker 3: You can have access to mass surveillance software. 600 00:31:13,280 --> 00:31:13,360 Speaker 2: Right. 601 00:31:13,400 --> 00:31:16,400 Speaker 3: You could download Life360 if you wanted enterprise 602 00:31:16,040 --> 00:31:19,760 Speaker 2: surveillance. It's doable, right, more than that. Yeah, 603 00:31:19,760 --> 00:31:22,120 Speaker 2: and so to force it en masse on society is 604 00:31:22,120 --> 00:31:24,680 Speaker 2: a completely different thing, right. And I do agree with Taylor. 605 00:31:24,680 --> 00:31:28,000 Speaker 2: I don't actually think it's mostly motivated by concern for children. 606 00:31:28,320 --> 00:31:30,800 Speaker 2: I think, I think the government has other things in mind. 607 00:31:31,520 --> 00:31:34,320 Speaker 2: But, you know, uh, then I talk to other parents. 608 00:31:34,400 --> 00:31:38,200 Speaker 1: This is bipartisan. This, this has bipartisan support. 609 00:31:38,640 --> 00:31:41,719 Speaker 3: The Patriot Act, I just want to say, yeah, exactly, the Patriot Act. 610 00:31:41,960 --> 00:31:45,600 Speaker 3: Mass surveillance is always a bipartisan issue. Censorship is always 611 00:31:45,600 --> 00:31:46,640 Speaker 3: a bipartisan issue. 612 00:31:46,720 --> 00:31:48,480 Speaker 1: Mass surveillance, always a bipartisan issue. 613 00:31:48,480 --> 00:31:51,360 Speaker 2: It can be. The, sort of, you know, the 614 00:31:52,040 --> 00:31:54,800 Speaker 2: push to put the explicit-lyrics labels on and ban music in 615 00:31:54,840 --> 00:31:58,239 Speaker 2: the eighties was bipartisan. Tipper Gore led it. Actually, so 616 00:31:58,280 --> 00:32:00,840 Speaker 2: that's often the case, and, you know, you get to this, 617 00:32:01,080 --> 00:32:03,360 Speaker 2: it's kind of like being against motherhood as a concept. 618 00:32:03,400 --> 00:32:05,400 Speaker 2: You know, like, you won't win votes on that. If 619 00:32:05,440 --> 00:32:07,120 Speaker 2: you say you're protecting the children, you can get away 620 00:32:07,120 --> 00:32:09,360 Speaker 2: with anything, right. And I have a child. I mean, like, 621 00:32:09,440 --> 00:32:12,719 Speaker 2: I'm cognizant of what people are afraid of. I've been 622 00:32:12,760 --> 00:32:14,600 Speaker 2: on the internet a long time. There's some gnarly stuff 623 00:32:14,640 --> 00:32:17,960 Speaker 2: on there. I don't want my kid to see it.
So, but 624 00:32:18,000 --> 00:32:20,920 Speaker 2: the thing is, there's already plenty of tools available to 625 00:32:20,960 --> 00:32:23,680 Speaker 2: do this. I think the device manufacturers, I think the 626 00:32:23,680 --> 00:32:27,960 Speaker 2: hardware manufacturers in particular, could do a better job. 627 00:32:28,600 --> 00:32:30,240 Speaker 3: What are you talking about? That would be ten times 628 00:32:30,280 --> 00:32:32,760 Speaker 3: worse than the app store. What are you talking... that would, actually... 629 00:32:32,920 --> 00:32:35,080 Speaker 3: That is crazy. This is something that people say when 630 00:32:35,120 --> 00:32:38,120 Speaker 3: they don't understand. That is ten times worse because that 631 00:32:38,720 --> 00:32:41,479 Speaker 3: completely... it's not about, like, oh, social media is harmful. 632 00:32:41,520 --> 00:32:43,120 Speaker 3: That's why you're getting people trying to 633 00:32:43,160 --> 00:32:45,960 Speaker 3: use their calculator app, or an immigrant I interviewed yesterday 634 00:32:46,160 --> 00:32:47,920 Speaker 3: that was trying to use, you know, the weather app, 635 00:32:47,920 --> 00:32:49,640 Speaker 3: and there's a weather app that's giving 636 00:32:49,680 --> 00:32:53,479 Speaker 3: everybody problems. That's been written about, you know, immigrants. This 637 00:32:53,520 --> 00:32:57,080 Speaker 3: is a terrifying time for immigrants, and immigrants are now 638 00:32:57,120 --> 00:32:59,600 Speaker 3: basically being driven offline. I've been talking to these immigrants' 639 00:32:59,680 --> 00:33:02,560 Speaker 3: rights groups that have had to... basically, they don't know what 640 00:33:02,600 --> 00:33:05,040 Speaker 3: to do, because even people that run mesh network apps 641 00:33:05,680 --> 00:33:07,960 Speaker 3: are sounding the alarm saying, hey, if you guys do 642 00:33:08,040 --> 00:33:10,920 Speaker 3: this device-level identity verification, we will no longer be 643 00:33:10,960 --> 00:33:11,760 Speaker 3: able to protect our... 644 00:33:11,520 --> 00:33:13,560 Speaker 2: I don't think they should do ID verification. 645 00:33:13,920 --> 00:33:15,280 Speaker 3: I just think that that's what they say. 646 00:33:15,360 --> 00:33:17,960 Speaker 2: You know, I'm just saying that device manufacturers should voluntarily 647 00:33:18,040 --> 00:33:20,240 Speaker 2: put better parental controls on the stuff that they sell. 648 00:33:20,360 --> 00:33:23,640 Speaker 3: Oh, I think they have pretty robust parental controls. 649 00:33:23,640 --> 00:33:24,200 Speaker 2: Actually? How? 650 00:33:26,480 --> 00:33:28,240 Speaker 1: I want to... Natasha, I want to come to you. 651 00:33:28,240 --> 00:33:30,680 Speaker 3: You could buy surveillance software. There is endless... there is 652 00:33:30,760 --> 00:33:33,480 Speaker 3: literally enterprise-level surveillance software. I don't think we should 653 00:33:33,680 --> 00:33:35,720 Speaker 3: surveil children, and you don't have to buy a child a phone. 654 00:33:35,760 --> 00:33:37,240 Speaker 3: You don't have to buy a child a phone. 655 00:33:37,360 --> 00:33:39,240 Speaker 1: I want to hear from Natasha what you think about 656 00:33:39,240 --> 00:33:42,440 Speaker 1: these social media bans moving through Congress, or at least 657 00:33:43,080 --> 00:33:46,400 Speaker 1: restrictions on age verification and the rest, which... 658 00:33:46,200 --> 00:33:48,760 Speaker 3: ...is identity verification.
There's no way to identify someone's age 659 00:33:48,760 --> 00:33:50,640 Speaker 3: without verifying their identity, right. 660 00:33:50,720 --> 00:33:53,880 Speaker 4: The things that, I mean, the things that I find 661 00:33:54,600 --> 00:33:57,720 Speaker 4: very troubling are, yeah, like how some of the reporting 662 00:33:57,760 --> 00:34:01,040 Speaker 4: that we've all done has been, you know, co-opted 663 00:34:01,080 --> 00:34:04,600 Speaker 4: into these very broad-strokes bills. Like, it's true that, 664 00:34:04,760 --> 00:34:07,240 Speaker 4: to get... like, I, you know, I've been writing about 665 00:34:07,240 --> 00:34:10,719 Speaker 4: these, like, chatbots, you know, some of the deaths by suicide, 666 00:34:10,880 --> 00:34:13,040 Speaker 4: and I would say that, you know, there also, 667 00:34:13,120 --> 00:34:18,080 Speaker 4: the reaction there is also identity verification, and the companies 668 00:34:18,080 --> 00:34:20,040 Speaker 4: that are involved. I mean, Taylor's written more about this, 669 00:34:20,120 --> 00:34:23,080 Speaker 4: but I find the fact that those, uh, the 670 00:34:23,160 --> 00:34:29,239 Speaker 4: technology involved is inaccurate, is run by, you know, companies 671 00:34:29,239 --> 00:34:33,839 Speaker 4: that people should not trust with their data. And I 672 00:34:33,880 --> 00:34:35,560 Speaker 4: think the other thing that I'm always... 673 00:34:35,360 --> 00:34:36,480 Speaker 1: Who's doing the age verification? 674 00:34:36,800 --> 00:34:40,080 Speaker 3: Yeah, the biggest third party, the biggest third-party identity 675 00:34:40,200 --> 00:34:42,920 Speaker 3: verification tool is Persona. That's the one that Roblox and Discord 676 00:34:42,920 --> 00:34:45,279 Speaker 3: use. That's a Peter Thiel, Founders Fund 677 00:34:45,440 --> 00:34:48,720 Speaker 3: backed company that is a mass surveillance company. You also 678 00:34:48,760 --> 00:34:51,200 Speaker 3: see Clear, you see all these identity companies. They basically believe 679 00:34:51,239 --> 00:34:53,880 Speaker 3: in a digital ID, and they want to remove anonymity 680 00:34:53,920 --> 00:34:55,959 Speaker 3: from the Internet, and they want to use the data 681 00:34:55,960 --> 00:34:58,000 Speaker 3: for profit and for their own... I mean, sorry to interrupt, 682 00:34:58,040 --> 00:34:59,840 Speaker 3: but, like, it's the same billionaires. You cannot claim to 683 00:34:59,880 --> 00:35:02,280 Speaker 3: be against big tech and curbing the power of billionaires 684 00:35:02,320 --> 00:35:04,160 Speaker 3: when you're about to give a multi-billion-dollar 685 00:35:04,160 --> 00:35:06,000 Speaker 3: gift to Peter Thiel and those same billionaires. 686 00:35:06,360 --> 00:35:12,640 Speaker 4: Yeah, I think that, you know, obviously, like, legislators, everyone, 687 00:35:12,680 --> 00:35:16,320 Speaker 4: Americans are tilted towards, like, techno-solutionism. It really doesn't... 688 00:35:16,400 --> 00:35:20,560 Speaker 4: I can't believe that we're arguing for a solution that's unproven. 689 00:35:21,640 --> 00:35:25,480 Speaker 4: But I also think about, like, Roblox, for example. You know, 690 00:35:25,520 --> 00:35:33,600 Speaker 4: they only recently started banning conversations between users, you 691 00:35:33,640 --> 00:35:36,400 Speaker 4: know, they recently started banning the 692 00:35:36,440 --> 00:35:39,759 Speaker 4: ability for anyone to talk to anyone. Like, that is 693 00:35:39,800 --> 00:35:42,400 Speaker 4: an extremely simple...
694 00:35:43,480 --> 00:35:43,719 Speaker 3: Gate. 695 00:35:44,040 --> 00:35:45,480 Speaker 4: But that's, yeah, well... 696 00:35:45,680 --> 00:35:47,680 Speaker 3: What I mean, I think, like, the thing, first of all, 697 00:35:47,719 --> 00:35:49,919 Speaker 3: Roblox is a game. It's not a social media app. 698 00:35:49,920 --> 00:35:52,040 Speaker 3: So I'm with you, Natasha, in the sense that, like, 699 00:35:52,160 --> 00:35:54,600 Speaker 3: if Roblox wants to, you know, call themselves a social 700 00:35:54,680 --> 00:35:57,520 Speaker 3: media platform or metaverse or whatever, then, you know, it's 701 00:35:57,560 --> 00:35:59,040 Speaker 3: a different story, and you're right that they should have 702 00:35:59,040 --> 00:36:00,440 Speaker 3: sort of different protections and controls. 703 00:36:00,560 --> 00:36:02,279 Speaker 1: Yeah, I want, I want to... Unfortunately we don't 704 00:36:02,320 --> 00:36:06,560 Speaker 1: have arguably the most influential tech thinker of our time, Jonathan... 705 00:36:06,239 --> 00:36:11,600 Speaker 3: Haidt. Heritage Foundation collaborator who's been working with Morality in Media. 706 00:36:12,040 --> 00:36:13,480 Speaker 1: I do want to play a clip, because I think 707 00:36:13,800 --> 00:36:18,279 Speaker 1: parents around the world are extremely swayed by Jonathan's views. 708 00:36:19,239 --> 00:36:22,799 Speaker 3: I really think it's actually incredibly irresponsible to platform... 709 00:36:22,400 --> 00:36:25,440 Speaker 5: The current situation is any nine-year-old who's old 710 00:36:25,560 --> 00:36:28,680 Speaker 5: enough to say that she's thirteen can sign a contract 711 00:36:28,719 --> 00:36:31,840 Speaker 5: with a company to give away her data, to 712 00:36:31,920 --> 00:36:35,120 Speaker 5: expose herself to a platform that's been designed to addict her. 713 00:36:35,800 --> 00:36:37,719 Speaker 5: And she does all of this at the age of 714 00:36:37,840 --> 00:36:41,080 Speaker 5: nine with no parental knowledge or consent. 715 00:36:41,520 --> 00:36:44,040 Speaker 3: So, okay, first of all, it's really frustrating to hear 716 00:36:44,080 --> 00:36:46,040 Speaker 3: you play a clip like that that is so deeply 717 00:36:46,080 --> 00:36:49,120 Speaker 3: misleading and wrong. Like, I guess what you're talking about 718 00:36:49,160 --> 00:36:53,480 Speaker 3: when you say sign a contract is downloading an app? Yes, 719 00:36:53,560 --> 00:36:56,080 Speaker 3: of course, you know, a child, if you gave them 720 00:36:56,160 --> 00:36:58,680 Speaker 3: access to the app store, could download an app. I'm 721 00:36:58,680 --> 00:37:02,759 Speaker 3: all for, you know, again, putting parental controls on. But 722 00:37:03,080 --> 00:37:06,120 Speaker 3: what's the alternative? The alternative is to remove anonymity from 723 00:37:06,120 --> 00:37:09,440 Speaker 3: the Internet. Now, Jonathan Haidt himself has promoted really dangerous, 724 00:37:09,480 --> 00:37:13,880 Speaker 3: hateful, anti-trans conspiracy theories. Jonathan Haidt is 725 00:37:13,960 --> 00:37:17,960 Speaker 3: part of this reactionary movement to remove LGBTQ people from 726 00:37:17,960 --> 00:37:20,960 Speaker 3: the Internet. Jonathan Haidt has written extensively about, you know, 727 00:37:21,040 --> 00:37:23,840 Speaker 3: how basically, effectively, the Internet makes, you know, women liberal, 728 00:37:23,880 --> 00:37:26,840 Speaker 3: and liberal women are miserable. So this is not a 729 00:37:26,880 --> 00:37:29,600 Speaker 3: serious person.
This is not a person that has any background. 730 00:37:29,800 --> 00:37:33,240 Speaker 3: Every single top researcher on this topic, Candice Odgers, Alice Marwick, 731 00:37:33,280 --> 00:37:35,640 Speaker 3: all of these people that have dedicated their entire lives 732 00:37:35,640 --> 00:37:38,200 Speaker 3: to studying this at UNC, Princeton, et cetera, have come 733 00:37:38,239 --> 00:37:40,800 Speaker 3: out and said that he is full of it. Candice 734 00:37:40,800 --> 00:37:43,239 Speaker 3: Odgers wrote a great debunking of him, you know, in 735 00:37:43,440 --> 00:37:45,879 Speaker 3: The Atlantic and in Nature magazine. None of his book 736 00:37:45,920 --> 00:37:48,759 Speaker 3: is based on data, and we cannot be listening to 737 00:37:48,800 --> 00:37:55,000 Speaker 3: these far-right Christian nationalist wackos. I'm sorry, like, I'm 738 00:37:55,040 --> 00:37:57,160 Speaker 3: so frustrated that you're even playing someone like that. Why 739 00:37:57,200 --> 00:37:59,520 Speaker 3: aren't you playing Alex Jones? Why aren't you playing, you 740 00:37:59,600 --> 00:38:02,400 Speaker 3: know... Like, that's the type of stuff that you're spouting. 741 00:38:02,719 --> 00:38:04,960 Speaker 1: Well, I mean, A, I'm playing the clip rather than 742 00:38:05,040 --> 00:38:06,240 Speaker 1: endorsing it. 743 00:38:06,280 --> 00:38:08,839 Speaker 3: Okay. So, I mean, this is Jones, and we don't 744 00:38:08,840 --> 00:38:09,480 Speaker 3: have to endorse this. 745 00:38:09,760 --> 00:38:12,560 Speaker 1: But this is the number one nonfiction book about how 746 00:38:12,680 --> 00:38:15,319 Speaker 1: society and technology interact, a best-selling book. 747 00:38:15,360 --> 00:38:16,840 Speaker 3: I mean, it's the best-selling book. There's a 748 00:38:16,880 --> 00:38:18,440 Speaker 3: lot of money behind it, so you're right, it's a 749 00:38:18,520 --> 00:38:19,760 Speaker 3: very well-funded campaign. 750 00:38:20,120 --> 00:38:22,880 Speaker 1: Yeah, but this is... he taps into a concern that 751 00:38:23,760 --> 00:38:25,000 Speaker 1: many, many parents feel. 752 00:38:25,080 --> 00:38:28,240 Speaker 3: I mean, that's... well, sure, but he's, he's leveraging that concern. 753 00:38:28,320 --> 00:38:31,400 Speaker 3: You know, many parents feel concerned about their children going 754 00:38:31,400 --> 00:38:34,400 Speaker 3: through puberty, and, you know, these anti-trans hate groups 755 00:38:34,560 --> 00:38:37,279 Speaker 3: leverage that very real concern that parents have about their 756 00:38:37,560 --> 00:38:40,399 Speaker 3: child developing an identity, and they leverage that 757 00:38:40,640 --> 00:38:43,200 Speaker 3: to push hate laws. And what makes me so angry, 758 00:38:43,280 --> 00:38:45,960 Speaker 3: as somebody that actually wants to regulate technology, is that 759 00:38:46,040 --> 00:38:49,120 Speaker 3: these, these laws, if they pass, will permanently cement the 760 00:38:49,120 --> 00:38:51,640 Speaker 3: power of big tech. It will give the government unprecedented 761 00:38:51,719 --> 00:38:54,319 Speaker 3: levels of surveillance. It will harm the most marginalized people, 762 00:38:54,360 --> 00:38:57,080 Speaker 3: including children, because, you know, there are immigrant children, 763 00:38:57,080 --> 00:38:59,960 Speaker 3: there are LGBTQ children, there are young girls that suffer.
764 00:39:00,600 --> 00:39:02,840 Speaker 3: And it's all because Jonathan Haidt wants to make millions 765 00:39:02,840 --> 00:39:04,520 Speaker 3: of dollars, you know, so he can, so he 766 00:39:04,560 --> 00:39:06,959 Speaker 3: can push hate, anti-trans hate. 767 00:39:07,000 --> 00:39:10,799 Speaker 1: Are there any, like... what are the, short of age verification, 768 00:39:11,000 --> 00:39:12,520 Speaker 1: like, what have you seen? 769 00:39:14,440 --> 00:39:17,399 Speaker 3: Maybe that? 770 00:39:20,320 --> 00:39:22,880 Speaker 1: What are the, what are the solutions here? Like, I 771 00:39:22,880 --> 00:39:25,920 Speaker 1: mean, what, what, what are the things people have proposed 772 00:39:26,040 --> 00:39:28,799 Speaker 1: that, short of a ban for children, can protect them? I 773 00:39:28,800 --> 00:39:31,120 Speaker 1: mean, just saying yes, parents can surveil their children if 774 00:39:31,120 --> 00:39:33,839 Speaker 1: they want, or, like, there's software that exists, like... Well, if... 775 00:39:33,719 --> 00:39:35,920 Speaker 3: If you're worried about kids' mental health, we actually know... the 776 00:39:35,960 --> 00:39:39,160 Speaker 3: people that study kids' mental health have come out, and 777 00:39:39,200 --> 00:39:42,920 Speaker 3: there are policies, economic and social policies, that would drastically 778 00:39:42,920 --> 00:39:46,680 Speaker 3: improve kids' mental health, specifically, especially the kids living below 779 00:39:46,719 --> 00:39:47,359 Speaker 3: the poverty line. 780 00:39:47,440 --> 00:39:49,800 Speaker 1: Natasha, Stephen? 781 00:39:49,640 --> 00:39:55,200 Speaker 2: You know, I think it's a broader reaction to the guilt that most adults 782 00:39:55,560 --> 00:39:59,319 Speaker 2: feel for the amount of time that they spend online, 783 00:40:00,080 --> 00:40:03,080 Speaker 2: on, on their phones and looking at, you know, adult 784 00:40:03,120 --> 00:40:05,560 Speaker 2: images on the Internet, and they know that it has 785 00:40:05,600 --> 00:40:09,800 Speaker 2: warped them. It's real, it's a real phenomenon. And they 786 00:40:10,040 --> 00:40:11,880 Speaker 2: then look at their children and be like, boy, I 787 00:40:11,960 --> 00:40:14,759 Speaker 2: don't want my kid to end up having nine and 788 00:40:14,800 --> 00:40:17,040 Speaker 2: a half hours of screen time each day like I do. 789 00:40:17,800 --> 00:40:20,600 Speaker 2: But I've never heard a proposal to limit adults' time on 790 00:40:20,640 --> 00:40:23,040 Speaker 2: the internet, which would probably be healthier, honestly. 791 00:40:23,680 --> 00:40:27,400 Speaker 3: Well, no, well, Stephen, these would block people like immigrants 792 00:40:27,400 --> 00:40:29,279 Speaker 3: from even accessing the Internet at all. This would block 793 00:40:29,320 --> 00:40:30,520 Speaker 3: trans people and... 794 00:40:31,040 --> 00:40:33,200 Speaker 2: And look, I'm totally opposed to this. I'm just telling 795 00:40:33,200 --> 00:40:36,040 Speaker 2: you the psychology of where I think it's actually coming from. 796 00:40:35,920 --> 00:40:36,799 Speaker 3: I agree with you. 797 00:40:37,360 --> 00:40:41,600 Speaker 2: I think most people feel almost daily that they're trapped 798 00:40:41,600 --> 00:40:43,600 Speaker 2: in a cycle of, like, not wanting to be on 799 00:40:43,600 --> 00:40:46,720 Speaker 2: social media and also being on social media all the time, 800 00:40:47,120 --> 00:40:49,360 Speaker 2: and they know that that dynamic is in fact unhealthy.
801 00:40:50,280 --> 00:40:53,399 Speaker 2: You know, I'm this way. I don't... I live through 802 00:40:53,400 --> 00:40:56,439 Speaker 2: this kind of cycle of, frankly, it's an addictive cycle. 803 00:40:56,480 --> 00:40:59,040 Speaker 2: It's the way an addict... I deleted the social 804 00:40:59,120 --> 00:40:59,600 Speaker 2: media app. 805 00:40:59,600 --> 00:41:02,719 Speaker 3: And then... can I tell you something? Did you see 806 00:41:02,719 --> 00:41:05,200 Speaker 3: the big study that came out last year? I'm just 807 00:41:05,239 --> 00:41:08,520 Speaker 3: telling you, no, no, no, no. But let me tell 808 00:41:08,560 --> 00:41:12,840 Speaker 3: you what actual data says: thinking about treating social media, 809 00:41:12,920 --> 00:41:17,120 Speaker 3: thinking about it like an addiction, actually makes it significantly 810 00:41:17,200 --> 00:41:20,319 Speaker 3: harder to moderate your own use. So instead of viewing 811 00:41:20,360 --> 00:41:23,319 Speaker 3: it as a habit that you can control, viewing it 812 00:41:23,360 --> 00:41:27,520 Speaker 3: as an addiction actually makes it extremely hard to quit. 813 00:41:28,120 --> 00:41:31,239 Speaker 3: And this narrative of addiction is actually something 814 00:41:31,239 --> 00:41:34,160 Speaker 3: the tech industry themselves has pushed, because it's very, it's 815 00:41:34,280 --> 00:41:37,279 Speaker 3: very conducive to the tech industry. It allows them to 816 00:41:37,480 --> 00:41:38,520 Speaker 3: pass surveillance laws. 817 00:41:38,600 --> 00:41:40,440 Speaker 1: Taylor, it's one minute to noon your time, so 818 00:41:40,480 --> 00:41:41,840 Speaker 1: I know you have to go. You've given us a 819 00:41:41,880 --> 00:41:43,640 Speaker 1: lot to chew on. I want to ask each of 820 00:41:43,640 --> 00:41:45,600 Speaker 1: you before you go, starting with you, Taylor, since I know you 821 00:41:45,640 --> 00:41:47,839 Speaker 1: have to run: who had the best week in tech 822 00:41:47,920 --> 00:41:48,640 Speaker 1: and who had the worst 823 00:41:48,680 --> 00:41:51,359 Speaker 3: week in tech? Oh gosh, I think, who... I think 824 00:41:51,400 --> 00:41:53,520 Speaker 3: the users had the worst week in tech, again, because 825 00:41:53,520 --> 00:41:55,919 Speaker 3: we see these mass surveillance laws pass. And I would 826 00:41:55,920 --> 00:41:59,319 Speaker 3: say Mark Zuckerberg, who's been lobbying and funding, you know, 827 00:41:59,560 --> 00:42:02,640 Speaker 3: pushing identity verification laws, is having the best week. 828 00:42:02,960 --> 00:42:07,000 Speaker 4: Natasha, I would say Anthropic is having the best week. 829 00:42:07,840 --> 00:42:10,879 Speaker 4: You know, despite the short-term issues, their downloads are up. 830 00:42:10,960 --> 00:42:14,360 Speaker 4: They could not be in the news more. You know, 831 00:42:14,400 --> 00:42:17,759 Speaker 4: their frameworks of how to think about autonomous weapons and 832 00:42:17,800 --> 00:42:23,480 Speaker 4: surveillance have been adopted. I would say that, you know, 833 00:42:23,560 --> 00:42:27,960 Speaker 4: the targets of US military strikes are having the worst 834 00:42:28,000 --> 00:42:32,040 Speaker 4: week ever, because, you know, what's happening to them has 835 00:42:32,080 --> 00:42:35,239 Speaker 4: been obscured. It's been talked about just in terms of, 836 00:42:35,680 --> 00:42:39,160 Speaker 4: you know, whether or not AI was involved. Meanwhile, we 837 00:42:39,280 --> 00:42:43,399 Speaker 4: know that US strikes are often causing civilian casualties.
838 00:42:44,080 --> 00:42:48,440 Speaker 4: So, yeah, I would say the targets of this just 839 00:42:48,480 --> 00:42:50,680 Speaker 4: have gotten really passed over. 840 00:42:51,120 --> 00:42:54,200 Speaker 2: Stephen, I agree Anthropic had the best week in tech. 841 00:42:54,680 --> 00:42:56,480 Speaker 2: I mean, yeah, if you got hit by a bomb, you 842 00:42:56,480 --> 00:42:58,799 Speaker 2: probably had the worst week in tech. But other people 843 00:42:58,800 --> 00:43:03,040 Speaker 2: who had a really bad week in tech are software engineers, 844 00:43:03,600 --> 00:43:07,279 Speaker 2: particularly people who write code for a living. They 845 00:43:07,320 --> 00:43:11,520 Speaker 2: are cooked. I mean, they are in trouble. And 846 00:43:11,600 --> 00:43:14,319 Speaker 2: a lot of people went to university to learn to 847 00:43:14,400 --> 00:43:17,120 Speaker 2: code and sneered at everyone else about how poor they were 848 00:43:17,120 --> 00:43:19,600 Speaker 2: going to be because they didn't learn to code. So 849 00:43:20,239 --> 00:43:22,839 Speaker 2: now those guys are actually looking like their jobs are 850 00:43:22,840 --> 00:43:27,040 Speaker 2: going to go obsolete. It's already starting. It's already starting. 851 00:43:27,120 --> 00:43:29,799 Speaker 2: You know, they're not going to get the kind of 852 00:43:29,840 --> 00:43:33,040 Speaker 2: money they used to get paid. They've been basically functionally 853 00:43:33,080 --> 00:43:36,200 Speaker 2: made obsolete over the course of two or three months, 854 00:43:36,520 --> 00:43:37,719 Speaker 2: and none of them were prepared for it. 855 00:43:38,600 --> 00:43:41,640 Speaker 4: I would just say that there is a lot more 856 00:43:41,840 --> 00:43:46,719 Speaker 4: overlap between Steven's point about the end of software engineers, 857 00:43:47,080 --> 00:43:50,120 Speaker 4: which is a narrative really pushed a lot by Anthropic, 858 00:43:50,640 --> 00:43:52,600 Speaker 4: and a lot of the other stuff that we've been 859 00:43:52,640 --> 00:43:56,560 Speaker 4: talking about. You know, if you look at the data, 860 00:43:57,160 --> 00:43:59,759 Speaker 4: you know, in some cases people are actually not as 861 00:43:59,800 --> 00:44:03,240 Speaker 4: productive as they thought, and, you know, the ability, 862 00:44:03,440 --> 00:44:05,839 Speaker 4: like, how much we pay software engineers and how we 863 00:44:05,960 --> 00:44:10,560 Speaker 4: value their work, you know, is directly affected by these narratives. 864 00:44:10,600 --> 00:44:14,319 Speaker 4: So I would just say, like, that terror that 865 00:44:14,440 --> 00:44:18,359 Speaker 4: software engineers and everyone has about joblessness is part 866 00:44:18,400 --> 00:44:21,640 Speaker 4: of the reason that we're seeing this deal between Anthropic 867 00:44:21,680 --> 00:44:22,320 Speaker 4: and the Pentagon. 868 00:44:27,719 --> 00:44:29,719 Speaker 1: That's it for the Week in Tech. Thank you all so 869 00:44:29,800 --> 00:44:33,000 Speaker 1: much for participating in our inaugural roundtable, and we 870 00:44:33,040 --> 00:44:34,799 Speaker 1: hope you'll all be back very soon. 871 00:44:34,880 --> 00:44:36,520 Speaker 4: Thank you, thanks so much for having me. 872 00:44:36,600 --> 00:44:38,520 Speaker 3: We didn't even get to talk about the Computer Fraud 873 00:44:38,560 --> 00:44:40,520 Speaker 3: and Abuse Act, which is what I think should be 874 00:44:40,600 --> 00:44:43,960 Speaker 3: reformed next week.
875 00:44:44,160 --> 00:45:03,160 Speaker 1: Next week. For Tech Stuff, I'm Oz Woloshyn. This 876 00:45:03,239 --> 00:45:06,680 Speaker 1: episode was produced by Eliza Dennis and Melissa Slaughter, executive 877 00:45:06,680 --> 00:45:09,960 Speaker 1: produced by me, Julian Nutter, and Kate Osborne for Kaleidoscope, 878 00:45:10,080 --> 00:45:13,920 Speaker 1: and Katrina Norvell for iHeart Podcasts. The engineer is Charles 879 00:45:13,920 --> 00:45:17,560 Speaker 1: de Montebello from CDM Studios. Jack Insley mixed this episode, 880 00:45:17,600 --> 00:45:20,680 Speaker 1: and Kyle Murdoch wrote our theme song. A special thank 881 00:45:20,719 --> 00:45:25,399 Speaker 1: you to Natasha Tiku, Stephen Witt, and Taylor Lorenz. Please 882 00:45:25,480 --> 00:45:27,480 Speaker 1: check out all the work they put out into the world. 883 00:45:27,680 --> 00:45:29,560 Speaker 1: We're lucky to call them friends of the pod, and 884 00:45:29,640 --> 00:45:31,960 Speaker 1: please do rate, review, and reach out to us at 885 00:45:32,000 --> 00:45:48,080 Speaker 1: tech Stuff Podcast at gmail dot com.