1 00:00:14,080 --> 00:00:17,599 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm 2 00:00:17,600 --> 00:00:18,560 Speaker 1: Oz Woloshyn, and. 3 00:00:18,520 --> 00:00:19,360 Speaker 2: I'm Kara Price. 4 00:00:19,640 --> 00:00:23,680 Speaker 1: Today we get into what the aftermath of Charlie Kirk's 5 00:00:23,720 --> 00:00:27,880 Speaker 1: assassination tells us about the state of content moderation, and 6 00:00:27,960 --> 00:00:30,880 Speaker 1: also how teens in New York City are responding to 7 00:00:30,960 --> 00:00:33,440 Speaker 1: having their smartphones banned in schools. 8 00:00:34,080 --> 00:00:37,120 Speaker 2: Then, on Chat and Me, a woman turns to ChatGPT 9 00:00:37,360 --> 00:00:39,720 Speaker 2: to help her mom with a mysterious ailment. 10 00:00:40,360 --> 00:00:45,000 Speaker 3: Her life was very diminished because she couldn't walk upstairs, 11 00:00:45,000 --> 00:00:47,680 Speaker 3: she was having really poor sleeps because the pain would 12 00:00:47,680 --> 00:00:49,960 Speaker 3: wake her up in the night, and overall, she just 13 00:00:50,000 --> 00:00:52,760 Speaker 3: started to feel like she was really declining in her health overall. 14 00:00:52,800 --> 00:00:56,680 Speaker 1: All of that on the Week in Tech. It's Friday, 15 00:00:56,920 --> 00:01:03,000 Speaker 1: September nineteenth. 16 00:01:03,080 --> 00:01:04,440 Speaker 2: Hi Kara. Hi Oz. 17 00:01:04,800 --> 00:01:08,040 Speaker 1: So normally we start these episodes a bit more lighthearted, 18 00:01:08,319 --> 00:01:10,440 Speaker 1: but of course, the major news of the last week 19 00:01:10,480 --> 00:01:14,040 Speaker 1: has been anything but lighthearted, with the assassination of the 20 00:01:14,120 --> 00:01:16,479 Speaker 1: right-wing pundit and influencer Charlie Kirk. 21 00:01:16,840 --> 00:01:19,360 Speaker 2: Yeah, it seems kind of like the only news story 22 00:01:19,440 --> 00:01:19,840 Speaker 2: right now. 23 00:01:20,000 --> 00:01:22,080 Speaker 1: Yeah, and obviously it does have a lot of overlap 24 00:01:22,120 --> 00:01:26,759 Speaker 1: with tech, especially social media. From the gaming references inscribed 25 00:01:26,800 --> 00:01:29,800 Speaker 1: on the bullets, to the Internet communities the shooter was 26 00:01:29,880 --> 00:01:32,920 Speaker 1: part of, to the way Charlie Kirk himself sort of 27 00:01:32,920 --> 00:01:36,560 Speaker 1: reshaped the way social media is used for political organizing. 28 00:01:36,920 --> 00:01:40,040 Speaker 2: Yeah, and of course there are the countless memes and takes 29 00:01:40,200 --> 00:01:42,200 Speaker 2: following a major news event like this. 30 00:01:42,720 --> 00:01:44,640 Speaker 1: Yeah. I was actually talking to a friend of mine 31 00:01:44,680 --> 00:01:48,400 Speaker 1: the other day about the video of the assassination, and 32 00:01:48,960 --> 00:01:51,280 Speaker 1: he said to me, you know, in the old days 33 00:01:51,400 --> 00:01:54,240 Speaker 1: on the Internet, I might have to go looking for 34 00:01:54,320 --> 00:01:59,080 Speaker 1: this kind of horrific, very disturbing, almost real time content, 35 00:01:59,560 --> 00:02:02,880 Speaker 1: but these days I have to basically do everything I 36 00:02:02,960 --> 00:02:05,920 Speaker 1: can to avoid seeing it.
And that really stuck with me, 37 00:02:06,000 --> 00:02:08,520 Speaker 1: because in the past, you would have to go in 38 00:02:08,560 --> 00:02:12,240 Speaker 1: search of this kind of confrontation with a video of 39 00:02:12,280 --> 00:02:16,680 Speaker 1: someone being murdered in plain sight, but now it's plastered 40 00:02:16,720 --> 00:02:19,399 Speaker 1: over every social media app. Over the last few months, 41 00:02:19,440 --> 00:02:22,880 Speaker 1: we've seen all kinds of stories reporting about how the 42 00:02:22,919 --> 00:02:26,440 Speaker 1: social media platforms have been pulling back on content moderation, 43 00:02:27,080 --> 00:02:29,520 Speaker 1: either because of the politics of the moment or simply 44 00:02:29,680 --> 00:02:30,680 Speaker 1: to save costs. 45 00:02:31,120 --> 00:02:33,239 Speaker 2: Yeah. You know, I've also seen a bunch of stories 46 00:02:33,280 --> 00:02:37,800 Speaker 2: about companies like TikTok and Meta replacing human content moderators 47 00:02:37,800 --> 00:02:40,720 Speaker 2: with AI, which apparently has had mixed results. 48 00:02:41,000 --> 00:02:43,400 Speaker 1: I'm going to put my hand up, because on the 49 00:02:43,440 --> 00:02:46,000 Speaker 1: face of it, I'm not that drawn to stories about 50 00:02:46,080 --> 00:02:46,880 Speaker 1: content moderation. 51 00:02:47,040 --> 00:02:48,840 Speaker 2: You fight against them, actually. 52 00:02:48,960 --> 00:02:51,280 Speaker 1: I fight against them. We get pitched them quite regularly 53 00:02:51,320 --> 00:02:55,119 Speaker 1: by our wonderful producers. The last time was when Facebook 54 00:02:55,160 --> 00:02:58,799 Speaker 1: replaced their content moderation system largely with these things called 55 00:02:58,880 --> 00:03:02,440 Speaker 1: community notes, where the community itself flags if something is 56 00:03:02,560 --> 00:03:07,160 Speaker 1: misinformation or hateful, et cetera, et cetera. But this week is 57 00:03:07,160 --> 00:03:10,640 Speaker 1: obviously something we couldn't avoid, and 58 00:03:10,840 --> 00:03:13,000 Speaker 1: I almost feel a little bit embarrassed about not having 59 00:03:13,160 --> 00:03:15,000 Speaker 1: focused on this earlier, because it feels like one of 60 00:03:15,000 --> 00:03:18,280 Speaker 1: those frog in the pot moments where it's a little late 61 00:03:18,160 --> 00:03:20,079 Speaker 2: to jump out. And you think we're boiling? 62 00:03:20,240 --> 00:03:22,840 Speaker 1: I think we're boiling, or perhaps even boiled. It was 63 00:03:22,840 --> 00:03:24,840 Speaker 1: a story in Wired that really brought it home to me, 64 00:03:25,000 --> 00:03:28,360 Speaker 1: with the headline Charlie Kirk Was Shot and Killed in 65 00:03:28,360 --> 00:03:31,839 Speaker 1: a Post-Content Moderation World. The Wired article points out 66 00:03:31,840 --> 00:03:34,200 Speaker 1: that very few of the videos of the shooting that 67 00:03:34,200 --> 00:03:37,480 Speaker 1: are all over various social media have content warnings, and 68 00:03:37,520 --> 00:03:40,480 Speaker 1: actually a ton of them play automatically before viewers have 69 00:03:40,560 --> 00:03:42,680 Speaker 1: the chance to consent to what they're about to see.
70 00:03:42,880 --> 00:03:45,400 Speaker 2: Yeah, autoplay in this sense is not your friend, 71 00:03:45,680 --> 00:03:49,920 Speaker 2: and I think it probably accounts for the dissemination of 72 00:03:49,960 --> 00:03:53,000 Speaker 2: this video more so than just the average person looking 73 00:03:53,040 --> 00:03:56,560 Speaker 2: for it. And whenever a video like this most recent 74 00:03:56,640 --> 00:03:59,840 Speaker 2: Charlie Kirk killing video comes out, I think about the 75 00:04:00,240 --> 00:04:02,880 Speaker 2: speed at which video is now disseminated and how ill-prepared 76 00:04:02,920 --> 00:04:04,200 Speaker 2: the Internet is to handle it. 77 00:04:04,600 --> 00:04:07,960 Speaker 1: Yeah. Also, I mean, shooting in four K from close up. 78 00:04:08,080 --> 00:04:10,960 Speaker 1: I mean, the horror of this video, at least in 79 00:04:11,000 --> 00:04:14,360 Speaker 1: what I've read about it, is quite striking. Experts in 80 00:04:14,400 --> 00:04:16,599 Speaker 1: the Wired piece who have been tracking the spread of 81 00:04:16,640 --> 00:04:19,120 Speaker 1: the videos online say that a lot of the platforms 82 00:04:19,160 --> 00:04:22,640 Speaker 1: are actually failing to enforce their own content moderation rules, 83 00:04:23,120 --> 00:04:25,239 Speaker 1: but also that it's kind of a tricky situation, because 84 00:04:25,240 --> 00:04:29,120 Speaker 1: the video falls in between quote graphic content, which is allowed, 85 00:04:29,200 --> 00:04:32,640 Speaker 1: and quote glorified violence, which usually isn't, right? 86 00:04:32,680 --> 00:04:34,480 Speaker 2: And then there's the fact that this is like a 87 00:04:34,600 --> 00:04:37,919 Speaker 2: major news event, which I'm sure makes it even more complicated. 88 00:04:38,240 --> 00:04:43,560 Speaker 1: Definitely. It's also the culmination of some fairly major political 89 00:04:43,720 --> 00:04:47,520 Speaker 1: and philosophical changes to the way social media platforms are run. 90 00:04:48,080 --> 00:04:50,680 Speaker 1: Of course, we all remember when Elon Musk bought Twitter 91 00:04:50,680 --> 00:04:53,760 Speaker 1: back in twenty twenty two. He loosened content restrictions pretty 92 00:04:53,839 --> 00:04:56,720 Speaker 1: much immediately, and said this was in response to 93 00:04:56,720 --> 00:05:00,320 Speaker 1: what he claimed was suppression of conservative-leaning posts. 94 00:05:00,720 --> 00:05:04,120 Speaker 1: Then YouTube and Meta both followed suit. In twenty twenty three, 95 00:05:04,640 --> 00:05:07,919 Speaker 1: YouTube said that curbing misinformation could lead to quote the 96 00:05:08,000 --> 00:05:12,360 Speaker 1: unintended effect of curtailing political speech without meaningfully reducing the 97 00:05:12,480 --> 00:05:15,640 Speaker 1: risk of violence or other real world harm. In twenty 98 00:05:15,680 --> 00:05:19,719 Speaker 1: twenty five, Meta cited quote recent elections as a reason 99 00:05:19,839 --> 00:05:23,600 Speaker 1: to quote remove restrictions on topics like immigration and gender 100 00:05:23,920 --> 00:05:25,880 Speaker 1: that are out of touch with mainstream discourse. 101 00:05:26,200 --> 00:05:29,400 Speaker 2: Yeah, and now the whole Internet is basically 4chan. 102 00:05:29,560 --> 00:05:32,760 Speaker 1: I mean.
The interesting thing, though, as Wired points out, 103 00:05:32,839 --> 00:05:36,280 Speaker 1: is that the more established platforms do still have real 104 00:05:36,360 --> 00:05:40,560 Speaker 1: rules around content moderation, and in many cases the circulation 105 00:05:40,680 --> 00:05:44,800 Speaker 1: and distribution of the assassination footage violates those rules. But 106 00:05:44,880 --> 00:05:48,280 Speaker 1: nonetheless, the videos are getting millions and millions of views 107 00:05:48,360 --> 00:05:50,440 Speaker 1: and are still up and circulating widely. 108 00:05:51,000 --> 00:05:53,000 Speaker 2: So what is your takeaway from all of this? 109 00:05:53,440 --> 00:05:58,400 Speaker 1: I mean, it's this kind of confection of corrosive and 110 00:05:58,680 --> 00:06:05,560 Speaker 1: divisive rhetoric online translating into offline, translating back into online. 111 00:06:06,040 --> 00:06:10,200 Speaker 1: This kind of vicious circle, or ouroboros, of radicalization. 112 00:06:10,600 --> 00:06:14,560 Speaker 1: And there's an expert quoted who'd observed people on X 113 00:06:14,920 --> 00:06:18,640 Speaker 1: commenting on the video of the Charlie Kirk assassination saying 114 00:06:18,640 --> 00:06:22,040 Speaker 1: that it had radicalized them. Now, they didn't say in which direction. 115 00:06:22,160 --> 00:06:24,440 Speaker 1: We may be able to infer that it radicalized them 116 00:06:24,760 --> 00:06:27,320 Speaker 1: against the left. But either way, I mean, this is 117 00:06:27,400 --> 00:06:30,800 Speaker 1: just another moment that I'm very concerned, and I 118 00:06:30,839 --> 00:06:34,920 Speaker 1: think everyone is, will further harden the lines and further 119 00:06:35,320 --> 00:06:39,400 Speaker 1: feed this beast of online radicalization and real world violence. 120 00:06:39,640 --> 00:06:43,200 Speaker 2: You know, this story really encapsulates practically all of the 121 00:06:43,240 --> 00:06:46,680 Speaker 2: harms that people fear social media might unleash onto 122 00:06:46,720 --> 00:06:48,919 Speaker 2: the world. The story that I want to report on 123 00:06:49,000 --> 00:06:52,719 Speaker 2: today has to do with the effects of technology 124 00:06:52,800 --> 00:06:56,360 Speaker 2: on our younger citizens, and is a little more lighthearted. 125 00:06:56,520 --> 00:06:59,200 Speaker 2: Have you heard about the cell phone ban in New York? 126 00:06:59,560 --> 00:07:01,200 Speaker 1: A little bit, but tell me more. 127 00:07:01,600 --> 00:07:05,120 Speaker 2: So, New York City public schools recently became the largest district 128 00:07:05,400 --> 00:07:08,080 Speaker 2: in the United States to ban students from using cell 129 00:07:08,120 --> 00:07:10,320 Speaker 2: phones during the school day. This is a state 130 00:07:10,360 --> 00:07:13,040 Speaker 2: wide law, and it's something that's happening more and more 131 00:07:13,120 --> 00:07:17,680 Speaker 2: across the country. California and Louisiana also have restrictions, and 132 00:07:17,960 --> 00:07:20,440 Speaker 2: this week marks the second week since the ban went 133 00:07:20,480 --> 00:07:21,760 Speaker 2: into effect in New York. 134 00:07:22,200 --> 00:07:24,600 Speaker 1: I kind of can't believe this is happening. It feels 135 00:07:24,680 --> 00:07:28,080 Speaker 1: like the pipe dream of so many digital and social 136 00:07:28,160 --> 00:07:31,720 Speaker 1: media theorists being enacted in real life. How's it going?
137 00:07:31,880 --> 00:07:33,160 Speaker 2: You know, when I saw it, I was like, is 138 00:07:33,200 --> 00:07:35,679 Speaker 2: this for real? You know, like, it feels sort 139 00:07:35,680 --> 00:07:39,400 Speaker 2: of unbelievable. I actually did some reporting on this. We 140 00:07:39,440 --> 00:07:42,320 Speaker 2: have a friend whose child, Ruby, is a middle schooler 141 00:07:42,400 --> 00:07:45,560 Speaker 2: in Brooklyn, New York, and they were kind enough to 142 00:07:45,600 --> 00:07:47,920 Speaker 2: send us some voice memos about their experience with the 143 00:07:47,920 --> 00:07:50,120 Speaker 2: phone ban. So here are some of the things they 144 00:07:50,120 --> 00:07:51,840 Speaker 2: say are tough about the phone ban. 145 00:07:52,320 --> 00:07:55,280 Speaker 4: The one good thing about no phones at school is 146 00:07:55,320 --> 00:07:58,280 Speaker 4: that we can't take videos of other people. I mean, 147 00:07:58,320 --> 00:08:02,040 Speaker 4: sometimes we want to, like, record something within our friend 148 00:08:02,040 --> 00:08:05,200 Speaker 4: group that we think is funny, like recording one of 149 00:08:05,280 --> 00:08:09,600 Speaker 4: us eating a ginormous hamburger or something funny like that. 150 00:08:09,800 --> 00:08:13,320 Speaker 4: Sometimes it's hard when it's like a fight or something. It's really not good to have 151 00:08:13,640 --> 00:08:14,880 Speaker 4: 152 00:08:14,840 --> 00:08:15,440 Speaker 4: a phone out. 153 00:08:15,840 --> 00:08:18,680 Speaker 1: I mean, not having a digital record of all of 154 00:08:18,720 --> 00:08:22,280 Speaker 1: the shenanigans at school is something I think many kids 155 00:08:22,320 --> 00:08:25,000 Speaker 1: will be grateful for in future, and parents are probably 156 00:08:25,040 --> 00:08:28,760 Speaker 1: grateful for today. But how does this actually work in practice? 157 00:08:28,880 --> 00:08:31,320 Speaker 1: How do you stop kids from using their phones at school? 158 00:08:31,520 --> 00:08:35,000 Speaker 2: Different schools are actually taking different approaches to the phone ban. 159 00:08:35,240 --> 00:08:38,760 Speaker 2: Most of them either confiscate students' phones at the beginning 160 00:08:38,840 --> 00:08:40,800 Speaker 2: of the day and put them in lockers, or these 161 00:08:40,840 --> 00:08:44,920 Speaker 2: magnetic pouches that have locks in them. But Ruby also 162 00:08:44,960 --> 00:08:48,040 Speaker 2: said that some kids and their parents just, like, literally 163 00:08:48,080 --> 00:08:49,959 Speaker 2: don't care about the rules at all. 164 00:08:50,559 --> 00:08:54,200 Speaker 5: A lot of kids just don't give them their phones 165 00:08:54,240 --> 00:08:56,760 Speaker 5: and they lie that they don't have a device, and 166 00:08:56,840 --> 00:09:00,160 Speaker 5: their parents sign it because their parents don't really care, 167 00:09:00,360 --> 00:09:02,360 Speaker 5: and then they just have their phones out at lunch. 168 00:09:02,720 --> 00:09:05,040 Speaker 2: This probably would have been me, to be honest. 169 00:09:04,840 --> 00:09:06,760 Speaker 1: Would your parents have signed the fake waiver saying that you 170 00:09:06,800 --> 00:09:07,960 Speaker 1: actually don't have a phone? 171 00:09:08,440 --> 00:09:11,839 Speaker 2: No, they were kind of rule followers, unfortunately. But I 172 00:09:11,840 --> 00:09:13,920 Speaker 2: would have figured out a way to hack it. That would 173 00:09:13,920 --> 00:09:15,360 Speaker 2: have been part of the fun of going to school.
174 00:09:15,679 --> 00:09:17,800 Speaker 2: There was this article that really caught my eye from 175 00:09:17,840 --> 00:09:22,120 Speaker 2: Gothamist, which has this amazing headline: From burner phones 176 00:09:22,200 --> 00:09:25,360 Speaker 2: to decks of cards, NYC teens are adjusting to 177 00:09:25,400 --> 00:09:28,760 Speaker 2: the smartphone ban. And the reporters of this article talk 178 00:09:28,840 --> 00:09:31,360 Speaker 2: to a bunch of teenagers about how they're adapting to 179 00:09:31,400 --> 00:09:33,920 Speaker 2: the ban, and a lot of these teenagers are embracing 180 00:09:34,240 --> 00:09:37,680 Speaker 2: their low tech school day. Now they're playing with Polaroid cameras, 181 00:09:37,720 --> 00:09:40,920 Speaker 2: which have made a huge comeback. And it's just interesting 182 00:09:41,000 --> 00:09:44,000 Speaker 2: to me that teens have been forced into the dark 183 00:09:44,080 --> 00:09:46,400 Speaker 2: and have found new light with things that we were 184 00:09:46,400 --> 00:09:48,680 Speaker 2: just doing twenty years ago. 185 00:09:48,960 --> 00:09:51,360 Speaker 1: Yeah, I mean, the twenty year ago equivalent of this: 186 00:09:51,440 --> 00:09:53,800 Speaker 1: when I was a kid, my greatest pleasure was watching 187 00:09:53,840 --> 00:09:56,320 Speaker 1: TV and playing with my PlayStation One. I lived in 188 00:09:56,320 --> 00:09:59,240 Speaker 1: the countryside in England and we would have these blackouts 189 00:09:59,280 --> 00:10:01,400 Speaker 1: or power outages from time to time, I don't know, 190 00:10:01,440 --> 00:10:04,200 Speaker 1: once every six months, and my mom would bring out 191 00:10:04,200 --> 00:10:06,280 Speaker 1: the candles and then we'd play a board game or 192 00:10:06,320 --> 00:10:09,040 Speaker 1: play Uno or something, which had this kind of sweet 193 00:10:09,080 --> 00:10:11,760 Speaker 1: nostalgic quality to it. At the same time, I was 194 00:10:11,840 --> 00:10:13,920 Speaker 1: very happy that we were plugged in most of the time. 195 00:10:13,920 --> 00:10:15,079 Speaker 1: I didn't have to do it more than a couple 196 00:10:15,120 --> 00:10:15,760 Speaker 1: of times a year. 197 00:10:15,960 --> 00:10:17,920 Speaker 2: And I think what got you into Oxford was how 198 00:10:17,920 --> 00:10:20,679 Speaker 2: good you were at Uno. Had you been playing PlayStation 199 00:10:20,800 --> 00:10:24,120 Speaker 2: One that whole time? God knows. But in all seriousness, 200 00:10:24,160 --> 00:10:27,920 Speaker 2: there does seem to be a sort of nostalgia among 201 00:10:28,000 --> 00:10:31,200 Speaker 2: Gen Z for the aesthetics of, really, even the two thousands, 202 00:10:31,240 --> 00:10:33,680 Speaker 2: not even so much the nineties, like the two thousands, 203 00:10:33,720 --> 00:10:35,760 Speaker 2: which is so funny to me, having nostalgia for 204 00:10:35,760 --> 00:10:38,280 Speaker 2: a time where I was just a teenager. One student 205 00:10:38,360 --> 00:10:40,560 Speaker 2: quoted in the article said she's looking into whether the 206 00:10:40,600 --> 00:10:43,200 Speaker 2: school would allow her to bring an MP three 207 00:10:43,080 --> 00:10:46,559 Speaker 1: player, otherwise known as an iPod. And 208 00:10:46,520 --> 00:10:48,839 Speaker 2: a teacher talked about how some of his students brought 209 00:10:48,880 --> 00:10:52,240 Speaker 2: in a transistor radio, but they didn't know they needed 210 00:10:52,240 --> 00:10:54,400 Speaker 2: to extend the antenna, which he had to help them with.
211 00:10:54,760 --> 00:10:56,760 Speaker 1: I love that. That is a 212 00:10:56,760 --> 00:10:59,080 Speaker 1: true teachable moment, because this is almost like a 213 00:10:59,120 --> 00:11:03,000 Speaker 1: scene from a feel good eighties TV 214 00:11:03,080 --> 00:11:06,439 Speaker 1: show where the teacher helps: no, son, 215 00:11:06,520 --> 00:11:11,800 Speaker 1: you have to extend the antenna. Exactly. Beyond fueling the 216 00:11:11,800 --> 00:11:15,800 Speaker 1: demand for nostalgia tech, is the phone ban really changing 217 00:11:15,880 --> 00:11:17,440 Speaker 1: much about the experience of being at school? 218 00:11:17,640 --> 00:11:20,880 Speaker 2: So that same teacher who extended the antenna said that actually 219 00:11:20,960 --> 00:11:24,360 Speaker 2: the lunch room is noticeably louder, in a good way. 220 00:11:24,520 --> 00:11:27,719 Speaker 2: He said that previously the lunch room was muted, and 221 00:11:27,920 --> 00:11:31,280 Speaker 2: that this ban has really lifted a pall. The teacher said 222 00:11:31,280 --> 00:11:33,560 Speaker 2: this is a huge difference from last year, when kids 223 00:11:33,559 --> 00:11:36,240 Speaker 2: would spend twenty minutes in the bathroom checking their phone 224 00:11:36,559 --> 00:11:38,880 Speaker 2: and walk through the halls in silence with their heads down. 225 00:11:39,120 --> 00:11:40,960 Speaker 2: I know this is true because it sounds like a 226 00:11:40,960 --> 00:11:42,319 Speaker 2: description of me in my own house. 227 00:11:43,400 --> 00:11:46,880 Speaker 1: Unfortunately, me too. Ruby mentioned the idea of getting your 228 00:11:46,880 --> 00:11:48,760 Speaker 1: parents to write letters saying you don't have a phone 229 00:11:48,760 --> 00:11:50,840 Speaker 1: in the first place. It's one kind of workaround, 230 00:11:51,040 --> 00:11:52,960 Speaker 1: but how compliant are the kids being? What are the 231 00:11:53,000 --> 00:11:54,440 Speaker 1: ways around this they're looking for? 232 00:11:54,480 --> 00:11:56,240 Speaker 2: They're coming up with all kinds of ways. So the 233 00:11:56,280 --> 00:11:58,640 Speaker 2: pouches that I told you about, that are supposed to 234 00:11:58,679 --> 00:12:01,199 Speaker 2: make phones inaccessible during the day, there are all of these 235 00:12:01,240 --> 00:12:05,199 Speaker 2: TikToks now showing students breaking into them, and Ruby actually 236 00:12:05,240 --> 00:12:08,280 Speaker 2: says that trying to get them open has become a 237 00:12:08,320 --> 00:12:12,800 Speaker 2: common downtime activity for students. Kids also use their school 238 00:12:12,840 --> 00:12:16,200 Speaker 2: issued laptops to send each other emails or to chat 239 00:12:16,240 --> 00:12:19,400 Speaker 2: with each other using Google Docs. Like, imagine being so desperate 240 00:12:19,440 --> 00:12:23,160 Speaker 2: that you're DMing in Google Docs. And of course, you know, 241 00:12:23,280 --> 00:12:25,680 Speaker 2: kids still have access to their phones all the time 242 00:12:25,679 --> 00:12:27,840 Speaker 2: that they're not in school, which is most of the day. 243 00:12:28,200 --> 00:12:30,800 Speaker 2: A couple of teachers in the article talked about chaotic 244 00:12:30,840 --> 00:12:33,280 Speaker 2: scenes at the end of the day when kids finally 245 00:12:33,320 --> 00:12:36,480 Speaker 2: get their phones back, which made me laugh.
But overall, 246 00:12:36,840 --> 00:12:39,679 Speaker 2: the ban seems to be going kind of well, and 247 00:12:39,760 --> 00:12:42,240 Speaker 2: it may be something that becomes a standard for students 248 00:12:42,320 --> 00:12:45,000 Speaker 2: around the globe. You know, over half of US states have 249 00:12:45,040 --> 00:12:47,480 Speaker 2: at least a partial ban in place, and there are 250 00:12:47,480 --> 00:12:49,840 Speaker 2: a ton of other countries that have student cell phone 251 00:12:49,880 --> 00:12:54,239 Speaker 2: bans with varying degrees of strictness. Denmark, of course, Australia. 252 00:12:54,400 --> 00:12:56,120 Speaker 2: France has had one for a while but made it 253 00:12:56,160 --> 00:12:59,400 Speaker 2: even more strict earlier this year, and China has had 254 00:12:59,440 --> 00:13:01,160 Speaker 2: one since twenty twenty one. 255 00:13:01,480 --> 00:13:04,600 Speaker 1: It's sort of hard to imagine why it took so long. 256 00:13:05,920 --> 00:13:07,720 Speaker 1: Are there any people who are objecting to this? Apart 257 00:13:07,720 --> 00:13:09,920 Speaker 1: from, obviously, the students who want to be on their phones 258 00:13:09,720 --> 00:13:13,200 Speaker 2: all day. There are objectors. Not everybody is happy. The 259 00:13:13,240 --> 00:13:16,520 Speaker 2: president of the National Parents Union actually wrote an op 260 00:13:16,679 --> 00:13:19,640 Speaker 2: ed in USA Today saying what we really need are 261 00:13:19,760 --> 00:13:23,199 Speaker 2: smarter rules around phone usage, and that treating all phone 262 00:13:23,280 --> 00:13:26,000 Speaker 2: use as a distraction can cause kids to miss out on 263 00:13:26,040 --> 00:13:28,240 Speaker 2: the ways phones can be helpful to students. 264 00:13:28,360 --> 00:13:31,560 Speaker 1: With all due respect to the president of the National Parents' Union, 265 00:13:31,600 --> 00:13:34,440 Speaker 1: it feels like there's a certain self-selecting quality to that job. 266 00:13:36,720 --> 00:13:40,760 Speaker 1: Is your job to play devil's advocate on every single issue? 267 00:13:40,800 --> 00:13:43,760 Speaker 2: That's right. She argues phones can keep kids safe during 268 00:13:43,800 --> 00:13:46,800 Speaker 2: emergencies and that we should be preparing them for adulthood 269 00:13:46,800 --> 00:13:49,080 Speaker 2: in the modern world by teaching them to have healthy 270 00:13:49,120 --> 00:13:52,040 Speaker 2: relationships with their phones. And also, some parents don't like 271 00:13:52,080 --> 00:13:53,960 Speaker 2: that they suddenly can't get a hold of their kids 272 00:13:54,040 --> 00:13:54,880 Speaker 2: during the school day. 273 00:13:55,280 --> 00:13:55,520 Speaker 4: Yeah. 274 00:13:55,600 --> 00:13:58,320 Speaker 1: I mean also, for a parent of kids in school, 275 00:13:58,679 --> 00:14:03,559 Speaker 1: the reality of emergency situations, violence, et cetera, is also true. 276 00:14:03,600 --> 00:14:05,840 Speaker 1: So I do understand why parents want to be in 277 00:14:05,840 --> 00:14:08,319 Speaker 1: contact with their kids. But as always in life, you're 278 00:14:08,320 --> 00:14:10,439 Speaker 1: weighing harms, right, and the harm of having your kid 279 00:14:10,480 --> 00:14:12,840 Speaker 1: on their phone all the time at school seems to 280 00:14:12,840 --> 00:14:14,680 Speaker 1: me, as somebody who's not a parent, to be quite 281 00:14:14,720 --> 00:14:16,920 Speaker 1: significant, and a good thing to mitigate.
282 00:14:17,280 --> 00:14:20,080 Speaker 2: Yeah, you know, I think on the whole people really 283 00:14:20,200 --> 00:14:22,800 Speaker 2: like the effects these bans are having, and not just 284 00:14:22,840 --> 00:14:25,360 Speaker 2: in New York. I actually read this blog post that 285 00:14:25,440 --> 00:14:28,720 Speaker 2: aggregated a bunch of social media reactions from teachers in 286 00:14:28,800 --> 00:14:32,080 Speaker 2: states with student phone bans across the country. One teacher 287 00:14:32,120 --> 00:14:34,440 Speaker 2: was saying, has the solution really been this easy the 288 00:14:34,480 --> 00:14:37,520 Speaker 2: whole time? Teachers across the country are saying that libraries 289 00:14:37,560 --> 00:14:40,600 Speaker 2: are busier, behavior issues are down, and that, you know, 290 00:14:40,640 --> 00:14:43,640 Speaker 2: this is anecdotal, but still. One teacher even wrote that 291 00:14:43,880 --> 00:14:46,080 Speaker 2: kids are passing notes to each other in class instead 292 00:14:46,080 --> 00:14:48,600 Speaker 2: of texting and called it quote beautiful. 293 00:14:49,840 --> 00:14:52,440 Speaker 1: When I was in school, passing notes was quote against 294 00:14:52,480 --> 00:14:55,400 Speaker 1: the rules, and you'd get punished. But I guess it's not 295 00:14:55,440 --> 00:15:02,360 Speaker 1: just kids who are nostalgic. Teachers get to be nostalgic too. 296 00:15:04,160 --> 00:15:07,680 Speaker 1: After the break, Elon Musk is briefly supplanted as the 297 00:15:07,800 --> 00:15:10,840 Speaker 1: richest man in the world. A podcast company puts out 298 00:15:10,960 --> 00:15:13,480 Speaker 1: three thousand episodes a week with the help of AI, 299 00:15:13,840 --> 00:15:17,080 Speaker 1: and British members of Parliament get accused of using ChatGPT 300 00:15:17,240 --> 00:15:32,280 Speaker 1: to write their speeches. Stay with us. We're back, and 301 00:15:32,280 --> 00:15:34,360 Speaker 1: we've got a few more headlines for you this week, and 302 00:15:34,240 --> 00:15:36,920 Speaker 2: then a story about how ChatGPT helped a woman 303 00:15:37,080 --> 00:15:40,000 Speaker 2: diagnose her mother's mysterious knee problem. 304 00:15:40,280 --> 00:15:44,480 Speaker 1: But first, Kara, have you heard of a man named Larry Ellison? 305 00:15:44,280 --> 00:15:46,960 Speaker 2: Owner of the island of Lanai? I have. 306 00:15:47,680 --> 00:15:49,880 Speaker 1: That's his Hawaiian redoubt. What else do you know about him? 307 00:15:50,120 --> 00:15:52,960 Speaker 2: You know, he really burst into consciousness this year. He's 308 00:15:53,080 --> 00:15:56,320 Speaker 2: obviously the founder of Oracle. He's been spending a bunch 309 00:15:56,360 --> 00:15:58,760 Speaker 2: of time at the White House. We talked about him 310 00:15:58,760 --> 00:16:02,000 Speaker 2: announcing the five hundred billion dollar Stargate data center project 311 00:16:02,040 --> 00:16:04,680 Speaker 2: with Donald Trump just a few days after the inauguration. 312 00:16:05,200 --> 00:16:08,800 Speaker 2: He also helped finance his son David's acquisition of Paramount, 313 00:16:09,120 --> 00:16:12,920 Speaker 2: and apparently they're also looking at buying Warner Brothers Discovery. 314 00:16:13,200 --> 00:16:15,200 Speaker 2: I think the company is going to be called Para 315 00:16:15,320 --> 00:16:17,520 Speaker 2: Brothers Discover Warner.
316 00:16:17,920 --> 00:16:21,440 Speaker 1: Twenty twenty five really has been the year of Larry Ellison, 317 00:16:21,800 --> 00:16:24,800 Speaker 1: which is impressive given he's eighty one years old. I 318 00:16:24,840 --> 00:16:27,560 Speaker 1: read a piece in Bloomberg about him that charts some 319 00:16:27,640 --> 00:16:31,720 Speaker 1: of his enthusiasms. There is the private island, which you mentioned. 320 00:16:32,120 --> 00:16:34,560 Speaker 1: He also flies a jet, of course. He once shattered 321 00:16:34,640 --> 00:16:37,680 Speaker 1: his elbow in a high speed bicycle crash and 322 00:16:37,680 --> 00:16:41,000 Speaker 1: punctured his lung body surfing. He also had a cameo 323 00:16:41,200 --> 00:16:44,600 Speaker 1: in the Marvel movie Iron Man Two. And for one 324 00:16:45,080 --> 00:16:49,640 Speaker 1: brief glimmering moment last week, he became the world's richest man, 325 00:16:49,800 --> 00:16:54,080 Speaker 1: overtaking Elon. Ellison's personal wealth peaked at three hundred and 326 00:16:54,160 --> 00:16:57,360 Speaker 1: ninety three billion dollars, just ahead of Elon at three 327 00:16:57,440 --> 00:16:59,200 Speaker 1: hundred and eighty four billion dollars. 328 00:16:59,800 --> 00:17:02,040 Speaker 2: How did he overtake Elon? 329 00:17:02,400 --> 00:17:05,719 Speaker 1: Well, it all happened because last Tuesday, Oracle announced their 330 00:17:05,760 --> 00:17:09,680 Speaker 1: quarterly results, and they surprised the market with extraordinary growth 331 00:17:09,840 --> 00:17:14,480 Speaker 1: in spend on cloud computing services, driven by AI demand 332 00:17:14,600 --> 00:17:17,760 Speaker 1: at other tech companies, essentially. So they announced not one, 333 00:17:17,960 --> 00:17:21,840 Speaker 1: not two, not three, but four multi billion dollar AI 334 00:17:21,960 --> 00:17:25,880 Speaker 1: contracts for their cloud services with three different clients. One 335 00:17:25,880 --> 00:17:28,480 Speaker 1: of those deals was with OpenAI and was worth 336 00:17:28,680 --> 00:17:32,920 Speaker 1: three hundred billion dollars. So Oracle shares shot up by 337 00:17:32,920 --> 00:17:37,800 Speaker 1: thirty six percent, and Larry Ellison is Oracle's largest shareholder, 338 00:17:38,119 --> 00:17:41,199 Speaker 1: owning forty percent of the company. So the events of 339 00:17:41,280 --> 00:17:44,840 Speaker 1: last week shot his personal net worth up by one 340 00:17:44,920 --> 00:17:46,080 Speaker 1: hundred billion dollars. 341 00:17:46,359 --> 00:17:49,240 Speaker 2: I saw that he fell back behind Elon, though. 342 00:17:49,240 --> 00:17:52,159 Speaker 1: He did fall back behind Elon, but I think they're close, 343 00:17:52,320 --> 00:17:56,320 Speaker 1: although Elon's crown apparently is safe for now. Our friend 344 00:17:56,359 --> 00:17:58,959 Speaker 1: Nick Thompson, who's the CEO of The Atlantic, had an 345 00:17:59,000 --> 00:18:02,119 Speaker 1: interesting take on the wider implications of this story.
He 346 00:18:02,160 --> 00:18:04,800 Speaker 1: talked about how the fact that Oracle has not built 347 00:18:04,840 --> 00:18:09,200 Speaker 1: its own large language model makes it an attractive, non 348 00:18:09,240 --> 00:18:13,320 Speaker 1: competitive collaborator for a lot of other AI companies. And 349 00:18:13,359 --> 00:18:16,200 Speaker 1: so, in a strange sense, Oracle, as a kind of legacy 350 00:18:16,240 --> 00:18:19,200 Speaker 1: player, has been rewarded for the fact that it wasn't 351 00:18:19,200 --> 00:18:22,080 Speaker 1: at the cutting edge of the AI revolution, and therefore 352 00:18:22,560 --> 00:18:25,760 Speaker 1: is seen as a non-threatening partner by other tech companies. 353 00:18:26,119 --> 00:18:29,439 Speaker 2: There's actually more good news for Oracle investors. As of 354 00:18:29,480 --> 00:18:32,400 Speaker 2: the time of this recording, the US and China are 355 00:18:32,400 --> 00:18:35,320 Speaker 2: expected to announce a deal for TikTok's operations in the 356 00:18:35,400 --> 00:18:38,560 Speaker 2: US to be taken over by Oracle, Silver Lake, and 357 00:18:38,600 --> 00:18:42,080 Speaker 2: Andreessen Horowitz, with US investors holding about an eighty 358 00:18:42,119 --> 00:18:46,120 Speaker 2: percent stake and Chinese investors owning the rest. But there's 359 00:18:46,160 --> 00:18:50,439 Speaker 2: actually another AI boom a-brewing, and this one is 360 00:18:50,480 --> 00:18:55,480 Speaker 2: in the form of podcasts. I read this really crazy 361 00:18:55,560 --> 00:18:58,720 Speaker 2: article in the Hollywood Reporter recently about a new podcast 362 00:18:58,720 --> 00:19:02,160 Speaker 2: company called Inception Point AI, which is a name 363 00:19:02,200 --> 00:19:04,840 Speaker 2: that could only possibly have been created by AI. 364 00:19:05,119 --> 00:19:07,040 Speaker 1: Well, yeah, but I don't think the prompt was very 365 00:19:07,040 --> 00:19:09,560 Speaker 1: good, because it's not much of a name for a podcast company, 366 00:19:09,600 --> 00:19:10,960 Speaker 1: is it, Inception 367 00:19:10,600 --> 00:19:12,880 Speaker 2: Point AI? Really, it's really not. But I guess 368 00:19:12,720 --> 00:19:13,920 Speaker 1: it's a good name for an AI company. 369 00:19:14,000 --> 00:19:15,480 Speaker 2: It is a good name for an AI company, which 370 00:19:15,520 --> 00:19:19,119 Speaker 2: this is. So: this podcast network has over five thousand 371 00:19:19,240 --> 00:19:23,080 Speaker 2: shows, and they put out over three thousand episodes per 372 00:19:23,119 --> 00:19:26,520 Speaker 2: week. In the last two years, they claim to have 373 00:19:26,600 --> 00:19:29,960 Speaker 2: had ten million downloads. Inception Point is saying that their secret 374 00:19:30,320 --> 00:19:32,639 Speaker 2: is that they can make a podcast episode in about 375 00:19:32,680 --> 00:19:36,360 Speaker 2: an hour for one dollar or less, and they use 376 00:19:36,400 --> 00:19:40,119 Speaker 2: AI for everything, choosing episode topics based on Google and 377 00:19:40,160 --> 00:19:43,879 Speaker 2: social media trends and then releasing five different versions with 378 00:19:44,000 --> 00:19:47,120 Speaker 2: different SEO-friendly titles to see which ones perform best.
379 00:19:47,640 --> 00:19:50,600 Speaker 2: The ones that do well get scaled up, and the 380 00:19:50,640 --> 00:19:54,000 Speaker 2: production process is largely automated, using one hundred and eighty 381 00:19:54,000 --> 00:19:57,000 Speaker 2: four custom AI agents to make episodes that are then 382 00:19:57,119 --> 00:20:01,160 Speaker 2: basically just given a quick QC, quality check, and some sound 383 00:20:01,200 --> 00:20:02,560 Speaker 2: design from a human producer. 384 00:20:02,840 --> 00:20:05,760 Speaker 1: I don't doubt that you can make an AI podcast 385 00:20:05,800 --> 00:20:09,200 Speaker 1: episode for a dollar. I do wonder if a dollar 386 00:20:09,240 --> 00:20:11,640 Speaker 1: will be enough to pay somebody to spend an hour 387 00:20:12,119 --> 00:20:14,600 Speaker 1: listening to it. Or perhaps the audience for these AI 388 00:20:14,680 --> 00:20:16,400 Speaker 1: podcasts is other AIs. 389 00:20:16,800 --> 00:20:19,399 Speaker 2: It just might be. You know, I don't think people 390 00:20:19,520 --> 00:20:22,199 Speaker 2: in the actual podcast industry are very thrilled, but 391 00:20:22,200 --> 00:20:25,679 Speaker 2: Inception Point's CEO Jeanine Wright, who actually used to be 392 00:20:26,119 --> 00:20:29,879 Speaker 2: the COO of the podcast company Wondery, insists that this 393 00:20:29,960 --> 00:20:32,639 Speaker 2: is the future. And this is my favorite part. She 394 00:20:32,760 --> 00:20:35,960 Speaker 2: says in the article, quote, we believe that in the 395 00:20:36,040 --> 00:20:39,440 Speaker 2: near future, half of the people on the planet will 396 00:20:39,480 --> 00:20:42,560 Speaker 2: be AI, and we are the company that's bringing those 397 00:20:42,600 --> 00:20:43,720 Speaker 2: people to life. 398 00:20:43,880 --> 00:20:47,040 Speaker 1: What a truly weird quote. When you're starting a new 399 00:20:47,080 --> 00:20:49,520 Speaker 1: business or fundraising, you always need some kind of bold 400 00:20:49,640 --> 00:20:53,560 Speaker 1: or provocative mission statement. But we believe that in the near future 401 00:20:53,640 --> 00:20:55,760 Speaker 1: half the people in the world will be AI is 402 00:20:56,680 --> 00:20:57,679 Speaker 1: definitely provocative. 403 00:20:57,840 --> 00:21:00,760 Speaker 2: I want to introduce you to some of these people 404 00:21:00,800 --> 00:21:03,439 Speaker 2: that will make up half of the population of the world, 405 00:21:03,680 --> 00:21:09,239 Speaker 2: people including Chef Claire Delish. She's a host of, I think, 406 00:21:09,240 --> 00:21:14,800 Speaker 2: I would imagine, a food podcast. Or nature expert Nigel Thistledown, 407 00:21:15,440 --> 00:21:18,760 Speaker 2: who has to be English, or else he's definitely not real. 408 00:21:19,200 --> 00:21:23,480 Speaker 2: The episodes on this network cover everything from weather reports 409 00:21:23,520 --> 00:21:26,520 Speaker 2: to cooking and gardening, and the company is looking into 410 00:21:26,520 --> 00:21:30,720 Speaker 2: turning these AI personalities into chatbots that can then interact 411 00:21:30,760 --> 00:21:33,680 Speaker 2: with listeners, though they claim they want to steer clear 412 00:21:33,760 --> 00:21:37,720 Speaker 2: of developing anything that someone would accidentally have a deep 413 00:21:37,760 --> 00:21:41,560 Speaker 2: relationship with.
I don't know how anyone could resist a 414 00:21:41,600 --> 00:21:44,800 Speaker 2: relationship with Chef Claire Delish, on the basis of the 415 00:21:44,840 --> 00:21:46,200 Speaker 2: content that she's creating. 416 00:21:46,240 --> 00:21:49,760 Speaker 1: But you know, I have to say, this story feels 417 00:21:49,760 --> 00:21:53,440 Speaker 1: like it was hallucinated by AI. I'm also 418 00:21:53,600 --> 00:21:57,119 Speaker 1: somewhat curious about that ten million download number. Did you 419 00:21:57,160 --> 00:21:59,919 Speaker 1: find anything about any kind of listener responses to this 420 00:22:00,200 --> 00:22:01,600 Speaker 1: well of inspiring content? 421 00:22:01,680 --> 00:22:03,679 Speaker 2: Well, other than all of the women who have fallen 422 00:22:03,680 --> 00:22:07,439 Speaker 2: in love with Nigel Thistledown in a parasocial relationship, a 423 00:22:07,480 --> 00:22:10,160 Speaker 2: lot of people say that these episodes are not very good, 424 00:22:10,520 --> 00:22:12,280 Speaker 2: and they've also said that they have that kind of 425 00:22:12,400 --> 00:22:16,119 Speaker 2: uncanny quality that still screams text to speech. But Jeanine 426 00:22:16,160 --> 00:22:19,680 Speaker 2: Wright doesn't really seem to care. In the article, she says, quote, 427 00:22:19,800 --> 00:22:21,960 Speaker 2: I think that people who are still referring to all 428 00:22:21,960 --> 00:22:26,240 Speaker 2: AI generated content as AI slop are probably lazy Luddites, because 429 00:22:26,240 --> 00:22:28,720 Speaker 2: there's a lot of really good stuff 430 00:22:28,760 --> 00:22:31,600 Speaker 2: out there. I think she's talking to us. As for that purported 431 00:22:31,640 --> 00:22:34,480 Speaker 2: ten million download number, a quick glance at the company's 432 00:22:34,480 --> 00:22:37,560 Speaker 2: shows and reviews says they probably don't have a ton 433 00:22:37,600 --> 00:22:41,240 Speaker 2: of listeners. However, with such low overhead, they say they 434 00:22:41,280 --> 00:22:44,840 Speaker 2: only need about twenty downloads per episode to turn a profit. 435 00:22:45,640 --> 00:22:47,200 Speaker 2: Companies, take note. 436 00:22:47,440 --> 00:22:50,640 Speaker 1: Yeah, it's funny, that reference to lazy Luddites. We did 437 00:22:50,640 --> 00:22:52,560 Speaker 1: an episode on Tech Stuff a couple of months ago 438 00:22:52,640 --> 00:22:55,120 Speaker 1: with a guy called Brian Merchant, who wrote a book 439 00:22:55,119 --> 00:22:58,520 Speaker 1: about the real Luddites, who were, I think, anything but lazy. 440 00:22:58,600 --> 00:23:02,400 Speaker 1: They were protesters who sacrificed, in many 441 00:23:02,440 --> 00:23:06,840 Speaker 1: cases, their lives during the Industrial Revolution, to protest against 442 00:23:06,920 --> 00:23:11,199 Speaker 1: how the mechanization of work was making people jobless and 443 00:23:11,280 --> 00:23:14,040 Speaker 1: driving them into poverty. So I think Jeanine Wright may 444 00:23:14,080 --> 00:23:17,520 Speaker 1: be wrong on that one. But on the topic of 445 00:23:18,080 --> 00:23:22,320 Speaker 1: unconvincing AI personalities, have you heard about the most recent 446 00:23:22,480 --> 00:23:24,320 Speaker 1: throwdown in the British Parliament? 447 00:23:24,640 --> 00:23:27,560 Speaker 2: No, I have not. I don't feel as though I've followed British politics. 448 00:23:27,640 --> 00:23:29,959 Speaker 1: Yeah, I don't think that many people do in Britain, let 449 00:23:30,000 --> 00:23:33,639 Speaker 1: alone in the US.
But there's an article in the 450 00:23:33,640 --> 00:23:37,480 Speaker 1: newspaper called The Independent about how several Labour MPs have 451 00:23:37,520 --> 00:23:41,399 Speaker 1: been accused by their Conservative counterparts of using AI to 452 00:23:41,440 --> 00:23:44,680 Speaker 1: write their speeches. And how could they tell? Well, one 453 00:23:44,720 --> 00:23:49,440 Speaker 1: of the people making these accusations is Conservative MP Tom Tugendhat, 454 00:23:49,880 --> 00:23:53,480 Speaker 1: who says that Labour MPs' speeches are increasingly starting to 455 00:23:53,520 --> 00:23:55,680 Speaker 1: have more and more Americanisms. 456 00:23:56,160 --> 00:24:01,240 Speaker 2: Tom Tugendhat works with Claire Delish and Nigel Thistledown. Did 457 00:24:01,400 --> 00:24:04,760 Speaker 2: he mention specifically what Americanisms are being used? 458 00:24:05,200 --> 00:24:09,399 Speaker 1: Yes, the phrase I rise to speak really got Tom 459 00:24:09,520 --> 00:24:12,600 Speaker 1: hot under the hat. He said, quote, I rise to speak, 460 00:24:12,680 --> 00:24:15,720 Speaker 1: I rise to speak, I rise to speak. ChatGPT 461 00:24:15,920 --> 00:24:18,480 Speaker 1: knows you're there. But this is an Americanism that we 462 00:24:18,640 --> 00:24:21,800 Speaker 1: don't use, but they still keep using it, because it makes 463 00:24:21,840 --> 00:24:24,199 Speaker 1: it clear that this place has become absurd. 464 00:24:24,720 --> 00:24:27,920 Speaker 2: As an American, I can say I've never once said 465 00:24:27,960 --> 00:24:31,000 Speaker 2: I rise to speak, maybe save for in my high 466 00:24:31,080 --> 00:24:31,879 Speaker 2: chair growing up. 467 00:24:33,119 --> 00:24:34,960 Speaker 1: Yeah, as a Brit, I've never used it either. But 468 00:24:35,080 --> 00:24:37,080 Speaker 1: I did some Googling, and it turns out I rise 469 00:24:37,119 --> 00:24:40,600 Speaker 1: to speak is frequently used in Congress in the US, 470 00:24:40,760 --> 00:24:44,680 Speaker 1: and it's had a meteoric rise in British Parliament recently. 471 00:24:45,160 --> 00:24:48,119 Speaker 1: Tom is correct to point it out: last year it was 472 00:24:48,240 --> 00:24:50,720 Speaker 1: used two hundred and thirty times in the whole year. 473 00:24:51,200 --> 00:24:54,240 Speaker 1: This year, it's already been used six hundred and thirty 474 00:24:54,240 --> 00:24:57,119 Speaker 1: five times, and I think it's the kind of process 475 00:24:57,160 --> 00:24:59,560 Speaker 1: type thing that people use in the House of Representatives 476 00:24:59,560 --> 00:25:02,840 Speaker 1: which has now infiltrated Parliament. And Tom Tugendhat is 477 00:25:02,920 --> 00:25:06,159 Speaker 1: blaming ChatGPT for this, lazy MPs using it to 478 00:25:06,160 --> 00:25:07,040 Speaker 1: write their speeches. 479 00:25:07,440 --> 00:25:10,840 Speaker 2: Are there any other examples of AI in the MP speeches? 480 00:25:11,240 --> 00:25:16,640 Speaker 1: Yeah, there are words like underscore, streamline, navigate, bustling, which 481 00:25:16,720 --> 00:25:19,119 Speaker 1: are often used by large language models and are now popping 482 00:25:19,200 --> 00:25:22,360 Speaker 1: up more and more in Parliament, as are these commonly 483 00:25:22,520 --> 00:25:26,359 Speaker 1: used AI sentence constructions, like not only X but Y, 484 00:25:27,000 --> 00:25:29,639 Speaker 1: or the phrase not merely.
And of course, I mean, 485 00:25:29,680 --> 00:25:32,720 Speaker 1: there's this interesting sociolinguistic question here, right, which is: 486 00:25:33,160 --> 00:25:36,720 Speaker 1: are all of these MPs actually using ChatGPT to 487 00:25:36,760 --> 00:25:40,640 Speaker 1: write their speeches, or have the phrases of ChatGPT 488 00:25:40,840 --> 00:25:44,360 Speaker 1: become so ubiquitous in human language that real humans are 489 00:25:44,720 --> 00:25:47,280 Speaker 1: unthinkingly now imitating machines? 490 00:25:47,359 --> 00:25:50,160 Speaker 2: That's what I ask myself about your posts on LinkedIn. 491 00:26:02,920 --> 00:26:06,320 Speaker 2: And now to our final segment of the day, Chat 492 00:26:06,320 --> 00:26:09,560 Speaker 2: and Me, where we discuss how people are really using chatbots. 493 00:26:09,920 --> 00:26:12,080 Speaker 2: And remember, we want to hear from you, so please 494 00:26:12,119 --> 00:26:15,080 Speaker 2: send your chat stories to our inbox, Tech Stuff podcast 495 00:26:15,119 --> 00:26:17,560 Speaker 2: at gmail dot com. This week we heard from our 496 00:26:17,600 --> 00:26:21,320 Speaker 2: listener Shalon from Vancouver, Canada, and she told us Chat 497 00:26:21,359 --> 00:26:24,120 Speaker 2: helped her mom with a mysterious medical problem. 498 00:26:24,560 --> 00:26:28,320 Speaker 3: So my mother is seventy four, and she started having 499 00:26:28,520 --> 00:26:31,760 Speaker 3: knee pain about a year ago, and it kind of 500 00:26:31,840 --> 00:26:33,800 Speaker 3: came out of nowhere, but it was pretty severe and 501 00:26:33,880 --> 00:26:36,399 Speaker 3: it was making it really difficult for her to sleep. 502 00:26:36,520 --> 00:26:40,120 Speaker 3: So obviously she talked to her doctor. The advice from 503 00:26:40,160 --> 00:26:41,800 Speaker 3: the doctor was just give it a little time, 504 00:26:41,960 --> 00:26:44,840 Speaker 3: and then eventually she went back to the doctor. 505 00:26:44,840 --> 00:26:46,800 Speaker 3: The doctor said, okay, fine, we're going to give you 506 00:26:46,840 --> 00:26:50,040 Speaker 3: an X-ray. There was absolutely nothing wrong with her knee. 507 00:26:50,400 --> 00:26:52,639 Speaker 1: It's always comforting when you go to the doctor with 508 00:26:52,680 --> 00:26:55,760 Speaker 1: a medical problem like severe pain, and you get a 509 00:26:55,800 --> 00:26:58,160 Speaker 1: test, and the answer comes back: nothing wrong with you. 510 00:26:58,280 --> 00:27:00,880 Speaker 2: It gets worse. A few months later, Shalon and her 511 00:27:00,880 --> 00:27:03,520 Speaker 2: mom finally got the doctor to give them a referral 512 00:27:03,560 --> 00:27:07,119 Speaker 2: for a physiotherapist, but the physiotherapist also couldn't find anything 513 00:27:07,160 --> 00:27:09,879 Speaker 2: wrong with her knee. Another nightmare. Shalon told us that 514 00:27:09,920 --> 00:27:12,119 Speaker 2: at this point, the injury was starting to have a 515 00:27:12,160 --> 00:27:14,520 Speaker 2: big impact on her mom's mental health as well as 516 00:27:14,520 --> 00:27:15,320 Speaker 2: her physical health. 517 00:27:15,760 --> 00:27:20,359 Speaker 3: Her life was very diminished because she couldn't walk upstairs, 518 00:27:20,359 --> 00:27:23,040 Speaker 3: she was having really poor sleeps because the pain would 519 00:27:23,040 --> 00:27:25,320 Speaker 3: wake her up in the night, and overall, she just 520 00:27:25,320 --> 00:27:28,119 Speaker 3: started to feel like she was really declining in her health overall.
521 00:27:28,160 --> 00:27:33,159 Speaker 1: This sounds quite horrific. Does Chat ride to the rescue? 522 00:27:33,320 --> 00:27:36,200 Speaker 2: This is where Chat comes to the rescue. Shalon's mom 523 00:27:36,280 --> 00:27:39,160 Speaker 2: was visiting her, and as a kind of last resort, 524 00:27:39,600 --> 00:27:42,760 Speaker 2: they turned to ChatGPT. They listed all the symptoms 525 00:27:42,760 --> 00:27:45,320 Speaker 2: and told Chat the whole story, what all the different 526 00:27:45,320 --> 00:27:47,840 Speaker 2: doctors had said. She talked to Chat for about forty 527 00:27:47,880 --> 00:27:49,400 Speaker 2: five minutes, and then 528 00:27:49,600 --> 00:27:54,720 Speaker 3: it diagnosed her with a condition and gave her exercises 529 00:27:54,800 --> 00:27:56,720 Speaker 3: she needed to do, what she needed to do to 530 00:27:56,760 --> 00:27:58,880 Speaker 3: sleep at night, how she needed to sleep at night, 531 00:27:59,000 --> 00:28:01,840 Speaker 3: how she needed to walk during the day, et cetera, 532 00:28:01,880 --> 00:28:02,320 Speaker 3: et cetera. 533 00:28:02,960 --> 00:28:04,320 Speaker 1: So did it work? 534 00:28:04,359 --> 00:28:06,959 Speaker 2: Well, Shalon and her mom decided to do an experiment and 535 00:28:07,040 --> 00:28:10,040 Speaker 2: follow Chat's advice religiously for two weeks, just to see 536 00:28:10,040 --> 00:28:13,560 Speaker 2: if it helped. Guess what? Completely healed. And Shalon said 537 00:28:13,560 --> 00:28:15,560 Speaker 2: it's made a huge difference in her mom's life, not 538 00:28:15,640 --> 00:28:18,360 Speaker 2: just because the pain is gone, but because doctors had 539 00:28:18,359 --> 00:28:20,399 Speaker 2: been kind of dismissing it, and because it was also 540 00:28:20,480 --> 00:28:23,240 Speaker 2: one of the first experiences her mom had had with 541 00:28:23,320 --> 00:28:25,399 Speaker 2: the very scary reality of getting older. 542 00:28:25,840 --> 00:28:28,440 Speaker 3: As people age, it's like one thing happens, and then 543 00:28:28,480 --> 00:28:30,600 Speaker 3: the next thing happens, and then the next thing happens, 544 00:28:30,600 --> 00:28:34,000 Speaker 3: and your body is falling apart. And so my mom 545 00:28:34,080 --> 00:28:37,119 Speaker 3: has never had, like, a major issue that she couldn't 546 00:28:37,119 --> 00:28:39,880 Speaker 3: overcome in her health, and so this was the first 547 00:28:39,960 --> 00:28:42,800 Speaker 3: thing that was like, okay, I guess this is the beginning 548 00:28:42,520 --> 00:28:44,440 Speaker 2: of the end, like, it's over for me. 549 00:28:44,920 --> 00:28:46,960 Speaker 3: And so the fact that she has been able to 550 00:28:47,000 --> 00:28:50,280 Speaker 3: overcome this and has her knee back and her health back, 551 00:28:50,800 --> 00:29:08,800 Speaker 3: it's been really life changing for her. 552 00:29:15,440 --> 00:29:17,040 Speaker 2: That's it for this week for Tech Stuff. 553 00:29:17,080 --> 00:29:20,080 Speaker 1: I'm Kara Price. And I'm Oz Woloshyn. This episode was 554 00:29:20,120 --> 00:29:24,120 Speaker 1: produced by Eliza Dennis, Melissa Slaughter, Julian Nutter, and Tyler Hill. 555 00:29:24,360 --> 00:29:27,280 Speaker 1: It was executive produced by me, Kara Price, and Kate 556 00:29:27,320 --> 00:29:32,000 Speaker 1: Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Kyle 557 00:29:32,080 --> 00:29:34,640 Speaker 1: Murdoch mixed this episode and also wrote our theme song.
558 00:29:35,040 --> 00:29:37,560 Speaker 2: Join us next Wednesday for the first episode of the 559 00:29:37,600 --> 00:29:41,040 Speaker 2: new podcast on CRISPR with Walter Isaacson. It's a five 560 00:29:41,080 --> 00:29:43,800 Speaker 2: part series that tells the story of how a revolutionary 561 00:29:43,840 --> 00:29:45,360 Speaker 2: gene editing tool was created. 562 00:29:45,640 --> 00:29:48,720 Speaker 1: And please do rate and review the show on Apple Podcasts, 563 00:29:48,720 --> 00:29:51,800 Speaker 1: on Spotify, wherever you listen, and also write to us 564 00:29:51,800 --> 00:29:54,680 Speaker 1: at Tech Stuff podcast at gmail dot com. We're getting 565 00:29:54,720 --> 00:29:56,800 Speaker 1: so many great submissions for our Chat and Me segment, 566 00:29:56,920 --> 00:29:59,200 Speaker 1: and we'd love to get even more. We'll send you 567 00:29:59,240 --> 00:30:00,360 Speaker 1: a T-shirt if you like