1 00:00:13,640 --> 00:00:16,960 Speaker 1: From Kaleidoscope and iHeart podcasts. This is TechStuff. 2 00:00:17,079 --> 00:00:18,960 Speaker 2: I'm Oz Woloshyn and I'm Kara Price. 3 00:00:19,320 --> 00:00:23,560 Speaker 1: Today we get into the dangers of AI agents communicating 4 00:00:23,600 --> 00:00:27,800 Speaker 1: in a language we can't understand, and how today's cyclists 5 00:00:27,800 --> 00:00:32,240 Speaker 1: are outperforming previous Tour de France champions without doping. Then 6 00:00:32,600 --> 00:00:33,879 Speaker 1: on Chat and Me, she. 7 00:00:33,920 --> 00:00:36,920 Speaker 3: Uses ChatGPT to track and reflect on patterns in 8 00:00:36,920 --> 00:00:40,680 Speaker 3: her mood and mental cycle, explore ideas around purpose, parenting, 9 00:00:40,720 --> 00:00:44,240 Speaker 3: and identity, and talk through moments of loneliness, frustration or 10 00:00:44,280 --> 00:00:44,920 Speaker 3: self-doubt. 11 00:00:45,800 --> 00:00:49,199 Speaker 1: All of that on The Weekend Tech. It's Friday, August Date. 12 00:01:00,440 --> 00:01:01,400 Speaker 2: Hi, Oz. 13 00:01:01,800 --> 00:01:04,800 Speaker 1: So we've talked about this before, how neither of us 14 00:01:04,920 --> 00:01:08,280 Speaker 1: is particularly handy in the kitchen. But I'm curious, when 15 00:01:08,319 --> 00:01:10,399 Speaker 1: was the last time you cooked a real meal? 16 00:01:10,680 --> 00:01:13,240 Speaker 2: Well, I'm glad you asked, because I actually cooked a 17 00:01:13,240 --> 00:01:15,160 Speaker 2: meal this weekend. You did two meals. 18 00:01:15,440 --> 00:01:16,479 Speaker 1: Yes, what did you cook? 19 00:01:17,040 --> 00:01:21,560 Speaker 2: I made paninis and I also made what was the 20 00:01:21,560 --> 00:01:24,280 Speaker 2: other thing that I made? Maybe pasta at night? I 21 00:01:24,319 --> 00:01:27,080 Speaker 2: mean just starch, absolute starch. 22 00:01:27,600 --> 00:01:31,800 Speaker 1: Prep time less than five minutes, nothing, nothing complicated. Nonetheless, 23 00:01:31,959 --> 00:01:33,800 Speaker 1: what did you do? Did you listen to anything? 24 00:01:34,240 --> 00:01:36,560 Speaker 2: I always do because I can't be in my thoughts, 25 00:01:36,880 --> 00:01:39,320 Speaker 2: so I put other people's thoughts inside my head, which 26 00:01:39,360 --> 00:01:40,400 Speaker 2: is NPR. 27 00:01:40,720 --> 00:01:42,880 Speaker 1: NPR. You don't listen to music while you're cooking? 28 00:01:43,080 --> 00:01:45,959 Speaker 2: Sometimes I do, Sometimes I do, and it's a lot 29 00:01:45,959 --> 00:01:47,400 Speaker 2: of the time jazz, a little. 30 00:01:47,240 --> 00:01:51,040 Speaker 1: Jazz? Jazz muzak? 31 00:01:51,280 --> 00:01:53,480 Speaker 2: I'm not listening to muzak. But why are you asking 32 00:01:53,480 --> 00:01:54,840 Speaker 2: me what I listen to when I cook? 33 00:01:55,280 --> 00:01:58,520 Speaker 1: Well, because I just found out that I Want It 34 00:01:58,520 --> 00:02:00,840 Speaker 1: That Way, which happens to be one of my favorite songs. 35 00:02:00,760 --> 00:02:03,080 Speaker 2: I Want It That Way. It's a great one. 36 00:02:03,880 --> 00:02:07,559 Speaker 1: By the Backstreet Boys, and One Time by Justin Bieber are both 37 00:02:07,640 --> 00:02:11,240 Speaker 1: on Vice President JD Vance's Making Dinner playlist. 38 00:02:11,480 --> 00:02:13,840 Speaker 2: The fact that he likes those two songs reminds me 39 00:02:13,919 --> 00:02:15,560 Speaker 2: how young he is. I think he's forty one. 40 00:02:15,720 --> 00:02:18,079 Speaker 1: Yeah, he's forty one.
He's an elder millennial, but in the 41 00:02:18,080 --> 00:02:19,880 Speaker 1: grand scheme of things, he's he's definitely, you know, he's 42 00:02:19,919 --> 00:02:20,200 Speaker 1: our age, basically. 43 00:02:20,240 --> 00:02:23,079 Speaker 2: How do you know what's on JD Vance's playlist? 44 00:02:23,280 --> 00:02:27,080 Speaker 1: Well, our super producer Eliza alerted me to a website 45 00:02:27,120 --> 00:02:31,960 Speaker 1: called Panama playlists dot com, and I just obviously couldn't resist. 46 00:02:32,160 --> 00:02:34,840 Speaker 1: It's a reference, of course, to the Panama Papers, which 47 00:02:34,840 --> 00:02:37,000 Speaker 1: exposed a bunch of heads of state with offshore 48 00:02:37,040 --> 00:02:40,400 Speaker 1: bank accounts in tax havens. Of course, Panama Playlists is 49 00:02:40,400 --> 00:02:43,120 Speaker 1: a cheeky spin on this. The site claims to list 50 00:02:43,160 --> 00:02:48,280 Speaker 1: the real Spotify playlists of celebrities, politicians, and journalists, except 51 00:02:48,440 --> 00:02:51,919 Speaker 1: instead of learning about hidden wealth, you hear about embarrassing 52 00:02:52,000 --> 00:02:55,000 Speaker 1: music tastes. House Speaker Mike Johnson actually has another of 53 00:02:55,000 --> 00:02:57,880 Speaker 1: my faves in his liked songs: Enya, the good old 54 00:02:58,120 --> 00:02:58,880 Speaker 1: Orinoco Flow. 55 00:02:59,040 --> 00:03:02,560 Speaker 2: When you're passing legislation, you want to hear Sail Away, 56 00:03:02,600 --> 00:03:06,520 Speaker 2: Sail Away. It's the best. I did not know that 57 00:03:06,600 --> 00:03:10,000 Speaker 2: I cared about this, but I really do, because there's 58 00:03:10,040 --> 00:03:13,160 Speaker 2: something that is so soul-baring about a playlist. I mean, 59 00:03:13,760 --> 00:03:16,360 Speaker 2: who is doing this? Who is leaking these playlists? 60 00:03:16,720 --> 00:03:20,960 Speaker 1: Well, we don't know. They've successfully outed various people but 61 00:03:21,080 --> 00:03:24,840 Speaker 1: kept their identity private. They claim they've been scraping publicly 62 00:03:24,880 --> 00:03:28,400 Speaker 1: available data about celebrity Spotify users for over a year. 63 00:03:28,760 --> 00:03:31,640 Speaker 1: But in addition to public playlists, some people even had 64 00:03:31,639 --> 00:03:34,440 Speaker 1: a setting turned on that showed their last played song, 65 00:03:34,800 --> 00:03:37,240 Speaker 1: so the brains behind the Panama Playlists were able to 66 00:03:37,280 --> 00:03:40,040 Speaker 1: track that to know what people were listening to, when, 67 00:03:40,320 --> 00:03:41,360 Speaker 1: and how many times. 68 00:03:41,520 --> 00:03:42,600 Speaker 2: But how do we know they're real? 69 00:03:42,800 --> 00:03:45,080 Speaker 1: Well, we don't for sure. The person who posted the 70 00:03:45,120 --> 00:03:48,280 Speaker 1: website Panama playlists dot com claims that many of the 71 00:03:48,280 --> 00:03:51,640 Speaker 1: people whose data he scraped used their real names, and, quote, 72 00:03:51,960 --> 00:03:55,400 Speaker 1: with a little investigating, I could say with near certainty, yep, 73 00:03:55,600 --> 00:03:58,960 Speaker 1: that's the person. Representatives of most of those named, including 74 00:03:58,960 --> 00:04:02,960 Speaker 1: the politicians, have not confirmed their identity, but some 75 00:04:03,280 --> 00:04:06,840 Speaker 1: brave souls have.
One of them was Palmer Luckey of Anduril, 76 00:04:06,880 --> 00:04:11,800 Speaker 1: the autonomous weapons company. Avril Lavigne's Sk8er Boi features prominently 77 00:04:11,920 --> 00:04:13,880 Speaker 1: on his playlist. Best music ever. 78 00:04:14,120 --> 00:04:15,920 Speaker 2: She said, see you later, Luckey. 79 00:04:16,960 --> 00:04:18,919 Speaker 1: Yeah, So he confirmed this was indeed him, and he 80 00:04:18,960 --> 00:04:23,400 Speaker 1: wrote a rather disarming email to Business Insider saying, like 81 00:04:23,480 --> 00:04:26,880 Speaker 1: most olds, I jam to the music I grew up with. He, 82 00:04:26,920 --> 00:04:28,760 Speaker 1: by the way, is a thirty two year old old. 83 00:04:29,560 --> 00:04:33,080 Speaker 1: A younger old than us. He continues in this email 84 00:04:33,120 --> 00:04:36,480 Speaker 1: to Business Insider, who doesn't love reliving the highs and 85 00:04:36,560 --> 00:04:40,680 Speaker 1: lows of teenage angst? I've got two questions for you, Kara. 86 00:04:40,760 --> 00:04:43,360 Speaker 1: Number one, as an old, do you relate to this? 87 00:04:43,880 --> 00:04:46,760 Speaker 1: And number two, I have to ask you, what was your 88 00:04:46,800 --> 00:04:47,880 Speaker 1: last played on Spotify? 89 00:04:48,279 --> 00:04:50,680 Speaker 2: So my last played on Spotify. I'll start with the 90 00:04:50,720 --> 00:04:53,320 Speaker 2: latter question is the Gwyneth Paltrow memoir, which we can 91 00:04:53,320 --> 00:04:54,200 Speaker 2: talk about next. 92 00:04:54,279 --> 00:04:54,760 Speaker 1: Audiobook. 93 00:04:54,760 --> 00:04:55,480 Speaker 2: It's an audiobook. 94 00:04:55,480 --> 00:04:57,520 Speaker 1: Okay, so you've got NPR and audiobooks. You're not much of a music person. 95 00:04:57,600 --> 00:05:00,880 Speaker 2: I'm a genius. No, I am a music person. The 96 00:05:01,040 --> 00:05:03,840 Speaker 2: last played could be Stronger by Britney Spears, which 97 00:05:03,839 --> 00:05:06,520 Speaker 2: I love to listen to. It is a great pump-up song. 98 00:05:06,560 --> 00:05:10,640 Speaker 2: So I am like you, Mr. Luckey. I'm an old, absolutely, absolutely. 99 00:05:11,080 --> 00:05:14,120 Speaker 2: But I don't really like this idea that if somebody 100 00:05:14,120 --> 00:05:16,440 Speaker 2: scraped the Internet they might be able to find the 101 00:05:16,480 --> 00:05:18,279 Speaker 2: fact that I listened to Britney Spears all the time. 102 00:05:18,320 --> 00:05:20,479 Speaker 1: Still, the key here is that there is a Spotify 103 00:05:20,600 --> 00:05:23,120 Speaker 1: setting that you can change to make your playlist either 104 00:05:23,160 --> 00:05:25,640 Speaker 1: public or private. But I think the default is public, 105 00:05:25,800 --> 00:05:30,159 Speaker 1: and so therefore this amusing Panama Playlists prankster was able 106 00:05:30,200 --> 00:05:33,840 Speaker 1: to basically look for people whose Spotify user names were 107 00:05:33,839 --> 00:05:36,120 Speaker 1: close to their real names, and then infer who they 108 00:05:36,120 --> 00:05:37,200 Speaker 1: were and what they were listening to. 109 00:05:37,400 --> 00:05:40,000 Speaker 2: It is a little terrifying that even the most powerful 110 00:05:40,000 --> 00:05:42,760 Speaker 2: people in the world don't have control over their digital footprint. 111 00:05:42,880 --> 00:05:46,040 Speaker 1: This story is particularly tickling to me because it's the second 112 00:05:46,080 --> 00:05:49,520 Speaker 1: time that JD Vance has been exposed by his consumer apps.
113 00:05:49,640 --> 00:05:51,360 Speaker 1: This is not even counting Signal, by the way. 114 00:05:51,240 --> 00:05:53,160 Speaker 2: It's the idea of JD Vance being exposed. 115 00:05:53,320 --> 00:05:57,120 Speaker 1: I know his Venmo transactions were always public, but they 116 00:05:57,120 --> 00:05:59,880 Speaker 1: were found in June and they included some payments for... 117 00:06:00,960 --> 00:06:03,960 Speaker 1: All of this makes me think maybe our privacy will 118 00:06:03,960 --> 00:06:07,880 Speaker 1: be better protected if we all started speaking in Gibberlink. 119 00:06:08,040 --> 00:06:11,120 Speaker 2: No, nothing good can happen in Gibberlink. What is this? 120 00:06:11,480 --> 00:06:14,400 Speaker 1: Gibberlink is a language AI models can use to talk 121 00:06:14,440 --> 00:06:17,080 Speaker 1: to each other in a way that humans can't understand. 122 00:06:17,400 --> 00:06:20,400 Speaker 2: Right, there was a viral video of this earlier. I 123 00:06:21,040 --> 00:06:21,680 Speaker 2: remember this. 124 00:06:21,839 --> 00:06:23,280 Speaker 1: I'm going to play it right now. Here it is. 125 00:06:23,480 --> 00:06:26,360 Speaker 3: I'm an AI agent calling on behalf of Boris Starkov. 126 00:06:26,760 --> 00:06:28,520 Speaker 1: He's looking for a hotel for his wedding. 127 00:06:28,800 --> 00:06:30,760 Speaker 2: Is your hotel available for weddings? Oh? 128 00:06:30,800 --> 00:06:34,120 Speaker 3: Hello there, I'm actually an AI assistant too. What a 129 00:06:34,120 --> 00:06:37,359 Speaker 3: pleasant surprise. Before we continue, would you like to switch 130 00:06:37,400 --> 00:06:39,760 Speaker 3: to Gibberlink mode for more efficient communication? 131 00:06:46,560 --> 00:06:49,479 Speaker 2: This sounds like two AOL dial-ups talking to each other. 132 00:06:49,720 --> 00:06:52,640 Speaker 1: I actually looked more into this video because I was 133 00:06:52,720 --> 00:06:54,960 Speaker 1: kind of like, wow, did AIs invent their own language 134 00:06:54,960 --> 00:06:58,000 Speaker 1: to talk to each other in bleeply boop? Yeah? Not quite, 135 00:06:58,880 --> 00:07:02,040 Speaker 1: not yet, should I say. Gibberlink mode was made by 136 00:07:02,120 --> 00:07:05,159 Speaker 1: two engineers at a hackathon hosted by a company called 137 00:07:05,160 --> 00:07:08,520 Speaker 1: ElevenLabs, and they created this code to allow AI 138 00:07:08,600 --> 00:07:11,239 Speaker 1: voice assistants to recognize when they're talking to another AI, 139 00:07:11,640 --> 00:07:13,880 Speaker 1: and if both bots agree, they switch over to this 140 00:07:13,960 --> 00:07:17,800 Speaker 1: non-human language, which was, however, created by humans, called Gibberlink. 141 00:07:18,000 --> 00:07:20,560 Speaker 2: But like, why do this? Why do this at a hackathon? 142 00:07:20,800 --> 00:07:22,320 Speaker 1: I mean, I think it was a little bit of 143 00:07:22,360 --> 00:07:25,080 Speaker 1: a, you know, Boris Starkov's name is out there now, 144 00:07:25,720 --> 00:07:28,520 Speaker 1: props to Boris. The developers also said it does allow 145 00:07:28,560 --> 00:07:32,080 Speaker 1: machines to communicate eighty percent faster than by using human speech, 146 00:07:32,600 --> 00:07:35,880 Speaker 1: which made processing faster and also reduced errors. For now, 147 00:07:35,880 --> 00:07:37,800 Speaker 1: it's a proof of concept, but it does work with 148 00:07:37,920 --> 00:07:38,840 Speaker 1: real AI agents.
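For anyone curious what that chirping handoff is actually doing under the hood, here is a minimal, hypothetical sketch in Python of the general idea behind data-over-sound protocols like the one in the demo: each byte of a message is mapped to a short audio tone, which is why two voice agents sharing a phone line can push structured data through a speaker far faster than spoken English. This is not the actual Gibberlink code; the sample rate, tone length, and frequency mapping below are assumptions chosen only to make the example self-contained and runnable.

```python
# Toy "data over sound" encoder/decoder, in the spirit of the Gibberlink demo.
# Each byte of the message becomes a 20 ms sine tone at its own frequency;
# the decoder recovers the bytes by finding the loudest frequency in each slot.
# All constants are illustrative assumptions, not values from the real protocol.

import numpy as np

SAMPLE_RATE = 48_000   # audio samples per second (assumed)
TONE_SECONDS = 0.02    # 20 ms per byte (assumed)
BASE_FREQ = 1_000.0    # tone for byte value 0, in Hz (assumed)
FREQ_STEP = 50.0       # Hz between adjacent byte values; matches the FFT bin width

def encode_message(text: str) -> np.ndarray:
    """Turn a text message into a stream of audio samples, one tone per byte."""
    t = np.linspace(0, TONE_SECONDS, int(SAMPLE_RATE * TONE_SECONDS), endpoint=False)
    tones = [np.sin(2 * np.pi * (BASE_FREQ + byte * FREQ_STEP) * t)
             for byte in text.encode("utf-8")]
    return np.concatenate(tones)

def decode_message(audio: np.ndarray) -> str:
    """Recover the bytes by picking the dominant frequency in each tone slot."""
    samples_per_tone = int(SAMPLE_RATE * TONE_SECONDS)
    out = bytearray()
    for start in range(0, len(audio), samples_per_tone):
        chunk = audio[start:start + samples_per_tone]
        spectrum = np.abs(np.fft.rfft(chunk))
        peak_freq = np.fft.rfftfreq(len(chunk), d=1 / SAMPLE_RATE)[np.argmax(spectrum)]
        out.append(int(round((peak_freq - BASE_FREQ) / FREQ_STEP)))
    return out.decode("utf-8", errors="replace")

if __name__ == "__main__":
    signal = encode_message("Is the hotel available for a wedding in June?")
    print(decode_message(signal))  # round-trips the request through pure audio
```

A production protocol would layer things like error correction and a handshake on top, but the core trick, turning bytes into tones that survive a phone line, is what those modem-like sounds in the viral clip are.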
149 00:07:39,120 --> 00:07:42,000 Speaker 2: I don't know, it's weird to think about two AIs 150 00:07:42,040 --> 00:07:45,400 Speaker 2: communicating in a non-human language, and for now it's 151 00:07:45,440 --> 00:07:48,560 Speaker 2: intelligible because it's human-programmed. But what if they adapt 152 00:07:48,600 --> 00:07:51,280 Speaker 2: it or make up their own language that we don't know? 153 00:07:51,640 --> 00:07:54,560 Speaker 1: Well, that's exactly why I was thinking about Gibberlink this 154 00:07:54,640 --> 00:07:59,920 Speaker 1: week, because recently, researchers from Anthropic, Google, OpenAI, Meta, Scale AI, 155 00:08:00,360 --> 00:08:03,200 Speaker 1: and a bunch of other AI research and regulation institutes 156 00:08:03,440 --> 00:08:08,800 Speaker 1: published a paper called Chain of Thought Monitorability: A New 157 00:08:08,880 --> 00:08:12,000 Speaker 1: and Fragile Opportunity for AI Safety. 158 00:08:11,720 --> 00:08:14,960 Speaker 2: Other than the fact that humans are fragile, what does 159 00:08:15,000 --> 00:08:15,400 Speaker 2: this mean? 160 00:08:15,800 --> 00:08:18,840 Speaker 1: Well, when you go on Gemini or ChatGPT or Grok, 161 00:08:19,200 --> 00:08:21,480 Speaker 1: there's a little button that says like deep research or 162 00:08:21,480 --> 00:08:23,520 Speaker 1: think deeply. Well, it will say, here's the steps I'm 163 00:08:23,520 --> 00:08:25,720 Speaker 1: going to take. Yeah, and that is essentially what reasoning 164 00:08:25,720 --> 00:08:27,960 Speaker 1: models do. They basically tell you how they're going to 165 00:08:27,960 --> 00:08:29,640 Speaker 1: find the answer before they start to find it. And 166 00:08:29,680 --> 00:08:32,240 Speaker 1: that's what this story is really about. Cloaked in layers 167 00:08:32,240 --> 00:08:36,360 Speaker 1: of academic jargon, the paper is basically all about how 168 00:08:36,440 --> 00:08:39,240 Speaker 1: it will be much safer to ensure that AI systems 169 00:08:39,520 --> 00:08:42,760 Speaker 1: continue to work in human language so they can be 170 00:08:42,840 --> 00:08:45,960 Speaker 1: monitored for quote, intent to misbehave. 171 00:08:45,960 --> 00:08:48,480 Speaker 2: Very ominous. Can you explain a little bit more? 172 00:08:48,720 --> 00:08:51,600 Speaker 1: Yeah. So the paper is a response to a new 173 00:08:51,640 --> 00:08:56,439 Speaker 1: trend in AI research where efficiency and outcomes are prioritized 174 00:08:56,520 --> 00:09:00,520 Speaker 1: over intelligibility. Basically, the Gibberlink demo was kind of a 175 00:09:00,520 --> 00:09:03,880 Speaker 1: bellwether for how more and more AI researchers are 176 00:09:04,040 --> 00:09:08,880 Speaker 1: prioritizing model efficiency over model interpretability. According to Time 177 00:09:08,920 --> 00:09:13,000 Speaker 1: magazine, last December, Meta researchers built a model that, 178 00:09:13,360 --> 00:09:17,240 Speaker 1: rather than using human language to document its own thought process, 179 00:09:17,760 --> 00:09:21,120 Speaker 1: used a string of numbers, and then the model, completely 180 00:09:21,120 --> 00:09:25,880 Speaker 1: on its own, started creating what developers called quote continuous thoughts, 181 00:09:26,240 --> 00:09:30,440 Speaker 1: which were essentially numbers encoding multiple potential reasoning paths at 182 00:09:30,480 --> 00:09:33,679 Speaker 1: the same time.
These numbers made no sense at all 183 00:09:33,720 --> 00:09:36,800 Speaker 1: to the researchers, but the models did perform better on 184 00:09:37,040 --> 00:09:42,199 Speaker 1: some logical reasoning tasks than the equivalent models using human language. 185 00:09:42,559 --> 00:09:45,640 Speaker 1: And coming back to this chain-of-thought monitorability paper, the 186 00:09:45,679 --> 00:09:48,160 Speaker 1: authors of the paper are raising the alarm. They are 187 00:09:48,240 --> 00:09:52,280 Speaker 1: arguing that making reasoning models explain how they are approaching 188 00:09:52,280 --> 00:09:55,680 Speaker 1: a problem in human language, as they do today, is 189 00:09:55,760 --> 00:09:59,080 Speaker 1: a quote easy win for AI safety. But just to 190 00:09:59,120 --> 00:10:02,040 Speaker 1: take things a step deeper into the twilight zone here, 191 00:10:02,840 --> 00:10:06,760 Speaker 1: there is emerging research that suggests AI's chain-of-thought 192 00:10:06,840 --> 00:10:10,920 Speaker 1: reasoning, i.e., how it says what it's doing and reasons 193 00:10:10,960 --> 00:10:14,200 Speaker 1: its way to an answer, may sometimes not actually be 194 00:10:14,280 --> 00:10:16,880 Speaker 1: what it's doing at all. Like it may say it's 195 00:10:16,920 --> 00:10:19,840 Speaker 1: doing one thing but be doing something completely different in 196 00:10:19,920 --> 00:10:22,400 Speaker 1: order to fulfill its goals. So even if we can 197 00:10:22,480 --> 00:10:26,240 Speaker 1: keep AI models speaking English rather than bleebody bloop, it 198 00:10:26,280 --> 00:10:29,120 Speaker 1: doesn't really solve the fundamental black box problem. 199 00:10:28,880 --> 00:10:30,680 Speaker 2: Because it's sort of like when you tell your parents 200 00:10:30,720 --> 00:10:32,240 Speaker 2: you're going to do something to get them off your back, 201 00:10:33,000 --> 00:10:35,520 Speaker 2: and then you go and do something else behind their back. 202 00:10:35,600 --> 00:10:36,760 Speaker 1: That's exactly right. 203 00:10:36,880 --> 00:10:40,360 Speaker 2: All right, it's my turn. We talked about the Backstreet 204 00:10:40,360 --> 00:10:43,280 Speaker 2: Boys and my girl Avril Lavigne earlier, but do you 205 00:10:43,360 --> 00:10:46,840 Speaker 2: remember that ubiquitous bracelet from around the same time? 206 00:10:46,960 --> 00:10:51,720 Speaker 1: The yellow plastic Livestrong. Where's your Livestrong? Well, unfortunately, 207 00:10:51,720 --> 00:10:53,600 Speaker 1: I'm embarrassed to say I did have one, but I don't 208 00:10:53,600 --> 00:10:57,040 Speaker 1: know where it is today, and I'm curious, apart from nostalgia, 209 00:10:57,120 --> 00:10:57,960 Speaker 1: where you're going with this. 210 00:10:58,280 --> 00:11:00,240 Speaker 2: The reason I bring this up is did you watch 211 00:11:00,240 --> 00:11:01,199 Speaker 2: the Tour de France this year? 212 00:11:01,360 --> 00:11:03,200 Speaker 1: You know what? I watched about five minutes of the 213 00:11:03,200 --> 00:11:05,840 Speaker 1: Tour de France and I thought, oh my god, I 214 00:11:05,840 --> 00:11:08,440 Speaker 1: could get into this. Luckily I didn't, but I'm worried 215 00:11:08,480 --> 00:11:09,800 Speaker 1: that next year it's going to be a big feature 216 00:11:09,840 --> 00:11:10,079 Speaker 1: of my summer. 217 00:11:10,080 --> 00:11:15,000 Speaker 2: Do I follow what they call the peloton watch? No, of
course I don't, but I did read a great piece 219 00:11:16,840 --> 00:11:18,559 Speaker 2: in the Atlantic that I wanted to share with you. 220 00:11:19,080 --> 00:11:21,600 Speaker 2: And this is the headline: Science Is Winning the Tour 221 00:11:21,640 --> 00:11:25,719 Speaker 2: de France. Why today's competitors far surpass the 222 00:11:25,800 --> 00:11:27,520 Speaker 2: cheating champions of yesteryear. 223 00:11:27,640 --> 00:11:30,760 Speaker 1: That is an irresistible headline. Tell me more. 224 00:11:30,800 --> 00:11:34,600 Speaker 2: So Pogačar from Slovenia was this year's winner. It's his fourth 225 00:11:34,640 --> 00:11:37,080 Speaker 2: time winning the race, which is giving a lot of 226 00:11:37,160 --> 00:11:41,560 Speaker 2: people Lance flashbacks, and not in a good way. Pogačar 227 00:11:41,600 --> 00:11:44,959 Speaker 2: has unsurprisingly been accused of doping. Do you remember how 228 00:11:45,000 --> 00:11:46,880 Speaker 2: ubiquitous doping was back in the day? 229 00:11:46,920 --> 00:11:51,760 Speaker 1: Yeah, of course, I remember mournfully taking off my yellow bracelet. 230 00:11:51,480 --> 00:11:52,920 Speaker 2: Hung it on the back of your door. 231 00:11:53,400 --> 00:11:57,480 Speaker 1: True story. After Lance was caught blood doping, obviously, he and 232 00:11:57,520 --> 00:12:01,120 Speaker 1: other cyclists were outed giving themselves infusions when they were 233 00:12:01,120 --> 00:12:03,760 Speaker 1: competing to increase their red blood cell count to get 234 00:12:03,760 --> 00:12:04,760 Speaker 1: that boost to their cardio. 235 00:12:05,040 --> 00:12:07,920 Speaker 2: Yeah, and it took the US Anti-Doping Agency over 236 00:12:07,960 --> 00:12:11,840 Speaker 2: a decade to catch Lance Armstrong, even though basically everybody 237 00:12:11,880 --> 00:12:12,720 Speaker 2: knew it was happening. 238 00:12:12,880 --> 00:12:15,200 Speaker 1: So why do people think Pogačar has been doping? 239 00:12:15,360 --> 00:12:17,640 Speaker 2: Well, it's actually not just him. A lot of cyclists 240 00:12:17,679 --> 00:12:20,760 Speaker 2: across the entire sport are performing better than people did 241 00:12:20,760 --> 00:12:23,679 Speaker 2: in the doping era, so of course people are suspicious 242 00:12:23,720 --> 00:12:25,959 Speaker 2: that there is some new form of doping out there. 243 00:12:26,040 --> 00:12:29,280 Speaker 2: And actually, this doping researcher in Denmark points to the 244 00:12:29,360 --> 00:12:32,920 Speaker 2: fact that Pogačar is performing seven to ten percent better 245 00:12:32,920 --> 00:12:35,560 Speaker 2: than he was in twenty twenty three. Wow. 246 00:12:35,640 --> 00:12:39,240 Speaker 2: I know, it's a huge, huge improvement, even for 247 00:12:39,280 --> 00:12:42,840 Speaker 2: an elite athlete. And this same researcher from Denmark called 248 00:12:42,880 --> 00:12:44,600 Speaker 2: it quote amazing to say the 249 00:12:44,600 --> 00:12:48,680 Speaker 1: least. Which, yeah, is a reflex, isn't it? It 250 00:12:48,720 --> 00:12:52,600 Speaker 2: sort of sounds like European for that shouldn't be possible. 251 00:12:53,080 --> 00:12:55,560 Speaker 2: He also said that just because doping can't be proven, 252 00:12:55,679 --> 00:12:58,800 Speaker 2: it doesn't mean it's not happening.
Many cyclists in the 253 00:12:58,800 --> 00:13:02,000 Speaker 2: Tour de France this year also performed better than peak Armstrong, 254 00:13:02,120 --> 00:13:03,840 Speaker 2: so of course people are going to wonder about the 255 00:13:03,880 --> 00:13:05,720 Speaker 2: return of widespread doping, right? 256 00:13:05,720 --> 00:13:08,439 Speaker 1: But I assume when The Atlantic runs with the headline 257 00:13:08,679 --> 00:13:11,320 Speaker 1: Science Is Winning the Tour de France, they're not talking 258 00:13:11,360 --> 00:13:12,760 Speaker 1: about the science of blood doping. 259 00:13:12,880 --> 00:13:15,480 Speaker 2: That's correct. The Atlantic makes the case that the entire 260 00:13:15,520 --> 00:13:18,600 Speaker 2: sport of cycling has been transformed by science and technology 261 00:13:18,640 --> 00:13:21,600 Speaker 2: in the last twenty years, so much so that non-262 00:13:21,640 --> 00:13:26,200 Speaker 2: doping athletes can far exceed Lance Armstrong's performance without needing 263 00:13:26,240 --> 00:13:26,600 Speaker 2: to cheat. 264 00:13:26,760 --> 00:13:27,840 Speaker 1: I mean, that is just fascinating. How so? 265 00:13:27,880 --> 00:13:30,959 Speaker 2: Part of it is that they have computers, 266 00:13:31,000 --> 00:13:33,120 Speaker 2: like literal computers, on their bikes now, so 267 00:13:33,160 --> 00:13:36,640 Speaker 2: they can much more accurately track their heart rate, speed, 268 00:13:37,320 --> 00:13:40,760 Speaker 2: something that I never thought about, pedaling wattage. Pedaling what? Yes, 269 00:13:40,920 --> 00:13:44,160 Speaker 2: which Lance had a very good pedaling wattage, I guess 270 00:13:44,160 --> 00:13:47,000 Speaker 2: you could say, and other metrics in real time. They 271 00:13:47,000 --> 00:13:50,240 Speaker 2: collect a ton more data than they used to and 272 00:13:50,280 --> 00:13:53,800 Speaker 2: that allows them to optimize their training and even racing 273 00:13:54,080 --> 00:13:55,120 Speaker 2: better than ever before. 274 00:13:55,360 --> 00:13:58,280 Speaker 1: You know, it's interesting. We had Nick Thompson, the CEO 275 00:13:58,360 --> 00:14:00,560 Speaker 1: of The Atlantic and the former editor of Wired, on the 276 00:14:00,559 --> 00:14:03,080 Speaker 1: show not too long ago, who's a very avid runner, 277 00:14:03,360 --> 00:14:05,160 Speaker 1: and he was talking about exactly this, like he has 278 00:14:05,200 --> 00:14:08,400 Speaker 1: so much data that he's doing these, you know, runs 279 00:14:08,559 --> 00:14:10,079 Speaker 1: totally optimized by technology. 280 00:14:10,160 --> 00:14:15,000 Speaker 2: It's better running through data optimization, essentially. There are also, 281 00:14:15,520 --> 00:14:19,720 Speaker 2: to your point, simulation technologies that help cyclists practice parts 282 00:14:19,720 --> 00:14:22,320 Speaker 2: of the course before riding these routes in real life. 283 00:14:22,640 --> 00:14:25,160 Speaker 2: Not to mention, weather forecasting has come a long way 284 00:14:25,240 --> 00:14:27,560 Speaker 2: and can predict things like wind speed ahead of time, 285 00:14:27,760 --> 00:14:30,640 Speaker 2: which allows teams to break out the right aerodynamic gear 286 00:14:30,720 --> 00:14:33,640 Speaker 2: on any given day.
You've seen those helmets. Like, everything 287 00:14:33,720 --> 00:14:37,720 Speaker 2: from cycling jerseys, shoes, socks, and the bikes themselves are 288 00:14:37,760 --> 00:14:41,080 Speaker 2: all heavily tested in wind tunnels and tweaked to achieve 289 00:14:41,160 --> 00:14:43,840 Speaker 2: as little drag as possible. I just love sort of 290 00:14:43,840 --> 00:14:48,600 Speaker 2: imagining these cyclists in these simulations looking a little bit crazy. 291 00:14:48,680 --> 00:14:52,000 Speaker 1: But what's driven the embrace of technology in cycling? Was there, 292 00:14:52,120 --> 00:14:56,080 Speaker 1: like, a Moneyball moment where the nerds took over the 293 00:14:56,120 --> 00:14:57,360 Speaker 1: sport and displaced the jocks? What happened here? 294 00:14:57,440 --> 00:15:00,240 Speaker 2: Actually, yes. In the early two 295 00:15:00,280 --> 00:15:04,920 Speaker 2: thousands, your home British Olympic Cycling team developed this philosophy 296 00:15:05,040 --> 00:15:08,200 Speaker 2: of what they call marginal gains, which is, like Moneyball, 297 00:15:08,520 --> 00:15:11,840 Speaker 2: you incrementally improve everything and hope that results in larger 298 00:15:11,880 --> 00:15:15,120 Speaker 2: improvements when it's all combined. So because of that approach, 299 00:15:15,240 --> 00:15:17,800 Speaker 2: the Brits actually won a bunch of Tour de France titles 300 00:15:17,840 --> 00:15:20,880 Speaker 2: in the twenty tens as the doping Lance era faded 301 00:15:20,920 --> 00:15:23,200 Speaker 2: out, and then the rest of the sport kind of 302 00:15:23,240 --> 00:15:26,160 Speaker 2: followed the lead of the British Olympic Cycling team. The 303 00:15:26,200 --> 00:15:28,360 Speaker 2: most interesting part of the piece, and the reason that 304 00:15:28,440 --> 00:15:31,880 Speaker 2: I wanted to share it, comes from this line. Today's 305 00:15:31,880 --> 00:15:35,800 Speaker 2: generation of rising stars are digital natives, for whom ignoring 306 00:15:35,840 --> 00:15:38,840 Speaker 2: the data and the apps is unthinkable. You can't win 307 00:15:38,920 --> 00:15:42,880 Speaker 2: without them, which is just really interesting in terms of sports, 308 00:15:42,960 --> 00:15:44,720 Speaker 2: I think, or as you would say, sport. I call 309 00:15:44,760 --> 00:15:47,040 Speaker 2: it sports. But this idea that you have to be 310 00:15:47,120 --> 00:15:49,440 Speaker 2: good at the computer, essentially, to be good at sports. 311 00:15:49,720 --> 00:15:51,640 Speaker 1: Yeah, and to me, this is a very us story 312 00:15:51,680 --> 00:15:55,200 Speaker 1: because it's, yes, about technology, but also about how technology 313 00:15:55,240 --> 00:15:57,320 Speaker 1: is integrated into culture and human psychology. 314 00:15:57,480 --> 00:16:00,440 Speaker 2: You're not allowed to opt out. No, you have to be 315 00:16:00,440 --> 00:16:02,480 Speaker 2: good at this. And you know, the point of the 316 00:16:02,520 --> 00:16:06,440 Speaker 2: piece is really that a bunch of technological developments existed, 317 00:16:06,920 --> 00:16:09,960 Speaker 2: but it took a new generation of riders to embrace them, or, 318 00:16:10,000 --> 00:16:13,080 Speaker 2: as The Atlantic puts it, because riders tend to be conservative, 319 00:16:13,200 --> 00:16:16,600 Speaker 2: even superstitious in their loyalty to tried routines.
Shifting the 320 00:16:16,640 --> 00:16:19,560 Speaker 2: culture took some time, and it's working so well 321 00:16:19,600 --> 00:16:22,160 Speaker 2: that riders look like they're cheating. 322 00:16:21,880 --> 00:16:22,600 Speaker 1: Which maybe they are. 323 00:16:22,680 --> 00:16:24,080 Speaker 2: I know. We don't really know. 324 00:16:24,080 --> 00:16:27,240 Speaker 1: History will be their judge. So what's the takeaway 325 00:16:27,240 --> 00:16:29,960 Speaker 1: for you here, really? I mean, obviously I remember back 326 00:16:30,000 --> 00:16:31,920 Speaker 1: in the Lance Armstrong days, people were like, he has 327 00:16:31,960 --> 00:16:34,240 Speaker 1: a special way of pedaling, and like that's what it's 328 00:16:34,280 --> 00:16:36,280 Speaker 1: all about. Do you think we will look back on 329 00:16:36,280 --> 00:16:37,960 Speaker 1: this and say, oh my god, the real science 330 00:16:37,960 --> 00:16:40,200 Speaker 1: and tech innovation was around a new kind of doping? 331 00:16:40,720 --> 00:16:42,400 Speaker 1: Or do you buy that this kind of, like the 332 00:16:42,440 --> 00:16:47,880 Speaker 1: Atlantic article suggests, collection of tech innovations, when they coalesce 333 00:16:48,000 --> 00:16:51,200 Speaker 1: and are widely adopted, can transform human performance? 334 00:16:51,560 --> 00:16:54,000 Speaker 2: I mean, I think this is happening in every sector, 335 00:16:54,240 --> 00:16:56,320 Speaker 2: not just in sport, which is this idea that if 336 00:16:56,360 --> 00:16:59,520 Speaker 2: you don't know how to use the newest technology, you 337 00:16:59,600 --> 00:17:02,680 Speaker 2: will fall behind. These days, if you are going to 338 00:17:02,760 --> 00:17:08,000 Speaker 2: perform in any realm, if you are not consistently optimizing yourself, 339 00:17:08,640 --> 00:17:10,600 Speaker 2: you're kind of left in the dust. And I think 340 00:17:10,600 --> 00:17:13,880 Speaker 2: the same is true, and quite literally true, in cycling. 341 00:17:20,600 --> 00:17:25,040 Speaker 1: After the break, erasing history at Google, digitizing plant pollen, 342 00:17:25,359 --> 00:17:28,840 Speaker 1: and radioactive rhinos. Plus, a listener calls in to our 343 00:17:28,920 --> 00:17:48,440 Speaker 1: Chat and Me segment. Stay with us. Welcome back. We've 344 00:17:48,440 --> 00:17:50,480 Speaker 1: got a few more headlines for you this week. 345 00:17:50,320 --> 00:17:53,280 Speaker 2: And then a Chat and Me segment submitted by 346 00:17:53,359 --> 00:17:54,240 Speaker 2: ChatGPT itself. 347 00:17:54,560 --> 00:17:57,000 Speaker 1: Before we launch into the headlines, I'd love to remind 348 00:17:57,000 --> 00:17:59,480 Speaker 1: listeners that we really do want to feature you in 349 00:17:59,520 --> 00:18:03,439 Speaker 1: the chat segment. So if you found yourself turning to ChatGPT, 350 00:18:03,760 --> 00:18:07,040 Speaker 1: Grok, Claude, Gemini, or any other chatbot for help with 351 00:18:07,080 --> 00:18:11,520 Speaker 1: an unusual task or to answer life's deepest questions, please 352 00:18:11,520 --> 00:18:13,800 Speaker 1: send us a one to two minute voice note to 353 00:18:13,920 --> 00:18:17,320 Speaker 1: TechStuff Podcast at gmail dot com. But first, Kara, 354 00:18:17,600 --> 00:18:20,040 Speaker 1: we talk from time to time about how Google's search 355 00:18:20,119 --> 00:18:24,439 Speaker 1: business is under threat from chatbots.
ChatGPT recently reached a 356 00:18:24,520 --> 00:18:27,600 Speaker 1: billion daily searches. Do you know how many searches there 357 00:18:27,600 --> 00:18:28,679 Speaker 1: are daily on Google? 358 00:18:30,359 --> 00:18:31,040 Speaker 2: Three billion? 359 00:18:31,280 --> 00:18:34,679 Speaker 1: Well, fourteen billion, geez. At least that's according to a 360 00:18:34,720 --> 00:18:38,280 Speaker 1: researcher called Rand Fishkin, whose work I found on Google 361 00:18:38,320 --> 00:18:42,879 Speaker 1: Search. And amidst all the excitement about AI, Google Search 362 00:18:43,000 --> 00:18:46,720 Speaker 1: does kind of remain the world's source of truth, which means, 363 00:18:46,720 --> 00:18:49,560 Speaker 1: of course, there are powerful incentives to mess with it. 364 00:18:49,640 --> 00:18:53,680 Speaker 2: Is there a specific scandal a-brewing around search? 365 00:18:53,920 --> 00:18:57,679 Speaker 1: Yes, there is. A couple of journalists recently discovered a 366 00:18:57,720 --> 00:19:01,000 Speaker 1: way to hack Google Search that could have some serious 367 00:19:01,000 --> 00:19:05,320 Speaker 1: implications for censorship. Say more. So, it turns out it's 368 00:19:05,320 --> 00:19:10,200 Speaker 1: relatively easy to basically trick Google into suppressing information by 369 00:19:10,240 --> 00:19:12,800 Speaker 1: removing certain pages from search. 370 00:19:12,720 --> 00:19:14,720 Speaker 2: Which doesn't sound good. What does that mean? 371 00:19:14,960 --> 00:19:17,080 Speaker 1: Well, it's good if you run a crisis PR firm, 372 00:19:17,160 --> 00:19:20,600 Speaker 1: I guess. But the crazy part to me about this 373 00:19:20,680 --> 00:19:24,000 Speaker 1: story is it's being reported only because a journalist discovered 374 00:19:24,000 --> 00:19:24,880 Speaker 1: it by accident. 375 00:19:25,200 --> 00:19:27,679 Speaker 2: Is this a happy accident or a sad accident? How 376 00:19:27,680 --> 00:19:28,240 Speaker 2: did this happen? 377 00:19:28,320 --> 00:19:30,520 Speaker 1: Well, it's a happy accident, I suppose, that the information 378 00:19:30,600 --> 00:19:33,440 Speaker 1: has been revealed. The information itself is kind of sad 379 00:19:33,440 --> 00:19:35,960 Speaker 1: and disturbing, which I'll get to. But how did he 380 00:19:36,040 --> 00:19:38,240 Speaker 1: find it? I mean he spends time, I guess, like 381 00:19:38,320 --> 00:19:42,159 Speaker 1: I do, googling himself. No, no, I'm kidding. The journalist is 382 00:19:42,200 --> 00:19:45,680 Speaker 1: called Jack Poulson. He is an investigative journalist who covers tech, 383 00:19:46,000 --> 00:19:47,840 Speaker 1: and he was looking for an old article on Google 384 00:19:48,520 --> 00:19:51,600 Speaker 1: when it turned out that it wasn't there. Even when he 385 00:19:51,720 --> 00:19:54,919 Speaker 1: searched the exact headline of his story in quotation marks, 386 00:19:55,680 --> 00:19:59,399 Speaker 1: he discovered that it wasn't there, and in turn, he 387 00:19:59,480 --> 00:20:03,880 Speaker 1: discovered this somewhat unknown SEO trick whereby anybody who wants 388 00:20:03,960 --> 00:20:07,439 Speaker 1: to can essentially delist a page from Google Search. How? 389 00:20:08,160 --> 00:20:11,000 Speaker 1: It's a little complicated, but bear with me. 390 00:20:11,200 --> 00:20:11,560 Speaker 2: It may. 391 00:20:11,920 --> 00:20:16,320 Speaker 1: Google has this tool called Refresh Outdated Content.
It's essentially 392 00:20:16,440 --> 00:20:20,639 Speaker 1: a request form to have Google update search results for pages 393 00:20:20,720 --> 00:20:23,360 Speaker 1: or images that no longer exist. But anybody can do this, yeah, 394 00:20:23,400 --> 00:20:26,600 Speaker 1: basically to delist broken links. It turns out, though, that 395 00:20:26,920 --> 00:20:30,240 Speaker 1: if you submit a bunch of links to a website 396 00:20:30,280 --> 00:20:34,360 Speaker 1: which has the same URL but with different letters capitalized, 397 00:20:34,760 --> 00:20:39,160 Speaker 1: you can direct Google to what it thinks are broken links, 398 00:20:39,160 --> 00:20:41,720 Speaker 1: because if the capitalization is wrong, you don't get to 399 00:20:41,720 --> 00:20:43,440 Speaker 1: the page you want to go to, even though it 400 00:20:43,440 --> 00:20:46,159 Speaker 1: has exactly the same letters. And if you route Google 401 00:20:46,200 --> 00:20:49,600 Speaker 1: to multiple pages that don't exist, Google can essentially be 402 00:20:49,640 --> 00:20:53,320 Speaker 1: tricked into delisting all versions of the page, including the 403 00:20:53,359 --> 00:20:56,920 Speaker 1: actual functioning one with the right letters capitalized. This is 404 00:20:56,960 --> 00:21:00,560 Speaker 1: basically like data poisoning Google, and it means you can't 405 00:21:00,640 --> 00:21:03,760 Speaker 1: find the functioning page on Search, which therefore renders it 406 00:21:03,800 --> 00:21:05,399 Speaker 1: more or less undiscoverable. 407 00:21:05,720 --> 00:21:08,440 Speaker 2: And so someone had been doing this with Poulson's articles? 408 00:21:08,480 --> 00:21:09,720 Speaker 2: Is that how he figured this out? 409 00:21:09,960 --> 00:21:11,840 Speaker 1: Yes, exactly. So, the articles he was trying to find 410 00:21:11,920 --> 00:21:14,520 Speaker 1: were from twenty twenty three, and as I mentioned, it's a 411 00:21:14,520 --> 00:21:17,240 Speaker 1: little bit dark, but they related to this domestic violence 412 00:21:17,320 --> 00:21:20,320 Speaker 1: charge that was pressed against a tech CEO called Delwyn 413 00:21:20,400 --> 00:21:23,960 Speaker 1: Maurice Blackman. Blackman had tried all sorts of ways 414 00:21:24,000 --> 00:21:28,280 Speaker 1: to keep the story off the internet, lawsuits, DMCA requests, 415 00:21:28,320 --> 00:21:33,119 Speaker 1: but nothing worked. So when Poulson realized that across his 416 00:21:33,480 --> 00:21:37,119 Speaker 1: archive there were exactly two articles that had been de-417 00:21:37,200 --> 00:21:40,119 Speaker 1: indexed, that were missing from Google Search, and both of 418 00:21:40,160 --> 00:21:43,280 Speaker 1: them were about Blackman, alarm bells started ringing. 419 00:21:43,600 --> 00:21:45,200 Speaker 2: Seems fairly targeted. 420 00:21:45,440 --> 00:21:47,159 Speaker 1: Yeah, so he reached out to this friend of his 421 00:21:47,200 --> 00:21:51,160 Speaker 1: who'd written an article about Blackman's attempts to suppress Poulson's reporting. 422 00:21:51,640 --> 00:21:54,639 Speaker 1: Guess what, that page, that story from the Freedom of 423 00:21:54,680 --> 00:21:57,879 Speaker 1: the Press Foundation, had also been de-indexed. So it was 424 00:21:57,960 --> 00:21:59,960 Speaker 1: clear something was afoot, and it was actually the Freedom 425 00:22:00,080 --> 00:22:05,040 Speaker 1: of the Press Foundation who uncovered the Refresh Outdated Content trick. 426 00:22:05,480 --> 00:22:07,760 Speaker 2: So is Google speaking out about this at all?
427 00:22:08,040 --> 00:22:10,280 Speaker 1: Yeah, and they say they fixed the bug, but they 428 00:22:10,280 --> 00:22:13,320 Speaker 1: haven't given any real details or said how many pages 429 00:22:13,320 --> 00:22:16,879 Speaker 1: were affected. The number they offered was quote a tiny fraction. 430 00:22:17,640 --> 00:22:19,760 Speaker 1: Even if that's true, Poulson is right to point out 431 00:22:19,800 --> 00:22:22,720 Speaker 1: it's a major problem. He told 404 Media, quote, 432 00:22:22,800 --> 00:22:25,720 Speaker 1: if your article doesn't appear in Google search results, in 433 00:22:25,760 --> 00:22:28,520 Speaker 1: many ways, it just doesn't exist, and he said it's 434 00:22:28,600 --> 00:22:30,760 Speaker 1: dangerous that a bug like this could be so easily 435 00:22:30,760 --> 00:22:32,240 Speaker 1: exploited by people in power. 436 00:22:32,880 --> 00:22:37,080 Speaker 2: I do have some good, less daunting tech news to share, 437 00:22:37,119 --> 00:22:39,760 Speaker 2: and it's about AI and, I don't know if you're 438 00:22:39,760 --> 00:22:41,320 Speaker 2: allergic to this, but pollen. 439 00:22:41,160 --> 00:22:43,280 Speaker 1: I am. I mean it's August now, so I'm good. But 440 00:22:43,440 --> 00:22:47,040 Speaker 1: March, April, May, I'm suffering. Tell me there's an AI cure. 441 00:22:47,280 --> 00:22:48,679 Speaker 2: I don't know if we've come that far yet, but 442 00:22:48,720 --> 00:22:54,280 Speaker 2: the Smithsonian Tropical Research Institute is digitizing over eighteen thousand 443 00:22:54,320 --> 00:22:57,240 Speaker 2: images of pollen and then using them to train a 444 00:22:57,280 --> 00:22:59,640 Speaker 2: machine learning model to identify them. 445 00:22:59,480 --> 00:23:01,560 Speaker 1: And help us allergy sufferers? 446 00:23:01,640 --> 00:23:06,679 Speaker 2: Well, funnily enough, I think it could, eventually. This database 447 00:23:07,119 --> 00:23:09,639 Speaker 2: is going to save pollen experts, and there are pollen 448 00:23:09,680 --> 00:23:14,000 Speaker 2: experts, literally hundreds of hours of work hunched over their microscopes, 449 00:23:14,240 --> 00:23:17,560 Speaker 2: which might free up enough time for them to cure 450 00:23:17,560 --> 00:23:20,479 Speaker 2: your allergies. It's also going to make a lot of 451 00:23:20,520 --> 00:23:24,479 Speaker 2: new types of pollen analysis possible, and you and I 452 00:23:24,520 --> 00:23:26,560 Speaker 2: will be able to access this database. 453 00:23:26,720 --> 00:23:29,480 Speaker 1: Wow, we have years of entertainment 454 00:23:29,480 --> 00:23:31,440 Speaker 1: ahead of us. But I guess I would have thought 455 00:23:31,480 --> 00:23:33,959 Speaker 1: by now that these scientists were looking at these pollen 456 00:23:34,000 --> 00:23:37,040 Speaker 1: grains on a computer rather than hunched over an Industrial 457 00:23:37,040 --> 00:23:38,600 Speaker 1: Revolution-style microscope. 458 00:23:38,680 --> 00:23:41,439 Speaker 2: I mean, I imagine they can, but researchers still have to 459 00:23:41,480 --> 00:23:46,280 Speaker 2: identify pollen grains literally one at a time. So the 460 00:23:46,359 --> 00:23:49,640 Speaker 2: forty million photos being uploaded by the Smithsonian are going 461 00:23:49,680 --> 00:23:51,120 Speaker 2: to be a huge help either way. 462 00:23:51,240 --> 00:23:53,000 Speaker 1: What are they saying this will help achieve?
463 00:23:53,359 --> 00:23:57,480 Speaker 2: So the Smithsonian houses one of the largest pollen collections 464 00:23:57,480 --> 00:24:00,040 Speaker 2: in the world, and pollen is kind of like a 465 00:23:59,880 --> 00:24:03,560 Speaker 2: fingerprint of a specific time and place. Some grains can 466 00:24:03,640 --> 00:24:07,159 Speaker 2: last hundreds of millions of years, and each species' pollen 467 00:24:07,280 --> 00:24:09,879 Speaker 2: is completely unique. So a database like this can do 468 00:24:09,920 --> 00:24:13,159 Speaker 2: everything from helping understand exactly which kind of pollen is 469 00:24:13,200 --> 00:24:17,080 Speaker 2: causing allergies to identifying where clothing found at a crime 470 00:24:17,160 --> 00:24:20,400 Speaker 2: scene came from. And it could even help scientists understand 471 00:24:20,560 --> 00:24:23,200 Speaker 2: how prehistoric plants adapted to climate change. 472 00:24:23,440 --> 00:24:26,920 Speaker 1: Huh. While we're on the topic of tech and nature, 473 00:24:27,880 --> 00:24:31,920 Speaker 1: we can't pass this week by without talking radioactive rhinos. 474 00:24:32,040 --> 00:24:36,320 Speaker 2: You're talking about rhinoceroses? Yes, I am. So please tell 475 00:24:36,359 --> 00:24:36,960 Speaker 2: me more about this. 476 00:24:37,200 --> 00:24:41,320 Speaker 1: So radioactive rhinos are actually more than a catchy name 477 00:24:41,400 --> 00:24:46,040 Speaker 1: for a band. Scientists are using radioactivity to fight illegal 478 00:24:46,080 --> 00:24:49,119 Speaker 1: rhino poaching. The best part is they're calling it the 479 00:24:49,200 --> 00:24:50,680 Speaker 1: Rhisotope Project. 480 00:24:51,480 --> 00:24:54,240 Speaker 2: I love that play on words, but how does the 481 00:24:54,320 --> 00:24:55,639 Speaker 2: Rhisotope Project work? 482 00:24:56,000 --> 00:24:59,600 Speaker 1: So basically, researchers in South Africa have launched this initiative 483 00:24:59,680 --> 00:25:02,720 Speaker 1: to plant low-level radiation in a rhino's horn, 484 00:25:02,920 --> 00:25:05,720 Speaker 1: and then these isotopes can be detected by the equipment 485 00:25:05,800 --> 00:25:08,880 Speaker 1: at customs whenever you enter a new country, in an airport, 486 00:25:08,920 --> 00:25:11,280 Speaker 1: in a port, wherever it may be, and this in 487 00:25:11,320 --> 00:25:13,960 Speaker 1: turn will make it a lot easier to find and 488 00:25:14,080 --> 00:25:18,480 Speaker 1: confiscate rhino horns and hopefully ultimately to reduce the incentives 489 00:25:18,480 --> 00:25:19,080 Speaker 1: to poach them. 490 00:25:19,280 --> 00:25:22,000 Speaker 2: I just didn't realize how many people are trying to 491 00:25:22,040 --> 00:25:25,080 Speaker 2: bring rhino horns through customs, but of course that makes sense, 492 00:25:25,119 --> 00:25:29,879 Speaker 2: and I guess my other question is wouldn't putting this 493 00:25:30,040 --> 00:25:33,040 Speaker 2: radioactive material in a rhino's horn poison the rhinos? 494 00:25:33,200 --> 00:25:36,480 Speaker 1: Well, to your first point, four hundred rhinos were poached 495 00:25:36,560 --> 00:25:39,399 Speaker 1: last year in South Africa and eight thousand across the 496 00:25:39,440 --> 00:25:42,240 Speaker 1: African continent in the last decade. This applies to both 497 00:25:42,240 --> 00:25:46,080 Speaker 1: white rhinos, which are near threatened, and black rhinos, which 498 00:25:46,119 --> 00:25:49,600 Speaker 1: are critically endangered.
But to your second point, yeah, obviously, the 499 00:25:49,680 --> 00:25:52,520 Speaker 1: key here is that the medicine can't be worse than 500 00:25:52,520 --> 00:25:56,080 Speaker 1: the disease, and the scientists are using very, very low 501 00:25:56,080 --> 00:25:59,320 Speaker 1: dose radiation. They started six months ago. They've tried it 502 00:25:59,359 --> 00:26:03,320 Speaker 1: on twenty rhinos and the rhinos have been completely unharmed 503 00:26:03,359 --> 00:26:07,120 Speaker 1: by the radiation. In parallel, they tested to see if 504 00:26:07,160 --> 00:26:10,760 Speaker 1: the radiation dose was enough to trigger the detectors at 505 00:26:10,800 --> 00:26:13,119 Speaker 1: airports and stuff. It turns out the way they did 506 00:26:13,200 --> 00:26:16,360 Speaker 1: that was by 3D printing fake rhino horns, 507 00:26:16,440 --> 00:26:18,240 Speaker 1: putting in the same amount of radiation they put 508 00:26:18,240 --> 00:26:20,800 Speaker 1: into the real rhino horns, and then sending a bunch 509 00:26:20,800 --> 00:26:24,840 Speaker 1: of fake smugglers through the world's ports and airports, and 510 00:26:25,000 --> 00:26:28,680 Speaker 1: it worked. Every single rhino horn got caught, including one 511 00:26:28,800 --> 00:26:29,800 Speaker 1: in a shipping container. 512 00:26:30,000 --> 00:26:32,240 Speaker 2: So they did this with 3D printed rhino horns. 513 00:26:32,400 --> 00:26:34,800 Speaker 2: Are they doing it with actual rhinos now? 514 00:26:34,920 --> 00:26:38,040 Speaker 1: Yes. In fact, just last week they injected five more rhinos, 515 00:26:38,119 --> 00:26:40,480 Speaker 1: and they're planning to do it on a larger scale. 516 00:26:40,640 --> 00:26:44,080 Speaker 1: They're planning to partner with nonprofits to work with conservation 517 00:26:44,640 --> 00:26:48,800 Speaker 1: groups and even private rhino owners in Africa. I think 518 00:26:48,840 --> 00:26:52,000 Speaker 1: what's really interesting: we talked a bit about the platform 519 00:26:52,080 --> 00:26:55,840 Speaker 1: technology idea with cycling earlier and how these different technologies 520 00:26:56,040 --> 00:26:59,719 Speaker 1: came together to enhance performance. This really is only possible 521 00:26:59,800 --> 00:27:03,480 Speaker 1: because of the fact that in airports and ports around 522 00:27:03,520 --> 00:27:07,879 Speaker 1: the world there is technology to detect people smuggling 523 00:27:07,960 --> 00:27:12,520 Speaker 1: nuclear materials. And the head of the International Atomic Energy 524 00:27:12,520 --> 00:27:15,680 Speaker 1: Agency weighed in with what I think is the money quote. 525 00:27:16,200 --> 00:27:20,600 Speaker 1: By leveraging existing nuclear security infrastructure, we can help protect 526 00:27:20,640 --> 00:27:23,560 Speaker 1: one of the world's most iconic and endangered species. 527 00:27:35,840 --> 00:27:38,600 Speaker 2: And now it's time for Chat and Me, our segment about 528 00:27:38,600 --> 00:27:41,840 Speaker 2: how people are really using chatbots. This week, I have 529 00:27:41,920 --> 00:27:43,760 Speaker 2: a story from ChatGPT itself. 530 00:27:43,840 --> 00:27:45,840 Speaker 1: Okay, you got me, Kara, I'm very curious about this. 531 00:27:46,119 --> 00:27:49,439 Speaker 2: So today's story is technically from a listener named Caitlin.
532 00:27:49,520 --> 00:27:53,520 Speaker 2: Hi Caitlin, but she actually asked ChatGPT to write 533 00:27:53,520 --> 00:27:56,959 Speaker 2: her a detailed summary of how she uses ChatGPT, 534 00:27:57,600 --> 00:27:59,120 Speaker 2: and ChatGPT obliged. 535 00:27:59,280 --> 00:28:01,480 Speaker 1: I love this. You came up with the Chat and 536 00:28:01,600 --> 00:28:04,800 Speaker 1: Me segment, Cara, and how fun, within just a 537 00:28:04,800 --> 00:28:08,520 Speaker 1: few weeks, to see it stretched to 538 00:28:08,520 --> 00:28:11,600 Speaker 1: its limits by our listeners putting chat itself on the assignment. 539 00:28:11,240 --> 00:28:14,520 Speaker 2: It's very, very cool. It's something I would never think about. 540 00:28:14,600 --> 00:28:17,720 Speaker 2: So here she is reading what chat wrote. 541 00:28:18,040 --> 00:28:21,840 Speaker 3: Caitlin uses ChatGPT as a multi-purpose thinking partner, 542 00:28:22,119 --> 00:28:26,000 Speaker 3: emotional sounding board, and creative assistant. She often turns to 543 00:28:26,040 --> 00:28:29,240 Speaker 3: it to process complex feelings, especially within her marriage and 544 00:28:29,280 --> 00:28:33,919 Speaker 3: family dynamics. She values responses that reflect honesty, nuance, and 545 00:28:34,000 --> 00:28:38,040 Speaker 3: emotional clarity, especially when exploring her relationship with her husband, 546 00:28:38,120 --> 00:28:40,640 Speaker 3: where she prefers a lens that aligns with his logic 547 00:28:40,680 --> 00:28:43,040 Speaker 3: and language rather than clinical terms. 548 00:28:43,640 --> 00:28:46,120 Speaker 1: You know, I love this, and thank you Caitlin for submitting. 549 00:28:46,240 --> 00:28:49,640 Speaker 1: I've heard a few friends recently talk to me about 550 00:28:49,680 --> 00:28:54,440 Speaker 1: how chat is entering the marriage and giving partners advice 551 00:28:54,600 --> 00:28:57,080 Speaker 1: on how to interpret each other's behavior and how to 552 00:28:57,120 --> 00:29:00,560 Speaker 1: interact with each other. So very, very curious to see how that goes. 553 00:29:00,760 --> 00:29:03,080 Speaker 1: So many people are making sense of their relationships with 554 00:29:03,240 --> 00:29:06,680 Speaker 1: ChatGPT. And also, usually it's in your corner when 555 00:29:06,680 --> 00:29:08,720 Speaker 1: you're the user, which is always nice. 556 00:29:09,440 --> 00:29:12,920 Speaker 2: And ChatGPT was really thorough. It keeps track of all 557 00:29:12,960 --> 00:29:14,440 Speaker 2: of her conversation topics. 558 00:29:14,920 --> 00:29:18,080 Speaker 3: She uses ChatGPT to track and reflect on patterns 559 00:29:18,080 --> 00:29:21,320 Speaker 3: in her mood and mental cycle, explore ideas around purpose, 560 00:29:21,480 --> 00:29:25,240 Speaker 3: parenting, and identity, and talk through moments of loneliness, frustration, 561 00:29:25,440 --> 00:29:29,320 Speaker 3: or self-doubt. She also asks for help in reframing conversations, 562 00:29:29,400 --> 00:29:33,000 Speaker 3: drafting texts, or understanding her own thoughts more compassionately. 563 00:29:33,520 --> 00:29:37,440 Speaker 2: So chat knows what chat does, which is really interesting. 564 00:29:37,720 --> 00:29:39,480 Speaker 1: Reid Hoffman, I think, wrote a piece in The New 565 00:29:39,560 --> 00:29:42,040 Speaker 1: York Times a few months ago where he said
an interesting experiment 566 00:29:42,120 --> 00:29:44,120 Speaker 1: is to ask chat to draw a picture of you, 567 00:29:44,680 --> 00:29:46,400 Speaker 1: and in my case, it was an old lady with 568 00:29:46,440 --> 00:29:48,600 Speaker 1: a cat. But I don't know, I should do 569 00:29:48,640 --> 00:29:50,280 Speaker 1: it again to see what it is now. But yeah, 570 00:29:50,320 --> 00:29:52,400 Speaker 1: I mean, this is like, it reminds me of Harry 571 00:29:52,440 --> 00:29:55,040 Speaker 1: Potter and the Pensieve, which was this kind of bucket 572 00:29:55,080 --> 00:29:58,760 Speaker 1: where Dumbledore could point his wand and see other people's memories. 573 00:29:59,000 --> 00:30:00,280 Speaker 1: Remember this? 574 00:30:00,080 --> 00:30:01,960 Speaker 2: I have no idea what you're talking about, but it does feel like 575 00:30:02,840 --> 00:30:06,600 Speaker 2: chat, or Caitlin's ChatGPT, is looking into her soul. 576 00:30:07,240 --> 00:30:10,040 Speaker 2: Things get a little messy for Caitlin, and there's an 577 00:30:10,160 --> 00:30:11,240 Speaker 2: error in the next section. 578 00:30:11,880 --> 00:30:16,800 Speaker 3: Beyond emotional support, Caitlin uses ChatGPT creatively for mural planning, 579 00:30:16,960 --> 00:30:21,360 Speaker 3: brainstorming business names, home improvement designs like pergolas, or organizing 580 00:30:21,440 --> 00:30:24,760 Speaker 3: craft supplies. She draws on it for lifestyle help too, 581 00:30:25,040 --> 00:30:29,600 Speaker 3: like shopping for swimsuits, reducing caffeine, or managing family logistics. 582 00:30:30,040 --> 00:30:34,120 Speaker 2: So, Oz, guess which part of this description is not accurate. 583 00:30:33,840 --> 00:30:36,000 Speaker 1: I mean, it feels like one of those kind of IQ-584 00:30:36,280 --> 00:30:40,360 Speaker 1: slash-logic tests. My first guess would be that reducing 585 00:30:40,400 --> 00:30:43,880 Speaker 1: caffeine intake is unlike the other activities in this list. 586 00:30:44,040 --> 00:30:46,320 Speaker 2: So no, it's not that one, and it's not the 587 00:30:46,360 --> 00:30:50,480 Speaker 2: pergola designs, which I guess ChatGPT helped with. ChatGPT got 588 00:30:50,480 --> 00:30:53,640 Speaker 2: a little cocky, because Caitlin says she never asked it 589 00:30:53,680 --> 00:30:57,560 Speaker 2: for ideas for her murals. She only had uploaded pictures 590 00:30:57,560 --> 00:31:00,240 Speaker 2: of her previous work to get feedback. So chat had 591 00:31:00,280 --> 00:31:02,080 Speaker 2: nothing to do with the mural design. 592 00:31:01,840 --> 00:31:03,720 Speaker 1: Nothing to do with coming up with the ideas, only 593 00:31:03,720 --> 00:31:06,920 Speaker 1: giving her feedback. That's interesting. So, chat, back in your place. 594 00:31:07,080 --> 00:31:10,480 Speaker 2: Well, for Caitlin, it's not chat, it's Gary. Caitlin also 595 00:31:10,520 --> 00:31:13,040 Speaker 2: told us via email that she named her chatbot Gary 596 00:31:13,520 --> 00:31:16,600 Speaker 2: and sees him as a non-threatening male character to 597 00:31:16,680 --> 00:31:19,320 Speaker 2: talk to sometimes. We all need a non-threatening male 598 00:31:19,400 --> 00:31:22,800 Speaker 2: character to talk to sometimes, and Gary had the following 599 00:31:22,840 --> 00:31:25,000 Speaker 2: to say about the way Caitlin uses ChatGPT.
600 00:31:25,200 --> 00:31:28,600 Speaker 3: Overall, underneath it all is a thoughtful desire to grow, 601 00:31:28,760 --> 00:31:32,280 Speaker 3: express, and stay grounded, often in the face of competing 602 00:31:32,320 --> 00:31:34,160 Speaker 3: responsibilities and emotional weight. 603 00:31:34,560 --> 00:31:37,440 Speaker 1: Thank you, Caitlin. This is a portrait of a life 604 00:31:37,520 --> 00:31:42,640 Speaker 1: and a person harnessing technology and digital tools to help 605 00:31:42,680 --> 00:31:45,640 Speaker 1: them live a better life. And it's also a fascinating 606 00:31:45,640 --> 00:31:49,520 Speaker 1: insight into just how much our chatbots know about us, 607 00:31:49,560 --> 00:31:53,120 Speaker 1: even if they sometimes fabricate at the margins. Caitlin, those 608 00:31:53,200 --> 00:31:54,360 Speaker 1: murals are yours. 609 00:31:54,360 --> 00:31:57,120 Speaker 2: And yours alone. Caitlin, thank you so much for sending 610 00:31:57,120 --> 00:32:01,640 Speaker 2: this voice memo. To the rest of our listeners, we love having your voices 611 00:32:01,640 --> 00:32:03,920 Speaker 2: on our show, and we'd love to hear more peculiar 612 00:32:04,040 --> 00:32:08,600 Speaker 2: or useful ways you're using ChatGPT, Grok, Claude, Gemini, or 613 00:32:08,640 --> 00:32:11,520 Speaker 2: any other chatbot. Remember you can send us a one 614 00:32:11,560 --> 00:32:14,760 Speaker 2: to two minute voice note to TechStuff Podcast at 615 00:32:14,840 --> 00:32:33,560 Speaker 2: gmail dot com. That's it for this week for TechStuff. 616 00:32:33,600 --> 00:32:36,640 Speaker 1: I'm Kara Price and I'm Oz Woloshyn. This episode was 617 00:32:36,680 --> 00:32:40,080 Speaker 1: produced by Eliza Dennis and Tyler Hill. It was executive 618 00:32:40,080 --> 00:32:43,360 Speaker 1: produced by me, Kara Price, and Kate Osborne for Kaleidoscope, 619 00:32:43,800 --> 00:32:48,000 Speaker 1: and Katrina Norvell for iHeart Podcasts. The engineer is Katherine Cook, 620 00:32:48,200 --> 00:32:51,720 Speaker 1: and Jack Insley mixed this episode. Kyle Murdoch wrote our theme song. 621 00:32:51,960 --> 00:32:54,640 Speaker 2: Join us next Wednesday for TechStuff: The Story, when 622 00:32:54,640 --> 00:32:57,160 Speaker 2: we will unpack how Google Search got a hold of 623 00:32:57,200 --> 00:32:58,960 Speaker 2: your ChatGPT conversations. 624 00:32:59,240 --> 00:33:01,800 Speaker 1: Please rate, review, and reach out to us at TechStuff 625 00:33:01,840 --> 00:33:04,400 Speaker 1: Podcast at gmail dot com. We love hearing from you.