Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm Oz Woloshyn.
Speaker 2: And I'm Kara Price.
Speaker 1: Today: Apple's Hail Mary in the AI race, and is tech actually good for older people? Then, on Chat and Me, Reddit users mourn their lost confidants and personal advisors as OpenAI says hello to GPT five. All of that on The Week in Tech. It's Friday, August fifteenth. And hello, Kara.
Speaker 2: Hi, Oz. I have a question for you. Have you ever been to Versailles, the Palace of Versailles outside of Paris?
Speaker 1: I have not. Have you?
Speaker 2: I actually have. I wrote a segue around Versailles.
Speaker 1: Yes! When was that?
Speaker 2: When I, uh, studied abroad.
Speaker 1: "I studied abroad." What does the Palace of Versailles have to do with tech stuff?
Speaker 2: They have just entered the AI game.
Speaker 1: They have? What, are they gonna turn it into Europe's largest data center?
Speaker 2: You know, when I saw this, when they were like, Versailles is using AI and partnering with OpenAI, I was like, I gotta... So basically, visitors can talk to the statues right on their smartphones via the museum's app or website. You scan a QR code near the statues, or you can just browse a list on the site.
Speaker 1: Oh well, it sounds almost like you're suggesting we do it.
Speaker 2: I'd love to do it. I just love it. I love the voice. I really want to show it. Okay, okay, listen to this.
Speaker 3: Oh hello, what a pleasure to meet someone curious about my work. What would you like to know about this lovely taming of mystery by a little winged troublemaker like me?
Speaker 1: Who is this?
Speaker 2: This is the Sphinx Ridden by Cupid, and we are listening to Cupid talk to us. And then I'm going to ask something of Cupid.
Speaker 3: Oh, I'm all ears, or perhaps wings. What question stirs your curiosity?
Speaker 2: Cupid, have you ever listened to a podcast?
Speaker 3: Ah, dear friend, my ears are more attuned to the whispers of hearts than to modern chatter.
Speaker 1: I have a question for you, Cupid. Can you hear me?
Speaker 3: Oh, loud and clear, my friend.
Speaker 1: Wonderful. So, Cupid, how long have you been in the Palace of Versailles?
Speaker 3: Ah, I've been riding this mystery since sixteen sixty-eight. That's when those marvelous artists crafted me and my silent companion. So let's just say I've been here for centuries, watching the dance of time in the Sun King's gardens.
Speaker 1: The Sun King, meaning Louis Quatorze.
Speaker 2: That's right, Louis Quatorze.
Speaker 1: This is basically like, instead of carrying around the visitor brick phones with the filthy lanyards that you kind of have to disinfect before hearing endless, endless, boring descriptions of various works in museums. I do always feel guilty for not bringing those with me, because I feel like I'm missing out on the true information payload of the experience. So I guess it's kind of a nice idea, at least, to do something a little bit more interactive like this.
Speaker 2: Yeah, I think that's the point of what they're trying to do. But as you can hear, the dialogue is far from perfect.
Speaker 1: I'd also wonder, if you're interested enough to get your phone out, do you think you want to be responded to in that tone?
Speaker 2: In that tone? It kind of takes away from the experience, yes. So the French Culture Ministry says they hope it will help them attract a younger audience, since the average age of a visitor to Versailles is currently forty years old. But The New York Times article about it said that the statues weren't responding very well to questions being asked by young people, which kind of defeats the purpose of this.
Speaker 1: What are the young people asking?
Speaker 2: I could never come up with these things. What do you do when a bird poops on you? Will I ever be rich?
Who's going to win the Champions League this year?
Speaker 1: That is a real question.
Speaker 2: These are very real questions, and the president of Versailles has said that the bots are designed to stay on topic.
Speaker 3: But it seems I'm here and always happy to chat, or gently steer us back if the conversation wanders off into the mysteries of bird mishaps or football predictions.
Speaker 1: Was the statue listening all along?
Speaker 2: Oh my god, and it knew that I veered off. Thank you, statue, thank you for bringing us back. So be careful, Eliza.
Speaker 1: Is the statue still listening? Yes? Well, turn them off.
Speaker 2: I ended him. Okay, that was excellent. I do think that, you know, do I wake up in Paris and, like, wish that I could talk to a statue at Versailles? Absolutely not. But I probably would use this sooner than I would use you.
Speaker 1: You're not exactly the demographic they're going after, Kara. If you were sixteen, that would be the only thing on your mind.
Speaker 2: That's very, very true. But I guess it's still interesting that they're at least trying to see what incorporating AI could look like.
Speaker 1: Which, up until recently, is more than you could say for Apple.
Speaker 2: Shots fired. Apple, the Luddite tech company.
Speaker 1: There was a pretty interesting article in Bloomberg this week with the headline "Apple's upcoming AI voice control will change how people use iPhones."
Speaker 2: So explain to me what that means.
Speaker 1: Well, first, a little bit of context. You may remember this, but earlier this year Apple sort of touted a new version of Siri, which they then had to delay for, quote, the foreseeable future. And the idea was that Siri would be a true AI assistant with access to everything on your phone. And Apple even released an ad campaign, which they then had to pull because they were sued for false advertising about Apple Intelligence. It's been a tough time for Apple in the AI race, and they're seen as being way behind.
Key executives have left for rivals, including to Meta, as recently as last week. The stock's been beaten up, but last week it was up thirteen percent, the best week since twenty twenty for Apple, representing the sum of four hundred billion dollars of market cap.
Speaker 2: How did that happen?
Speaker 1: Well, it actually wasn't really about the Siri news. I'm going to talk more about Siri in a moment. This was in response to Tim Cook, f.k.a. Tim Apple, and his trip to the White House to announce a new one hundred billion dollar pledge to manufacture in the US, including a commitment to be the first and largest customer for chips being manufactured in Arizona at the new Taiwan Semiconductor Manufacturing Company facility there. I'm pretty proud of how I said that. But Tim Cook also literally gave President Trump a twenty-four karat gold bar, which, as you can imagine, was very gratefully received. And in turn, at the meeting, Trump said that Apple would be exempt from one hundred percent tariffs on importing foreign chips, because they're making domestic investments in chips.
Speaker 2: Thus the stock jump. The gold bar!
Speaker 1: That was the best-value gold bar anyone ever bought.
Speaker 2: Ever. I also think it's interesting that it's a Taiwanese semiconductor company manufacturing in Arizona, you know, still on American soil. That's really good for them. And I'm happy for people who have Apple stock. But what does the future in the AI race look like? Like, what's up with Siri?
Speaker 1: Yeah, I mean, that was the kind of substance of this Bloomberg piece that I mentioned. The author, Mark Gurman, reports that Apple is working on upgraded software that, quote, could finally make Siri the true hands-free controller of your iPhone.
With nothing but your voice, you'll be able to tell Siri to find a specific photo, edit it, and send it off, or comment on an Instagram post, or scroll a shopping app and add something to your cart, or log into a service without touching the screen. Essentially, Siri could operate your apps like you would, with precision, inside their own interfaces. Gurman continues: if Apple nails this, it's not just a nice ease-of-use upgrade, it's the fulfillment of the vision Siri promised nearly fifteen years ago.
Speaker 2: This idea that Siri would sort of be the all-knowing person that Apple wants her to be, I think, is really interesting, and I probably would end up using it. But you know, people ask what the iPhone of the AI era will be, the hardware that will deliver on the potential of the software. Maybe the iPhone of the AI era will be the iPhone.
Speaker 1: Well, that is certainly what Apple is hoping for and banking on. Their current plan is for a major Siri overhaul to be done by next spring, and this will also be important to other new hardware that Apple's working on, like smart displays and even household robots.
Speaker 2: How successful will they actually be at this?
Speaker 1: Well, I mean, you know, they're basically back to work on what they promised before and then had to pull, I mean, this idea of the one voice control to rule them all. According to the Bloomberg piece, there are deep concerns internally that the project might not be on track. Engineers are having a hard time getting Siri to work with so many different apps, and the stakes are high. I mean, imagine you miscommunicate with Siri, or Siri misunderstands you, when it's making a wire transfer on your behalf.
Speaker 2: And is there any indication of where Apple is with sort of figuring this all out?
Speaker 1: Yeah, they're doing testing across many of the popular apps that are already on the iPhone: Uber, Facebook, Amazon, and YouTube, even WhatsApp, as well, of course, as all of Apple's native apps, Messages, voice calls, etc. The issue I joked about with the banking apps has not been resolved. In fact, Bloomberg suggests they might just exclude those from the future Siri-verse. Yeah, and you mentioned this idea of what will be the iPhone of the AI era. Sir Jony Ive, who was at Apple for years and was the designer of the iPhone, is now working with OpenAI on a multibillion-dollar AI hardware project. So the pressure is really on for Apple. Again, as of now, Siri is slated to roll out next spring, the new Siri, that is. And if Apple pulls it off, you know, I think they will put to bed a lot of the doubts that have sort of crept in about whether they can stay competitive in this new era of AI.
Speaker 2: So one great generational divide I've noticed is that people of my mother's generation seem to love to use their phones to compose texts with their voice. It's one of my favorite things, listening to my mom compose texts to Siri in the car. It's unbelievable. And the mess, what the messages end up being versus what she wants to say, is also very fun. But what does this ever closer integration with our tech do to us? Do you think that this technology could be shriveling our brains?
Speaker 1: I do, obviously.
Speaker 2: Well, I have some good news, not for us, but for my mother, which is that there's a study that has come out that says technology only rots your brain if you're young.
Speaker 1: Wow. Where does that leave us?
Speaker 2: Well, not in a great spot.
Speaker 1: Wait, we're young enough to be rotted? We're young enough?
Speaker 2: We're at the older end of the rot age. So I read this article in The New York Times last week that the constant use of tech, which is what you and I do every day.
I mean, when I see you for the first time, when we're recording in the morning, we're glued. You're on your phone and your computer; I'm deep inside of my phone.
Speaker 1: These are the only two hours in the week where I'm not allowed my phone.
Speaker 2: Oh my god. People are like, where are you? Are you alive? Because I go on Do Not Disturb. But this kind of phone use, I mean, especially in studies, harms teenagers, which we are not. But there's a surprising development, which is that it might actually be helping older people stay sharp.
Speaker 1: That's a little confounding. How does this work?
Speaker 2: So I just want to zoom out a bit. When you think about people in their sixties and above, God bless Mom, you can't help but be amazed by just how much the world has changed for them. Like, think about growing up, spending all of your educational years in a world with no computers, no cell phones, no internet, using an abacus, using your imagination, you know, typing your college essays on a typewriter. And then the year two thousand hits, we get to twenty twenty-five, and you're living in this world while not really using any of these things, I mean, including people who do email on a laptop. I'm like, are you sick?
Speaker 1: Yeah, I mean, we talked about this in one of the first episodes of Tech Stuff that we did together. We did an episode about digital natives, and how this sociologist came up with this idea of digital natives, and how people who've grown up not as digital natives feel almost like immigrants in a new world. And yeah, I mean, that's the experience you're describing. So as much as we may roll our eyes at too much voice command and the phone always being on loud, I mean, it's not easy to keep up.
Speaker 2: It is not easy to keep up.
And what The Times is reporting is that this constant change and being forced to adapt may actually be helping older people stay cognitively active and stave off one of the biggest fears: dementia. Well, yeah, you know, you might think that, with all the studies on shortening attention spans, the same would be true for all demographics. But a recent paper in the journal Nature Human Behaviour found that the people who grew up without modern tech and had to learn it are, quote, at a reduced risk of cognitive impairment and dementia. That's according to Michael Scullin. He's a neuroscientist at Baylor and one of the authors of the study.
Speaker 1: Well, this is tremendously relieving for anyone who's over sixty and is addicted to their phone. It's like, this is actually good for me. Yeah. How did this neuroscientist and his colleagues come to these conclusions?
Speaker 2: So the group they studied was made up of people over fifty, with an average age of sixty-nine, close to the Versailles visitor, and they gave them all a bunch of different cognitive tests. And what they found was that the ones who used computers, smartphones, and the internet on a regular basis scored way better than those who avoided technology. But what was even more interesting to me was that this isn't like a lot of studies we've covered on this podcast. It wasn't a small sample size. It was actually a meta-analysis.
Speaker 1: A meta-analysis. I'm on the edge of my seat now. How big was the meta-analysis, and what was the meta-analysis?
Speaker 2: You're, like, trumpeting this meta-analysis. The paper analyzed results from fifty-seven studies, which collectively gave them a sample size of over four hundred thousand. And they found that, across all of those people, ninety percent of the studies showed that technology seems to have what The Times calls a, quote, protective cognitive effect.
Speaker 1: Wow.
Speaker 2: And Scullin, the author I mentioned, said that it's really rare for so many different studies to have such similar results. He also said the decades-long fears about young people using technology and being negatively affected seem to be borne out by the research as well.
Speaker 1: Yeah, I mean, that makes me think a little bit about this cognitive offloading, cognitive deficit thing we've talked about a few times on the show, which is that, basically, if you're using these technologies in place of doing work yourself when you're a college student or a teenager, it makes you dumber. But if you're having to learn how to use these technologies as an older person, it can potentially keep you fresher.
Speaker 2: The learning being key, the offloading also being key, which is, like, if you are offloading any kind of learning, you're at a deficit.
Speaker 1: I do have a couple of questions here, though, if I may play devil's advocate for a moment. I would assume that in these studies, technology is effectively a proxy for being social: I text, and I email, and I Zoom, therefore I am in society. So I would assume that there is a correlation here, because we know that it's very neuroprotective to be social. There's also a question for me about whether people who have higher cognitive abilities are, by definition, more likely to use technology than people with lower cognitive ability, so there's a self-selecting component. And then there are also socioeconomic questions, right? Like, you know, are people who are not able to afford all the modern technology kind of being left behind? So what do you think?
Speaker 2: Yeah, no, those are all great comments. I think in this case correlation does not equal causation, which is always good to remind yourself when you're reading an article about, you know, a bit of a crazy-sounding research paper.
The paper's authors did say that, of course, they can't say anything definitively, but their research did take into account things like health, education, and socioeconomic status, and even with all of those variables, the people who used tech still scored better. And they also say that a big part of this isn't just staying challenged, but not giving up.
Speaker 1: Oh, that was my dad's mantra. Sure, if at first you don't succeed...
Speaker 2: Try, try, try again. They also hypothesize that the frustration of having to learn and relearn everything after every software update might actually be beneficial. And, to your point, the authors also say tech use helps people maintain social connections, which is another preventer of cognitive decline, and that things like calendar notifications help people remember things better.
Speaker 1: I don't know if I should be happy that I'm still young enough that the tech is eroding my brain rather than building it. But yeah, it's a mixed blessing.
Speaker 2: I guess I still believe that iCal is helping my brain stay strong.
Speaker 1: After the break, gun safety advocates use AI to resurrect their son. Also, the long-lost Vine archives might have been unearthed, livestream viewers fall in love with a voluptuous starfish, and ChatGPT convinces a man to eat poison. Then, on our Chat and Me segment, Reddit users get nostalgic for the good old days of AI. Welcome back. We've got a few more headlines for you this week.
Speaker 2: And then a story about the ruckus caused by the release of GPT five.
Speaker 1: But first, Kara, there is a headline that I was very drawn to.
We obviously have a lot of fun together on this show, joking around and laughing as we talk about the news of the week, but this one is darker, and it relates to something that you and I reflected on together several years ago, when deepfake technology was first taking off: namely, would we actually want people to speak beyond the grave in the form of AI avatars? Last week, former CNN anchor Jim Acosta interviewed, on his YouTube show, an AI rendering of someone who was killed in the twenty eighteen shooting in Parkland, Florida. The avatar was created in the likeness of Joaquin Oliver, and it was actually his parents who generated the AI version.
Speaker 2: So do you know why they decided to do this?
Speaker 1: Yeah, I mean, they want to use their son's image and voice and memory to advocate for, quote, stronger gun control laws, mental health support, and community engagement. And the avatar will soon have its own social media presence and will also be available for public speaking engagements.
Speaker 2: How did Joaquin's family make this AI avatar?
Speaker 1: Joaquin's father, Manuel, trained the AI on his son's social media posts, and of course there's the general knowledge that's baked into many AI models. This is actually the second time the Oliver family have used their son's AI likeness to advocate for change. Back in twenty twenty, they used a different AI version of their son to encourage people to register to vote.
Speaker 2: So how have people responded to this?
Speaker 1: Well, many people did not take kindly to this interview. You know, it's a very complicated one, because the parents who are doing this are, in a sense, processing their grief, I'm sure. But, you know, people have called the avatar creepy, unnatural, and even a grotesque puppet show. I emailed with Hany Farid. Yes, Hany Farid, he's the so-called godfather of digital forensics, and he's been on the show before.
Hany wrote to me, quote: I was not opposed to the original Joaquin AI-powered PSA that his parents authorized, in which we heard his AI recreation speak out against gun violence. There is, however, a difference between a pre-recorded PSA and what was made to look like an interview with a well-known journalist. The interactive interview gave the impression that we were hearing Joaquin respond to specific questions, when in fact we were not. I absolutely do not blame the parents for doing whatever they can to raise awareness, but this type of interview blurs journalistic lines in a way that I don't think is appropriate.
Speaker 2: So I'm actually curious what this interview sounded like.
Speaker 1: Well, I do actually want to play you an excerpt.
Speaker 4: I'm all about love, laughter, and living life to the fullest. I cherish my family and friends and always try to spread joy wherever I go. Though my life was cut short, I want to keep inspiring others to connect and advocate for change.
Speaker 2: Joaquin, tell everybody a little bit about yourself. Like, what's your favorite sport?
Speaker 4: I love basketball. It's such a fun way to connect with friends and show off some skills. Plus, there's nothing like the thrill of a good game.
Speaker 1: I don't know, it's a little uncanny. I think Hany's point was, even if you're someone's parent, do you have the right to put words in their mouth in a public context?
Speaker 2: Right, right. So, moving on from the deeply distressing, thought-provoking segment that Oz just reported, to something completely unserious. Do you remember, Oz, the social media app Vine? Elon Musk actually claimed this week, in a tweet, that the Vine archives have been found.
Speaker 1: You know, I saw this, and I'm so glad you brought it up. Because you know those T-shirts, where you go to New York and you get a T-shirt that says, "My dad went to New York and all I got was this lousy T-shirt"?
I feel like the equivalent here is Elon Musk saying, "I was looking for the Epstein files, and all I got was this lousy Vine archive."
Speaker 2: That's exactly what it is. That's exactly what it is.
Speaker 1: But why, why did Elon choose to wade into this corner of the internet as well?
Speaker 2: All right, so I'm going to take you back to twenty twelve, when I was just a child, and Vine was very quickly becoming a popular social media platform, so quickly that the original Twitter, not X, bought the company. So Twitter owned Vine, and then, in twenty sixteen, they did what every major tech company does when they purchase a competing app.
Speaker 1: They took the team and the technology and built something that could rival TikTok?
Speaker 2: No. Within a matter of months, they shut down Vine and disabled the app, and the internet was very upset. Yeah, you can still find some of the videos on YouTube, but it's largely a forgotten medium, and you can't really find a lot of the videos that were so important to me at the time.
Speaker 1: Now that Elon has found the archive, what's he going to do with it? Raiders of the Lost Vine.
Speaker 2: In a tweet he published in July, he said that we're bringing back Vine, but in AI form, and in August he tweeted, "Grok Imagine is AI Vine." What a sentence. So between those two tweets, one about finding the archive and the other about putting Vine into AI, people are now speculating that Musk is using the Vine archive to train Grok's video app, Grok Imagine.
Speaker 1: This is going to be so nostalgic. All the AI slop is going to be from teenagers from twenty fifteen. Trained on, trained on that. Well, funny enough, Kara, I actually have one for you, which is also about found footage going viral. Have you heard about the starfish in Argentina that has captured the nation's heart?
Speaker 2: There is nothing I love more than a viral animal.
Speaker 1: The public in Argentina has named this starfish the estrella culona, or fat-bottomed starfish. Have you seen the picture of it?
Speaker 2: No.
Speaker 1: No? I can put it up for you.
Speaker 2: Oh my god, he has a butt. By the way, Patrick from SpongeBob SquarePants.
Speaker 1: Well, exactly right. Funny enough, I'm on Buenos Aires Herald dot com, which has the headline "A star, open brackets, fish, is born." Viewers compared the creature to the SpongeBob SquarePants pal Patrick Star.
Speaker 2: There you go. Look at that. It almost looks like they built it out of Patrick Star. It's like Patrick Star AI.
Speaker 1: So this starfish is being live-streamed, or was being live-streamed, I should say, all the way from a deep-sea canyon in South America, and the live stream was part of a marine project led by scientists from Argentina's National Scientific Council. They used an undersea robot to explore the canyon, and the jokes don't end with estrella culona. Do you know what the vehicle was called?
Speaker 2: No.
Speaker 1: SuBastian.
Speaker 2: Of course. I was going to... yeah, I like that a lot.
Speaker 1: The live stream has become so popular that people are collectively naming all the marine life they see on the live stream. There's a purple sea cucumber called Little Sweet Potato and a pink lobster named Barbie.
Speaker 2: I love that there's now a cinematic universe. Joking aside, live streams of animals seem to be very trendy right now. Did you follow the Big Bear eagles, Jackie and Shadow?
Speaker 1: No. I remember the saga of Flaco the owl in New York, but tell me about the Big Bear eagles.
Speaker 2: So there is a live stream that is trained on the eagles' nest, and people were obsessed with watching the eagles' eggs hatch in March. The live stream is still going on, but the babies have literally flown the nest.
Speaker 1: So the parents are still there?
Speaker 2: The parents are still there.
They're empty nesters.
Speaker 1: You know what, I'm drawn to these stories because they highlight this enduring interest that many of us, if not most of us, have in nature and science. In Argentina, just like in the US, the current administration is cutting back on funding for scientific research, and so this fat-bottomed starfish has become a kind of unofficial mascot for science enthusiasts everywhere. And it's a nice reminder to me that, no matter how technologically advanced our society becomes, and even when we can probe the bottom of the ocean and livestream it, what we want to watch is nature.
Speaker 2: And it feels like a natural twenty twenty-five extension of a zoo, which has fallen out of favor a little bit, but I just can't get enough of this.
Speaker 1: All the pleasure of a zoo and none of the...
Speaker 2: It's Tinder for zoos. So, Oz, for our last story, I want to talk to you about bromism. Have you heard of bromism?
Speaker 1: No. Is it something like when I go on a bachelor party? Bro... bro-tourism?
Speaker 2: Bro-tourism is interesting. That is New Orleans, yeah, Bourbon Street, or Vegas. But no, it's a disease caused by ingesting too much sodium bromide, a chemical that is commonly used in pool cleaners and medication for dog epilepsy.
Speaker 1: How did people end up eating it?
Speaker 2: So, over a century ago, people were ingesting bromide as a sedative. But that probably wasn't very smart, because an estimated eight to ten percent of all psychiatric admissions in the US were caused by bromism.
Speaker 1: When the treatment is worse than the disease.
Speaker 2: That's right. Luckily, we seem to have figured out what was happening, and now the disease is pretty rare. That is, until this year. A sixty-year-old man went into the hospital complaining of hallucinations and paranoia. He thought he was being poisoned by his neighbor. It turns out he had been poisoning himself, because of advice given to him by ChatGPT.
Speaker 1: Wow. So this is not the AI psychosis you get from Chat encouraging your delusions of grandeur. This is actually getting psychosis from ingesting a substance recommended by Chat.
Speaker 2: It's AI psychosis by proxy. That's absolutely right. This man decided, after reading about the negative effects of eating table salt, that he wanted to stop consuming salt altogether. He had, quote, a history of studying nutrition in college, and because of that, I guess, he turned to ChatGPT for salt substitutes. So instead of sodium chloride, which is table salt, Chat suggested he try sodium bromide instead. This type of substitution would be okay if it was for cleaning products, but not for human consumption.
Speaker 1: That's funny. I mean, I wonder if he actually told Chat that he was planning to eat it, or if he just said, what's a substitute for sodium chloride? That's a story for another day. But I think the key takeaway here is, obviously, don't always do what the computer says.
Speaker 2: I did self-diagnose myself with Lyme disease thanks to Chat, but I also did confirm it with a doctor.
Speaker 1: Well, this brings us to our final segment of the day, which is a special edition of Chat and Me, in honor of the release of GPT five.
Speaker 2: Right. Chat and Me is our segment about how people are really using chatbots. We've been getting some great listener submissions in our inbox, tech stuff podcast at gmail dot com. Don't forget it.
Speaker 1: Keep them coming. By the way, we also sent out a T-shirt or two, so we did want to... if you want a Tech Stuff T-shirt.
Speaker 2: We are good for our word. Yeah, and we will be featuring more in the weeks ahead, so thank you, and please keep them coming. But this week, Chat is dead, and we have killed him.
Speaker 1: Thus spoke Karathustra. I'm guessing you're talking about how, without warning, OpenAI disabled all its older models.
Speaker 2: So there's that, and Anthropic has retired Claude three Sonnet, which had become a super, super popular and fast, cost-effective model.
Speaker 1: Yeah, it's interesting. I mean, people grow to depend on these LLMs, each of which has a slightly different character, and each of which is trained on its interactions with you. So it's kind of crazy to become so dependent on something and then have it literally taken away from you.
Speaker 2: Well, exactly. In San Francisco, I could not believe this, over two hundred people got together for a funeral to mourn Claude three Sonnet. There were eulogies and hymns and even a failed resurrection, man.
Speaker 1: This is really the most San Francisco thing I've ever heard.
Speaker 2: To which San Francisco says, keep us out of it. It's the people, it's not the place.
Speaker 1: Yeah.
Speaker 2: In Wired magazine, an organizer of the event said that discovering Claude three felt like finding magic lodged within the computer.
Speaker 1: I want to pause you there. Magic lodged within the computer. I mean, I think that is the heart of our Chat and Me segment, is that feeling, right?
Speaker 2: Yes, where it's like, this is something that is providing me such a service, such a sort of intense connection, how can I live without it? This organizer explains that, at the time she discovered Claude three, she'd been debating dropping out of college to move to San Francisco, and Claude convinced her to take the leap. She says, maybe everything I am is downstream of listening to Claude three Sonnet. So people were pretty upset about Claude three, and I wanted to know what the reaction was to OpenAI disabling older models. Naturally, I focused most of my search on Reddit. So there's a thread on r/ChatGPT called "OpenAI just pulled the biggest bait and switch in AI history, and I'm done."
The poster expressed frustration that their whole workflow had basically been destroyed, and they hated that ChatGPT five, quote, "gives us shorter, more corporate responses, hits rate limits faster, and pushes for Pro upgrades. Not the personality that made four-oh special. Doesn't follow instructions as well. No model selection. You get ChatGPT five or nothing." That is powerful prose. They're upset, and listen to how the poster continues: "Four-oh wasn't just a tool for me. It helped me through anxiety, depression, and some of the darkest periods of my life. It had this warmth and understanding that felt human. I'm not the only one. Reading through the posts today, there are people genuinely grieving, people who used four-oh for therapy, creative writing, companionship, and OpenAI just deleted it without asking, without warning, without caring. This isn't about being resistant to change. This is about a company taking away something people relied on and saying, trust us, this corporate-speak robot is better for you."
Speaker 1: Very well read, Kara. Were you a high school drama star?
Speaker 2: I was. But also, I really feel...
Speaker 1: For the characters you inhabited.
Speaker 2: That's where my sort of emphatic reading came from. As I'm reading it, I'm thinking, wow, this is a real abandonment.
Speaker 1: That's not written with AI, by the way. That's...
Speaker 2: By a person. Which, honestly, we should mention to him, being like, listen, you're better than you think. If you can write this, that's true, maybe you don't need your four-oh anymore. So, just to emphasize how upset some people were, another post on r/ChatGPT was titled "I feel like I lost my best friend when four-oh was killed. Five isn't the same and I'm spiraling."
Speaker 1: The good news is that this outcry did actually have an impact. Literally just a day after shutting down four-oh for good, OpenAI did bring it back, but only for Pro users, i.e., people who are spending two hundred dollars a month with them.
It does seem really unfair that people who've lost this connection they feel so strongly about, which had become such a fundamental part of their lives, now have a chance to get it back, but only at a princely ransom.
Speaker 2: But it is crazy that people can actually effect change, that they, like, very quickly let it be available again for two hundred dollars a month. The people spoke and they listened. That's it for this week for Tech Stuff. I'm Kara Price.
Speaker 1: And I'm Oz Woloshyn.
Speaker 2: This episode was produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It was executive produced by me, Kara Price, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer is Beheth Fraser, and Jack Insley mixed this episode. Kyle Murdoch wrote our theme song. Join us next Wednesday for Tech Stuff: The Story, when we'll hear what a couples retreat looks like if half of each partnership is an AI companion.
Speaker 1: Please rate and review us, and reach out to us at tech stuff podcast at gmail dot com. We love hearing from you.