1 00:00:13,080 --> 00:00:16,200 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. 2 00:00:16,320 --> 00:00:18,320 Speaker 2: I'm Oz Woloshyn, and I'm Cara Price. 3 00:00:18,600 --> 00:00:23,880 Speaker 1: Today we'll get into TikTok, private investigators, and America's AI future. 4 00:00:24,640 --> 00:00:28,600 Speaker 1: Then, on Chat and Me, AI makes a diagnosis that's a 5 00:00:28,640 --> 00:00:31,880 Speaker 1: little too real. All of that on the Week in Tech. 6 00:00:32,280 --> 00:00:39,600 Speaker 1: It's Friday, July thirtieth. Hello, Cara. Hey, Oz. 7 00:00:39,960 --> 00:00:42,400 Speaker 2: So you might see that I'm bopping my foot this morning. 8 00:00:42,479 --> 00:00:47,839 Speaker 1: Yes, I'm bopping. I do have RLS, but this is exaggerated. 9 00:00:48,000 --> 00:00:53,280 Speaker 2: It's much exaggerated because the Tesla Diner opened this week. 10 00:00:53,280 --> 00:00:53,880 Speaker 2: Do you know about this? 11 00:00:54,320 --> 00:00:56,200 Speaker 1: Of course I do. I mean, I've been obsessing 12 00:00:56,200 --> 00:00:58,720 Speaker 1: about it. It looks like a drive-in movie theater 13 00:00:58,840 --> 00:01:03,560 Speaker 1: from the fifties, retro-futurist Jetsons aesthetic, and there's some 14 00:01:03,600 --> 00:01:05,119 Speaker 1: pretty weird stuff going down there. 15 00:01:05,400 --> 00:01:10,839 Speaker 2: That is the one. It opened last Monday at four 16 00:01:10,840 --> 00:01:12,240 Speaker 2: twenty p.m. 17 00:01:12,200 --> 00:01:15,720 Speaker 1: Elon's favorite time of day and favorite little joke. He 18 00:01:15,800 --> 00:01:18,679 Speaker 1: recently rolled out this robotaxi service, and the fare was 19 00:01:18,760 --> 00:01:20,920 Speaker 1: initially set at four dollars and twenty cents as well. 20 00:01:21,000 --> 00:01:23,480 Speaker 1: So I guess the old ones are the good ones. 21 00:01:23,480 --> 00:01:26,959 Speaker 2: For those who don't know, Mom, four twenty is a 22 00:01:27,000 --> 00:01:31,120 Speaker 2: weed thing. That's when people smoke weed, April twentieth. Mark it, 23 00:01:31,200 --> 00:01:34,480 Speaker 2: mark your calendars. So I actually said to 24 00:01:34,640 --> 00:01:36,520 Speaker 2: two of my friends, please go, and then they were like, 25 00:01:36,560 --> 00:01:38,200 Speaker 2: it looks so disgusting, I'm not gonna go. 26 00:01:38,360 --> 00:01:40,600 Speaker 1: So basically, you live in New York, we spend a 27 00:01:40,600 --> 00:01:42,759 Speaker 1: lot of time in LA, and so we talked about 28 00:01:42,800 --> 00:01:45,440 Speaker 1: the idea of sending some of your LA pals to 29 00:01:45,480 --> 00:01:46,119 Speaker 1: go and check it out. 30 00:01:46,160 --> 00:01:47,560 Speaker 2: So they were like, yeah, we'll go, and then they're like, 31 00:01:47,560 --> 00:01:49,960 Speaker 2: we're not going. So I just want to describe it. 32 00:01:50,320 --> 00:01:56,480 Speaker 2: It has curved chrome siding, curved white booths, long countertops. 33 00:01:56,680 --> 00:02:00,360 Speaker 2: A Tesla robot, Optimus, serves popcorn on the second floor. 34 00:02:00,400 --> 00:02:03,360 Speaker 1: Even Kim Kardashian got out of Calabasas to go 35 00:02:03,400 --> 00:02:03,760 Speaker 1: to the diner. 36 00:02:03,840 --> 00:02:06,040 Speaker 2: She was like, I'm over Calabasas, I'm out. There 37 00:02:06,040 --> 00:02:10,800 Speaker 2: are eighty Supercharger stalls for Teslas, two forty-five-foot 38 00:02:11,040 --> 00:02:11,840 Speaker 2: movie screens.
39 00:02:11,919 --> 00:02:14,840 Speaker 1: I gather they're screening Star Trek, amongst others. 40 00:02:14,840 --> 00:02:17,400 Speaker 2: Of course, it has to be Star Trek. There's food 41 00:02:17,520 --> 00:02:21,640 Speaker 2: served in Cybertruck-shaped boxes and with Cybertruck-shaped 42 00:02:21,680 --> 00:02:23,959 Speaker 2: wooden... imagine the company who had to make the Cybertruck- 43 00:02:23,960 --> 00:02:24,880 Speaker 2: shaped wood. 44 00:02:25,200 --> 00:02:26,760 Speaker 1: Like a Happy Meal box, but in the shape of 45 00:02:26,760 --> 00:02:27,560 Speaker 1: a Cybertruck. 46 00:02:27,600 --> 00:02:31,119 Speaker 2: That's right, it's everything. Not golden arches, it's everything Tesla. Now, 47 00:02:32,200 --> 00:02:36,280 Speaker 2: one of my favorite sources, TMZ, actually had a pretty 48 00:02:36,480 --> 00:02:40,359 Speaker 2: harrowing story. A woman was violently struck by furniture falling 49 00:02:40,360 --> 00:02:43,240 Speaker 2: off the second floor patio, and it actually missed her 50 00:02:43,280 --> 00:02:47,560 Speaker 2: baby's head by inches, which to me is terrifying, obviously, 51 00:02:47,600 --> 00:02:49,760 Speaker 2: and a bit of a harbinger for some of the 52 00:02:49,919 --> 00:02:52,040 Speaker 2: very real company issues facing Tesla. 53 00:02:52,280 --> 00:02:55,280 Speaker 1: There was actually a Reddit thread called Elon Musk's Tesla 54 00:02:55,360 --> 00:02:58,400 Speaker 1: Diner is the Cybertruck of restaurants, which you can 55 00:02:58,480 --> 00:03:01,919 Speaker 1: imagine was not intended as a compliment. There are the obvious 56 00:03:02,000 --> 00:03:04,880 Speaker 1: aesthetic parallels, this kind of retro-futurism, although of course 57 00:03:05,120 --> 00:03:09,160 Speaker 1: the Cybertruck famously has no curves, only angles. But nonetheless, 58 00:03:09,200 --> 00:03:13,959 Speaker 1: like me, this Reddit thread mentioned this restaurant is designed for 59 00:03:14,000 --> 00:03:17,040 Speaker 1: two hundred and fifty people, but there are only three bathrooms. 60 00:03:17,560 --> 00:03:22,359 Speaker 1: Not good. And one user cracked that the Optimus robot 61 00:03:22,360 --> 00:03:26,320 Speaker 1: who was serving popcorn might have been better employed downstairs 62 00:03:26,760 --> 00:03:29,320 Speaker 1: mopping down the bathrooms, which apparently were not a sight 63 00:03:29,400 --> 00:03:30,399 Speaker 1: for sore eyes. 64 00:03:30,120 --> 00:03:32,320 Speaker 2: Well, and by the looks of the food, the restaurant 65 00:03:32,320 --> 00:03:35,200 Speaker 2: there probably needs a few more bathrooms. 66 00:03:35,600 --> 00:03:40,880 Speaker 1: Cara, that's a bit spiky, all angles. You know, you 67 00:03:40,920 --> 00:03:43,360 Speaker 1: mentioned the food. There was a great Guardian piece that 68 00:03:43,440 --> 00:03:46,560 Speaker 1: I think described the Tesla diner well, but also I 69 00:03:46,640 --> 00:03:51,440 Speaker 1: think inadvertently touched on the heart of Tesla's dilemma. Here's 70 00:03:51,440 --> 00:03:54,440 Speaker 1: what the Guardian wrote, quote: the diner offers a mix 71 00:03:54,600 --> 00:03:57,600 Speaker 1: of own the Libs and we are the Libs options. 72 00:03:57,920 --> 00:04:01,240 Speaker 1: On the one hand, Epic Bacon, four strips of bacon 73 00:04:01,280 --> 00:04:05,200 Speaker 1: served with sauces, as a meatfluencer alternative to 74 00:04:05,320 --> 00:04:09,880 Speaker 1: French fries. On the other, avocado toast and matcha lattes.
75 00:04:10,360 --> 00:04:11,320 Speaker 2: I know which is which. 76 00:04:11,400 --> 00:04:13,560 Speaker 1: I think... which would you order? You're vegetarian. 77 00:04:13,680 --> 00:04:15,200 Speaker 2: As one of the vegetarians, I wouldn't have a choice, but 78 00:04:15,240 --> 00:04:16,080 Speaker 2: I also, I'd... 79 00:04:15,880 --> 00:04:22,640 Speaker 1: Be pretty tempted by the own the Libs menu. However, 80 00:04:23,320 --> 00:04:27,279 Speaker 1: as the Bible says, no man can serve two masters, 81 00:04:27,680 --> 00:04:31,440 Speaker 1: and this dilemma of own the Libs versus we are 82 00:04:31,520 --> 00:04:34,040 Speaker 1: the Libs is one which is playing out more broadly 83 00:04:34,080 --> 00:04:38,120 Speaker 1: at Tesla and bedeviling the company. So the opening of 84 00:04:38,120 --> 00:04:40,600 Speaker 1: the diner last week was in the same week that 85 00:04:40,640 --> 00:04:44,000 Speaker 1: Tesla released its financials. The company took a bit of 86 00:04:44,040 --> 00:04:46,680 Speaker 1: a beating, with net income down sixteen percent in the 87 00:04:46,760 --> 00:04:49,279 Speaker 1: second quarter, and The Wall Street Journal said that the 88 00:04:49,320 --> 00:04:53,279 Speaker 1: company's finances are quote in free fall, and they pointed 89 00:04:53,279 --> 00:04:57,480 Speaker 1: out two distinct drivers of this. On the one hand, 90 00:04:58,040 --> 00:05:02,120 Speaker 1: Musk's adventures with DOGE have not done Tesla many favors 91 00:05:02,160 --> 00:05:05,719 Speaker 1: with its original client base, i.e., people in California 92 00:05:06,279 --> 00:05:11,359 Speaker 1: and Europe who are EV fans and have environmental motivations, and 93 00:05:11,440 --> 00:05:14,599 Speaker 1: so with that audience, Tesla is big time out of 94 00:05:14,680 --> 00:05:17,880 Speaker 1: favor post the dalliance with Trump. On the other hand, 95 00:05:18,160 --> 00:05:21,920 Speaker 1: the dalliance with Trump did not protect Musk from the 96 00:05:21,960 --> 00:05:26,040 Speaker 1: red meat attacks on EVs, and part of the Big 97 00:05:26,040 --> 00:05:29,719 Speaker 1: Beautiful Bill included cuts to EV subsidies and tax credits, 98 00:05:29,920 --> 00:05:32,799 Speaker 1: which have really hurt Tesla's bottom line. And as you remember, 99 00:05:33,120 --> 00:05:35,800 Speaker 1: there were kind of rumors of the breaking up of 100 00:05:35,880 --> 00:05:39,120 Speaker 1: the bromance percolating for some time, but this was really 101 00:05:39,120 --> 00:05:41,520 Speaker 1: the wedge issue between Trump and Musk, the subsidies, 102 00:05:41,560 --> 00:05:45,720 Speaker 1: and indeed it's hurting Tesla's bottom line. And so this 103 00:05:45,760 --> 00:05:48,520 Speaker 1: brings me to my next story. There was a very 104 00:05:48,520 --> 00:05:52,400 Speaker 1: special meeting last week in Washington that for the first 105 00:05:52,400 --> 00:05:55,200 Speaker 1: few months of this year, you would have imagined seeing 106 00:05:55,279 --> 00:05:57,960 Speaker 1: Elon dressed all in black with the black MAGA cap, 107 00:05:58,320 --> 00:06:01,680 Speaker 1: sitting front row. But he wasn't there. Do you know 108 00:06:01,680 --> 00:06:02,440 Speaker 1: what I'm talking about? 109 00:06:02,680 --> 00:06:05,000 Speaker 2: I have a little bit of an idea. But tell me. 110 00:06:05,240 --> 00:06:08,800 Speaker 1: This is, of course, the Trump administration's announcement of the 111 00:06:08,839 --> 00:06:13,000 Speaker 1: AI Action Plan.
Let's let the Commander in Chief do 112 00:06:13,080 --> 00:06:13,600 Speaker 1: the talking. 113 00:06:14,240 --> 00:06:17,440 Speaker 3: As we gathered this afternoon, we're still in the earliest 114 00:06:17,520 --> 00:06:21,440 Speaker 3: days of one of the most important technological revolutions in 115 00:06:21,520 --> 00:06:25,039 Speaker 3: the history of the world. Around the globe, everyone is 116 00:06:25,080 --> 00:06:27,200 Speaker 3: talking about artificial intelligence. 117 00:06:27,279 --> 00:06:30,240 Speaker 4: I find that too artificial. I can't stand it. 118 00:06:30,520 --> 00:06:32,600 Speaker 4: I don't even like the name. You know, I don't 119 00:06:32,640 --> 00:06:35,400 Speaker 4: like anything that's artificial. So could we straighten that out, please? 120 00:06:35,440 --> 00:06:38,479 Speaker 4: We should change the name. I actually mean that I 121 00:06:38,480 --> 00:06:41,720 Speaker 4: don't like the name artificial anything because it's not artificial. 122 00:06:41,760 --> 00:06:44,520 Speaker 4: It's genius. It's pure genius. 123 00:06:44,880 --> 00:06:48,039 Speaker 1: I think the rebranding of AI as Genius was a 124 00:06:48,040 --> 00:06:49,400 Speaker 1: Trump marketing ad lib. 125 00:06:49,520 --> 00:06:51,719 Speaker 2: I don't think he knows that there's another GI Bill, 126 00:06:51,800 --> 00:06:53,479 Speaker 2: but, Genius Intelligence. 127 00:06:54,320 --> 00:06:57,840 Speaker 1: This event was actually hosted by a podcast, just as 128 00:06:57,839 --> 00:06:58,680 Speaker 1: a sign of the times. 129 00:07:00,200 --> 00:07:01,560 Speaker 2: Not a news conference, a podcast. 130 00:07:01,680 --> 00:07:04,240 Speaker 1: This was the All In podcast and the Hill and 131 00:07:04,360 --> 00:07:06,719 Speaker 1: Valley Forum, which is a group of tech execs and 132 00:07:06,800 --> 00:07:11,320 Speaker 1: lawmakers who are dedicated to maintaining the United States' dominance 133 00:07:11,400 --> 00:07:14,720 Speaker 1: over the tech industry. Now, Trump's been signaling for a 134 00:07:14,720 --> 00:07:17,160 Speaker 1: long time that this plan was coming, but I didn't 135 00:07:17,160 --> 00:07:19,440 Speaker 1: know that the twenty-eight-page document which was released last 136 00:07:19,480 --> 00:07:23,239 Speaker 1: week would actually be titled Winning the Race: America's AI 137 00:07:23,480 --> 00:07:26,680 Speaker 1: Action Plan. I did a control F to find how 138 00:07:26,720 --> 00:07:29,800 Speaker 1: many mentions of China there are 139 00:07:30,120 --> 00:07:32,920 Speaker 1: in the document, and to my surprise, there are only two 140 00:07:33,400 --> 00:07:37,520 Speaker 1: China references. So then I did control F for adversary. 141 00:07:37,840 --> 00:07:42,200 Speaker 1: Guess how many? Nineteen. Wow, nineteen times the 142 00:07:42,240 --> 00:07:44,720 Speaker 1: word adversary is used in the document. And of course, 143 00:07:44,720 --> 00:07:46,960 Speaker 1: when Trump was speaking at the All In news conference 144 00:07:47,120 --> 00:07:50,600 Speaker 1: after the document's release, he mentioned China again and again 145 00:07:50,640 --> 00:07:53,960 Speaker 1: and again, a word that he loves to pronounce. 146 00:07:53,960 --> 00:07:56,680 Speaker 1: He was particularly hung up in his remarks 147 00:07:56,720 --> 00:08:02,160 Speaker 1: on two themes.
One, that China doesn't let copyright protection 148 00:08:02,560 --> 00:08:06,160 Speaker 1: slow down the advance of AI, and so nor should 149 00:08:06,200 --> 00:08:10,400 Speaker 1: the US; and two, that China has added way, way 150 00:08:10,400 --> 00:08:12,960 Speaker 1: more power to their grid than the US in recent years, 151 00:08:13,200 --> 00:08:16,880 Speaker 1: and therefore the US needs to burn beautiful clean coal 152 00:08:17,360 --> 00:08:19,800 Speaker 1: and bring more nuclear power online to compete. 153 00:08:19,960 --> 00:08:22,280 Speaker 2: So Trump has actually been banging the drum for American 154 00:08:22,360 --> 00:08:25,400 Speaker 2: energy for a while, you know, Drill, baby, drill, and so on. 155 00:08:26,280 --> 00:08:29,200 Speaker 2: But I wasn't expecting him to weigh in on the 156 00:08:29,200 --> 00:08:32,240 Speaker 2: copyright issues, which are being heard in various courtrooms around 157 00:08:32,240 --> 00:08:35,880 Speaker 2: the country, including the New York Times suit against OpenAI. 158 00:08:36,160 --> 00:08:38,560 Speaker 1: Yeah, that's a great shout, Cara, and especially because the 159 00:08:38,559 --> 00:08:42,000 Speaker 1: administration has previously signaled that they would let the courts 160 00:08:42,000 --> 00:08:45,960 Speaker 1: decide on this copyright issue. And again, this wasn't mentioned 161 00:08:46,040 --> 00:08:48,800 Speaker 1: in the plan, the copyright issue. This was another Trump 162 00:08:48,920 --> 00:08:51,880 Speaker 1: ad lib. So that makes me think it's something which is, 163 00:08:52,000 --> 00:08:54,959 Speaker 1: for whatever reason, particularly important to him. It's certainly an indication 164 00:08:55,040 --> 00:08:57,480 Speaker 1: of where his head is. But a little bit more 165 00:08:57,480 --> 00:08:59,959 Speaker 1: about what's actually in the plan. It's broken into 166 00:09:00,040 --> 00:09:04,120 Speaker 1: three sections, which detail how the Trump administration plans to, 167 00:09:04,280 --> 00:09:10,800 Speaker 1: one, accelerate AI innovation; two, build American AI infrastructure; and, 168 00:09:10,880 --> 00:09:15,880 Speaker 1: three, lead in international AI diplomacy and security. Bloomberg actually 169 00:09:15,880 --> 00:09:18,880 Speaker 1: had an interesting analysis that arrived in my inbox with 170 00:09:18,960 --> 00:09:23,280 Speaker 1: the headline Trump AI Summit targets hardware as key to 171 00:09:23,400 --> 00:09:24,280 Speaker 1: US supremacy. 172 00:09:24,640 --> 00:09:28,679 Speaker 2: So that's interesting. So not much on software and research 173 00:09:28,960 --> 00:09:32,400 Speaker 2: and designing advanced models, but more on physical infrastructure and 174 00:09:32,440 --> 00:09:32,960 Speaker 2: build out. 175 00:09:33,200 --> 00:09:37,040 Speaker 1: That's right. As Bloomberg put it, Trump wants quote AI 176 00:09:37,160 --> 00:09:41,280 Speaker 1: infrastructure treated like any other national imperative, akin to the 177 00:09:41,320 --> 00:09:45,880 Speaker 1: interstate highway system. I mean, obviously Trump's background is in 178 00:09:45,960 --> 00:09:48,959 Speaker 1: real estate, and so I think there's a natural urge 179 00:09:49,000 --> 00:09:52,800 Speaker 1: towards construction, data centers, the physical artifacts of the AI 180 00:09:52,880 --> 00:09:56,400 Speaker 1: revolution, that may be partly in his personality.
And indeed, 181 00:09:56,480 --> 00:09:59,400 Speaker 1: right after the AI Action Plan was announced, he signed 182 00:09:59,400 --> 00:10:03,680 Speaker 1: an executive order to slash permitting timelines, loosening environmental restrictions for 183 00:10:03,800 --> 00:10:07,920 Speaker 1: data centers. And as we've discussed previously, Trump is continuing 184 00:10:07,920 --> 00:10:09,960 Speaker 1: to push and push and push on the importance of 185 00:10:10,000 --> 00:10:13,600 Speaker 1: manufacturing AI chips here in America. And again, to your 186 00:10:13,600 --> 00:10:17,080 Speaker 1: point about models and software and research, Sam Altman wasn't there, 187 00:10:17,480 --> 00:10:21,080 Speaker 1: Elon wasn't there. As I mentioned, the chip company Nvidia's CEO, 188 00:10:21,320 --> 00:10:23,280 Speaker 1: Jensen Huang, was. 189 00:10:23,120 --> 00:10:27,080 Speaker 2: How will this infrastructure and hardware lead to America actually dominating the 190 00:10:27,120 --> 00:10:27,839 Speaker 2: AI race? 191 00:10:28,280 --> 00:10:30,320 Speaker 1: Well, the theory is, if you get the world hooked 192 00:10:30,440 --> 00:10:35,200 Speaker 1: on American chips, American computing, and to be fair, American algorithms, 193 00:10:35,600 --> 00:10:38,120 Speaker 1: you are likely to win the AI race, or, as 194 00:10:38,160 --> 00:10:41,920 Speaker 1: the plan puts it, quote, decrease international dependence on AI 195 00:10:41,960 --> 00:10:44,120 Speaker 1: technologies developed by adversaries. 196 00:10:44,520 --> 00:10:46,320 Speaker 2: Can you explain to me what the deal is with 197 00:10:46,440 --> 00:10:47,120 Speaker 2: woke AI? 198 00:10:47,520 --> 00:10:50,800 Speaker 1: One of the recommended policies is to quote update federal 199 00:10:50,840 --> 00:10:54,920 Speaker 1: procurement guidelines to ensure that the government only contracts with 200 00:10:55,040 --> 00:10:58,960 Speaker 1: frontier large language model developers who ensure that their systems 201 00:10:59,000 --> 00:11:03,880 Speaker 1: are objective and free from top down ideological bias. I 202 00:11:03,880 --> 00:11:07,880 Speaker 1: mean, there's an irony, of course, to the government dictating 203 00:11:08,080 --> 00:11:11,040 Speaker 1: in the interests of free speech, and we've seen during 204 00:11:11,120 --> 00:11:13,640 Speaker 1: hundreds of hours of testimony by social media companies in 205 00:11:13,679 --> 00:11:16,680 Speaker 1: front of Congress that free speech is hard to define, 206 00:11:17,120 --> 00:11:18,520 Speaker 1: and so is bias, for that matter. 207 00:11:18,800 --> 00:11:21,360 Speaker 2: Yeah, I guess my question is like, will this actually 208 00:11:21,640 --> 00:11:23,840 Speaker 2: lead to anything, or is it just another place for 209 00:11:23,920 --> 00:11:25,920 Speaker 2: Trump to rail about DEI? 210 00:11:26,160 --> 00:11:28,800 Speaker 1: Oh, we don't know, but it's an important question.
There 211 00:11:28,800 --> 00:11:32,920 Speaker 1: are proposed penalties for companies that develop AI models that 212 00:11:32,960 --> 00:11:37,960 Speaker 1: reflect quote radical climate dogma and other woke issues, and 213 00:11:38,240 --> 00:11:39,959 Speaker 1: as many people have pointed out, this does get the 214 00:11:40,080 --> 00:11:42,920 Speaker 1: US into very dangerous territory in terms of free speech 215 00:11:42,960 --> 00:11:45,680 Speaker 1: and political freedoms, because, you know, if you go to China, 216 00:11:46,240 --> 00:11:50,160 Speaker 1: the models are not allowed to reference Tiananmen Square. And 217 00:11:50,200 --> 00:11:52,880 Speaker 1: some have said that the US, with this new policy, is 218 00:11:53,040 --> 00:11:57,040 Speaker 1: itself going down this path. It also puts tech companies 219 00:11:57,040 --> 00:12:00,840 Speaker 1: in a very difficult position of having to potentially interpret 220 00:12:00,880 --> 00:12:04,400 Speaker 1: the president's whims as to what's woke and what's not. 221 00:12:05,000 --> 00:12:09,280 Speaker 1: And this also raises technical challenges that have potentially unintended consequences. 222 00:12:09,880 --> 00:12:13,400 Speaker 1: There's no on off switch in a model's system prompt 223 00:12:13,480 --> 00:12:17,640 Speaker 1: for wokeness or climate awareness, and any attempt at a 224 00:12:17,840 --> 00:12:22,240 Speaker 1: change could have downstream impacts on the AI model's overall reasoning, 225 00:12:22,720 --> 00:12:24,960 Speaker 1: which might mean, for example, that it gets worse at 226 00:12:25,160 --> 00:12:28,560 Speaker 1: modeling extreme weather patterns. As we know and have discussed 227 00:12:28,600 --> 00:12:32,000 Speaker 1: at length, it's basically impossible to get models to behave 228 00:12:32,320 --> 00:12:34,960 Speaker 1: exactly as we want them to because they remain black 229 00:12:34,960 --> 00:12:37,960 Speaker 1: boxes and they've been trained on the entire corpus of 230 00:12:38,000 --> 00:12:39,760 Speaker 1: digitized human knowledge. 231 00:12:40,080 --> 00:12:41,560 Speaker 2: Just before we get out of this story, I do 232 00:12:41,600 --> 00:12:43,959 Speaker 2: want to point out that I read in this newsletter 233 00:12:44,000 --> 00:12:46,840 Speaker 2: Blood in the Machine that Trump is apparently a fan 234 00:12:46,920 --> 00:12:50,960 Speaker 2: of using AI technology himself. Right, some of his tasteful 235 00:12:51,000 --> 00:12:54,760 Speaker 2: AI posts include a video of his team arresting Obama, 236 00:12:55,200 --> 00:12:58,080 Speaker 2: a Studio Ghibli style image of a migrant woman being 237 00:12:58,200 --> 00:13:01,600 Speaker 2: arrested by ICE, and a depiction of Gaza being paved 238 00:13:01,640 --> 00:13:05,640 Speaker 2: over and, as he promised, turned into a luxury resort. 239 00:13:05,400 --> 00:13:07,920 Speaker 1: A Gaza Riviera. I mean, yeah, it is interesting that 240 00:13:07,960 --> 00:13:11,200 Speaker 1: those are the three deepfakes that he's chosen to post. 241 00:13:11,360 --> 00:13:14,000 Speaker 2: He uses them for good like we do. But staying 242 00:13:14,040 --> 00:13:16,640 Speaker 2: in the realm of viral content, I know it has 243 00:13:16,720 --> 00:13:18,959 Speaker 2: been a few weeks now since the Coldplay concert 244 00:13:18,960 --> 00:13:22,280 Speaker 2: that exposed an affair and ruined a tech CEO's career 245 00:13:22,800 --> 00:13:26,640 Speaker 2: and his and his wife's marriage.
I refuse to let 246 00:13:26,720 --> 00:13:27,120 Speaker 2: it go. 247 00:13:27,040 --> 00:13:28,560 Speaker 1: Ah, the kiss cam. I'm with you on this. 248 00:13:28,760 --> 00:13:32,280 Speaker 2: I mean, we're obsessed, and clearly the good people at 249 00:13:32,400 --> 00:13:35,559 Speaker 2: Wired think similarly, because they just published a deep dive 250 00:13:35,679 --> 00:13:38,640 Speaker 2: on the rise of cheating stings posted on social media 251 00:13:38,920 --> 00:13:40,880 Speaker 2: and the internet shaming that follows. 252 00:13:41,240 --> 00:13:43,200 Speaker 1: Yeah, I don't think the kiss cam at the Coldplay 253 00:13:43,240 --> 00:13:46,760 Speaker 1: concert was intended to be a cheating sting, but it 254 00:13:46,800 --> 00:13:50,080 Speaker 1: certainly seemed to have that effect, and Chris Martin acknowledged it. 255 00:13:50,400 --> 00:13:53,080 Speaker 1: But yeah, before we started taping today, you sent me 256 00:13:53,160 --> 00:13:59,000 Speaker 1: this piece, which had a very, very delicious subheadline, which 257 00:13:59,040 --> 00:14:04,079 Speaker 1: was, quote, private investigator influencers are staking out suspected cheaters 258 00:14:04,120 --> 00:14:07,080 Speaker 1: and vetting dates for their clients, posting the tea for 259 00:14:07,120 --> 00:14:11,319 Speaker 1: their followers. But there's a dark side to morality based surveillance. 260 00:14:11,800 --> 00:14:14,360 Speaker 1: Who knew. Maybe this is naive of me, but I 261 00:14:14,440 --> 00:14:17,800 Speaker 1: had no idea there were actually private investigator influencers. I 262 00:14:17,800 --> 00:14:20,960 Speaker 1: always thought of Internet sleuthing as this kind of more 263 00:14:21,560 --> 00:14:25,520 Speaker 1: mass, democratic activity, where, for example, the Internet got behind 264 00:14:25,640 --> 00:14:29,720 Speaker 1: finding the missing Gabby Petito's white campervan a few years 265 00:14:29,520 --> 00:14:34,840 Speaker 2: back, famously before the police. I'm glad you're naive about this, 266 00:14:35,080 --> 00:14:37,840 Speaker 2: as usual, with social media. I want to tell you 267 00:14:37,920 --> 00:14:41,080 Speaker 2: a little bit about these private investigators and this article, 268 00:14:41,240 --> 00:14:43,280 Speaker 2: just so you get a sense of what followers 269 00:14:43,320 --> 00:14:46,200 Speaker 2: can see online. Here's a TikTok from a private investigator 270 00:14:46,400 --> 00:14:50,120 Speaker 2: who goes by the username Your Fave Investigator. The name! 271 00:14:50,120 --> 00:14:53,600 Speaker 5: I got hired by a husband to follow his wife. 272 00:14:53,680 --> 00:14:55,080 Speaker 5: So the wife has been hanging out with a new 273 00:14:55,080 --> 00:14:57,560 Speaker 5: friend, we'll call her Stephanie, and they're always going out and 274 00:14:57,560 --> 00:14:59,320 Speaker 5: coming home drunk, and he thinks that she's out in 275 00:14:59,360 --> 00:15:02,600 Speaker 5: them streets meeting guys because Stephanie's single. So a black 276 00:15:02,720 --> 00:15:06,000 Speaker 5: SUV pulled up to the residence and picked up the 277 00:15:06,080 --> 00:15:09,000 Speaker 5: wife, and it was Stephanie. Ended up following them, and 278 00:15:09,040 --> 00:15:11,120 Speaker 5: they took me to a restaurant. And then my mouth 279 00:15:11,200 --> 00:15:12,920 Speaker 5: hit the floor, because when they got out, they were 280 00:15:12,960 --> 00:15:16,280 Speaker 5: holding hands and they kissed at the restaurant.
I was like, damn, 281 00:15:16,320 --> 00:15:18,680 Speaker 5: I'm about to tell the husband this. Updated the client. 282 00:15:18,760 --> 00:15:19,800 Speaker 2: He was like, excuse me. 283 00:15:19,920 --> 00:15:23,160 Speaker 5: I said, yes, sir, sorry about that. She also went 284 00:15:23,200 --> 00:15:24,720 Speaker 5: back to Stephanie's house, sir. 285 00:15:24,760 --> 00:15:27,720 Speaker 2: Surveillance was discontinued because he was heated. So, Oz, that 286 00:15:27,840 --> 00:15:30,680 Speaker 2: video has over two million views. Wow, and there are 287 00:15:30,800 --> 00:15:33,720 Speaker 2: so many videos on her profile. I actually really can't 288 00:15:33,720 --> 00:15:36,680 Speaker 2: believe that so many people hire a private investigator to 289 00:15:36,720 --> 00:15:37,880 Speaker 2: tail their partners. 290 00:15:38,160 --> 00:15:41,040 Speaker 1: What is the conclusion of the format here? Do they 291 00:15:41,040 --> 00:15:44,400 Speaker 1: actually confront the cheating partner, or how do these stories end? 292 00:15:44,720 --> 00:15:46,920 Speaker 2: A lot of these videos do end with the PI 293 00:15:47,120 --> 00:15:50,960 Speaker 2: catching someone cheating, but there are a few wholesome additions. 294 00:15:51,000 --> 00:15:54,600 Speaker 2: My favorite is when a girl suspects her dad of cheating, 295 00:15:54,920 --> 00:15:57,200 Speaker 2: but he is in fact just going to an outdoor 296 00:15:57,280 --> 00:15:59,400 Speaker 2: mall and grabbing some much needed alone time. 297 00:16:00,120 --> 00:16:07,800 Speaker 1: Truly, truly wholesome. But beyond the obvious violation of being 298 00:16:07,840 --> 00:16:10,320 Speaker 1: trailed by someone, it also seems like a bit of 299 00:16:10,320 --> 00:16:13,880 Speaker 1: a violation to post about strangers' personal lives on TikTok. 300 00:16:14,160 --> 00:16:18,160 Speaker 2: It definitely is. But people like Your Fave Investigator are 301 00:16:18,560 --> 00:16:21,320 Speaker 2: real private investigators. What I mean is, they're trained and licensed, 302 00:16:21,760 --> 00:16:24,680 Speaker 2: and they are careful not to leak identifying information like 303 00:16:24,720 --> 00:16:27,680 Speaker 2: pictures of these people's houses or even clear video of 304 00:16:27,720 --> 00:16:28,360 Speaker 2: their faces. 305 00:16:28,400 --> 00:16:32,000 Speaker 1: In fact, the adulterers in Your Fave Investigator's TikTok video 306 00:16:32,000 --> 00:16:35,600 Speaker 1: that we just saw had tasteful heart eye emojis covering 307 00:16:35,640 --> 00:16:36,240 Speaker 1: their faces. 308 00:16:36,280 --> 00:16:39,080 Speaker 2: So, privacy. She's by the book, she's by the book. 309 00:16:39,560 --> 00:16:42,120 Speaker 2: She actually, the person we were just mentioning, told Wired 310 00:16:42,160 --> 00:16:44,680 Speaker 2: that she only posts if her clients say it's okay 311 00:16:44,680 --> 00:16:46,640 Speaker 2: with them, which is crazy, that the client would be like, yeah, 312 00:16:46,680 --> 00:16:47,840 Speaker 2: go post the video, babe. 313 00:16:47,920 --> 00:16:49,400 Speaker 1: I guess the client might want to shame their own... 314 00:16:49,480 --> 00:16:50,840 Speaker 1: It's a gotcha moment. Yeah, that's what it is. 315 00:16:50,880 --> 00:16:51,960 Speaker 2: It's a gotcha moment. 316 00:16:52,080 --> 00:16:52,240 Speaker 3: Yeah.
317 00:16:52,280 --> 00:16:54,640 Speaker 1: I mean, we're constantly kind of dancing around this theme 318 00:16:54,680 --> 00:16:57,000 Speaker 1: on Tech Stuff, which is about how careful you should 319 00:16:57,000 --> 00:17:00,840 Speaker 1: be with what you reveal online, and how the internet, 320 00:17:00,920 --> 00:17:03,520 Speaker 1: when it wants to know something, finds a way. 321 00:17:03,760 --> 00:17:06,920 Speaker 2: It is very true, and our friends at 404 Media reminded 322 00:17:06,920 --> 00:17:10,000 Speaker 2: me that there is a whole TikTok trend asking viewers 323 00:17:10,000 --> 00:17:14,040 Speaker 2: to help in identifying strangers. People will post a video 324 00:17:14,040 --> 00:17:17,400 Speaker 2: of someone they flirted with or thought was cute and say, TikTok, 325 00:17:17,840 --> 00:17:18,760 Speaker 2: help me find him. 326 00:17:18,960 --> 00:17:21,159 Speaker 1: Do you remember Missed Connections on Craigslist? 327 00:17:21,359 --> 00:17:23,560 Speaker 2: I do, I do. But this is more of like 328 00:17:23,600 --> 00:17:27,600 Speaker 2: a collective effort by the TikTok community to identify these people. 329 00:17:28,440 --> 00:17:31,400 Speaker 2: And even though most of these examples can be sweet, 330 00:17:31,520 --> 00:17:33,719 Speaker 2: it's still private surveillance. 331 00:17:33,960 --> 00:17:37,160 Speaker 1: Yeah. We talked about location sharing this week. It does 332 00:17:37,200 --> 00:17:39,800 Speaker 1: feel like many of us, and especially Gen Z perhaps, 333 00:17:40,119 --> 00:17:44,600 Speaker 1: are accepting kind of constant private surveillance as a fact of life. Yeah. 334 00:17:44,600 --> 00:17:46,560 Speaker 2: And I think one other thing that's important to mention 335 00:17:46,720 --> 00:17:51,080 Speaker 2: is it's normalizing public shaming, which is how some viewers 336 00:17:51,119 --> 00:17:54,040 Speaker 2: are reacting to these videos of people getting caught having affairs. 337 00:17:54,680 --> 00:17:56,879 Speaker 2: And I think for a lot of people, shaming seems 338 00:17:56,880 --> 00:17:58,320 Speaker 2: to be akin to justice. 339 00:17:58,480 --> 00:18:00,480 Speaker 1: I could never imagine asking you this question, but 340 00:18:00,600 --> 00:18:05,280 Speaker 1: when people get outed by TikTok influencer private investigators, do 341 00:18:05,359 --> 00:18:08,280 Speaker 1: you think that that is justice, or is it cyberbullying? 342 00:18:08,359 --> 00:18:10,560 Speaker 2: I think that's a good question. You know, cyberbullying is 343 00:18:10,600 --> 00:18:13,920 Speaker 2: a huge problem. As of twenty twenty three, seventeen percent 344 00:18:13,960 --> 00:18:17,680 Speaker 2: of adolescents say they have been cyberbullied, and nine point 345 00:18:17,800 --> 00:18:21,439 Speaker 2: five percent of adolescents have made a serious suicide attempt. 346 00:18:21,600 --> 00:18:24,280 Speaker 2: And that's according to the Centers for Disease Control and Prevention. 347 00:18:24,720 --> 00:18:28,120 Speaker 2: So no matter how you define it, I think it's 348 00:18:28,160 --> 00:18:30,840 Speaker 2: important to rethink public shaming and what it does.
349 00:18:31,040 --> 00:18:33,640 Speaker 1: Yeah, I mean, I felt a little bit icky about 350 00:18:33,680 --> 00:18:36,680 Speaker 1: how much I enjoyed the Coldplay kiss cam, because obviously, 351 00:18:36,720 --> 00:18:38,960 Speaker 1: like, I mean, just the look in their eyes and 352 00:18:39,000 --> 00:18:41,199 Speaker 1: the duck and weave and the Chris Martin comment is 353 00:18:41,280 --> 00:18:46,120 Speaker 1: just completely irresistible. But my god, those people's lives are 354 00:18:46,160 --> 00:18:47,040 Speaker 1: not fun right now. 355 00:18:47,400 --> 00:18:49,439 Speaker 2: I just think, there was a whole book called So 356 00:18:49,480 --> 00:18:52,240 Speaker 2: You've Been Publicly Shamed, and it dealt with these matters, 357 00:18:52,280 --> 00:18:54,159 Speaker 2: and I just think what happens is that when we 358 00:18:54,280 --> 00:18:58,119 Speaker 2: focus on shaming individuals, we forget that they are real people. 359 00:18:58,320 --> 00:19:01,439 Speaker 2: And I think seeing these things through a phone really 360 00:19:01,520 --> 00:19:03,240 Speaker 2: makes us forget that there are real people on the 361 00:19:03,280 --> 00:19:06,679 Speaker 2: other side of the screen. But in the article, Wired 362 00:19:06,760 --> 00:19:10,160 Speaker 2: talked to a professor at Queens College who studies Internet literacy, 363 00:19:10,200 --> 00:19:13,000 Speaker 2: who said that she thinks of shaming as, quote, the 364 00:19:13,080 --> 00:19:18,280 Speaker 2: extension of the algorithmic flow towards extremism. The Internet normalizes 365 00:19:18,359 --> 00:19:22,840 Speaker 2: content as it progresses, meaning anything extreme must continue to 366 00:19:22,880 --> 00:19:24,040 Speaker 2: become more extreme. 367 00:19:24,480 --> 00:19:28,040 Speaker 1: I think that's well put. I mean, obviously, the algorithm 368 00:19:28,320 --> 00:19:32,880 Speaker 1: favors extreme content because it drives more engagement, and then 369 00:19:33,240 --> 00:19:37,280 Speaker 1: creators in turn create more extreme content in order to 370 00:19:37,320 --> 00:19:41,119 Speaker 1: get more engagement. And it's this kind of vicious circle 371 00:19:41,119 --> 00:19:43,480 Speaker 1: that we've seen show up in all kinds of facets of 372 00:19:43,480 --> 00:19:46,360 Speaker 1: social media all over the Internet, and then in turn 373 00:19:46,440 --> 00:19:49,679 Speaker 1: it kind of normalizes this use of people's private lives 374 00:19:49,680 --> 00:20:00,800 Speaker 1: for entertainment. After the break, why you shouldn't let AI run 375 00:20:00,840 --> 00:20:17,719 Speaker 1: your business, at least not yet. Stay with us. Welcome back. 376 00:20:17,760 --> 00:20:20,000 Speaker 1: We've got a few more headlines for you this week. 377 00:20:19,840 --> 00:20:22,680 Speaker 2: And then a story about how people are really using 378 00:20:22,800 --> 00:20:25,320 Speaker 2: chatbots. That's in our segment Chat and Me. 379 00:20:25,560 --> 00:20:28,040 Speaker 1: When you say people, you mean, of course, yourself. 380 00:20:28,119 --> 00:20:30,280 Speaker 1: I'm the people. You're the people this week, but we 381 00:20:30,320 --> 00:20:32,320 Speaker 1: hope you won't be next week. Before we go into 382 00:20:32,359 --> 00:20:35,200 Speaker 1: the headlines, we want to remind you, our dear listeners, 383 00:20:35,400 --> 00:20:38,760 Speaker 1: that we want to feature you, not Cara, in the 384 00:20:38,840 --> 00:20:41,400 Speaker 1: Chat and Me segment going forward.
So if you've found 385 00:20:41,400 --> 00:20:44,960 Speaker 1: yourself using ChatGPT, Grok, Claude, Gemini, or any 386 00:20:44,960 --> 00:20:47,760 Speaker 1: other chatbot to help with an unusual task or to 387 00:20:47,800 --> 00:20:51,440 Speaker 1: answer life's complicated questions, please, please send us a one 388 00:20:51,560 --> 00:20:54,479 Speaker 1: to two minute voice note to tech Stuff podcast at gmail 389 00:20:54,480 --> 00:20:55,080 Speaker 1: dot com. 390 00:20:55,119 --> 00:20:58,760 Speaker 2: Seriously, we want to understand how AI is changing your lives. 391 00:20:59,040 --> 00:21:01,440 Speaker 1: I can tell you one thing. In the meantime, AI 392 00:21:01,600 --> 00:21:04,120 Speaker 1: won't be running your business, or at least my business, 393 00:21:04,200 --> 00:21:05,000 Speaker 1: anytime soon. 394 00:21:05,160 --> 00:21:06,760 Speaker 2: And why is that? Do you have a horror story? 395 00:21:06,960 --> 00:21:10,080 Speaker 1: Well, yes, but it's on a small scale. It involves, 396 00:21:10,160 --> 00:21:14,240 Speaker 1: funnily enough, the AI company Anthropic, who let Claude, their 397 00:21:14,320 --> 00:21:18,600 Speaker 1: AI model, run an automated store at their office. Claude 398 00:21:18,640 --> 00:21:21,920 Speaker 1: was given a small refrigerator and an iPad for self checkout, 399 00:21:22,240 --> 00:21:26,000 Speaker 1: plus instructions on how to run a profitable shop. So basically, 400 00:21:26,119 --> 00:21:30,440 Speaker 1: the AI needed to maintain inventory, set prices, avoid bankruptcy, 401 00:21:30,640 --> 00:21:32,879 Speaker 1: and other important business fundamentals. 402 00:21:33,000 --> 00:21:35,359 Speaker 2: So did Claude, like, not charge enough and go broke? 403 00:21:35,880 --> 00:21:39,760 Speaker 1: Yes, along with many other mistakes along the way. Here's 404 00:21:39,760 --> 00:21:43,240 Speaker 1: one of the big ones. When taking Venmo payments, the 405 00:21:43,359 --> 00:21:46,760 Speaker 1: model asked customers to pay an account that it actually 406 00:21:46,840 --> 00:21:50,480 Speaker 1: had hallucinated, one which didn't exist, and then it also 407 00:21:50,560 --> 00:21:53,760 Speaker 1: decided to give Anthropic employees a twenty-five percent discount. 408 00:21:53,960 --> 00:21:55,680 Speaker 2: But I thought the store was at the office. 409 00:21:55,760 --> 00:21:58,879 Speaker 1: Yeah, exactly, it was the company's store. So basically every transaction 410 00:21:59,000 --> 00:22:02,480 Speaker 1: was discounted. But here's where Claude went completely off the rails. 411 00:22:02,840 --> 00:22:05,040 Speaker 1: At one point, it hallucinated that it was a real 412 00:22:05,160 --> 00:22:08,760 Speaker 1: human, and then claimed that its human form was wearing 413 00:22:08,800 --> 00:22:11,840 Speaker 1: a navy blue blazer and a red tie. When one 414 00:22:11,840 --> 00:22:15,159 Speaker 1: employee tried to correct Claude and say, no, you're AI, 415 00:22:16,119 --> 00:22:17,000 Speaker 1: guess what Claude did. 416 00:22:17,119 --> 00:22:17,800 Speaker 2: What did Claude do? 417 00:22:17,960 --> 00:22:21,359 Speaker 1: It sent a number of emails to security, informing on 418 00:22:21,400 --> 00:22:22,760 Speaker 1: the employee for trying to gaslight it. 419 00:22:22,920 --> 00:22:23,520 Speaker 2: It snitched. 420 00:22:23,560 --> 00:22:26,880 Speaker 1: It snitched.
Somebody smartly pointed out that although having an 421 00:22:26,960 --> 00:22:29,800 Speaker 1: LLM run an office vending machine feels like kind of 422 00:22:29,800 --> 00:22:33,640 Speaker 1: a small, quirky story, if it had been more successful 423 00:22:33,840 --> 00:22:36,800 Speaker 1: in automating all of the things required to stay stocked 424 00:22:36,960 --> 00:22:39,520 Speaker 1: and financially solvent, this would have been kind of a 425 00:22:39,520 --> 00:22:43,359 Speaker 1: watershed moment in terms of a model successfully running a 426 00:22:43,359 --> 00:22:46,080 Speaker 1: business in the real world. But we're safe for 427 00:22:46,040 --> 00:22:49,240 Speaker 2: now. For now, for now. Okay, I'm going to teach 428 00:22:49,280 --> 00:22:54,000 Speaker 2: you some new phrases today, two of them, relating 429 00:22:54,040 --> 00:22:58,119 Speaker 2: to the endlessly entertaining life hack which is online dating. 430 00:22:58,160 --> 00:23:00,600 Speaker 2: People said they wanted to swipe from home, and they 431 00:23:00,640 --> 00:23:02,840 Speaker 2: got it. Have you heard of stack dating? 432 00:23:03,119 --> 00:23:07,160 Speaker 1: I've heard of the tech stack. I've heard of full 433 00:23:07,160 --> 00:23:10,320 Speaker 1: stack engineers. I haven't heard of stack dating. No. 434 00:23:10,320 --> 00:23:13,240 Speaker 2: No? You know, I actually hadn't either. But it's how 435 00:23:13,320 --> 00:23:17,080 Speaker 2: Gen Z is apparently optimizing dating. You would do this, 436 00:23:17,160 --> 00:23:18,919 Speaker 2: by the way. This is like, you are efficient in 437 00:23:18,920 --> 00:23:23,359 Speaker 2: this way. They basically schedule dates between errands, before work, 438 00:23:24,000 --> 00:23:25,440 Speaker 2: or even during work. 439 00:23:25,480 --> 00:23:26,240 Speaker 1: They stack them up. 440 00:23:26,440 --> 00:23:27,600 Speaker 2: They do, they do. 441 00:23:28,560 --> 00:23:30,080 Speaker 1: How popular is this? 442 00:23:30,200 --> 00:23:33,439 Speaker 2: So, according to a Tinder Future of Dating report, about fifty 443 00:23:33,480 --> 00:23:35,800 Speaker 2: one percent of eighteen to twenty-five-year-old Tinder 444 00:23:35,880 --> 00:23:39,520 Speaker 2: users are doing this stack dating, and thirty-two percent 445 00:23:39,560 --> 00:23:42,600 Speaker 2: are meeting up for dates during the workday. Some people 446 00:23:42,600 --> 00:23:45,640 Speaker 2: are even setting up their own speed dating sessions, scheduling 447 00:23:45,720 --> 00:23:46,800 Speaker 2: dates back to back. 448 00:23:47,040 --> 00:23:49,840 Speaker 1: This is kind of interesting, how life begins to 449 00:23:49,840 --> 00:23:52,600 Speaker 1: mimic the algorithm, right? It's like you're scrolling on Tinder, 450 00:23:52,880 --> 00:23:56,760 Speaker 1: you're swiping, and then you basically recreate the experience of 451 00:23:56,840 --> 00:23:59,439 Speaker 1: Tinder in real life by stacking up all these people. 452 00:23:59,800 --> 00:24:02,720 Speaker 1: I guess, you know, in a sense, much like online dating itself, 453 00:24:03,040 --> 00:24:04,960 Speaker 1: this can take some of the pressure off, because you 454 00:24:05,000 --> 00:24:07,800 Speaker 1: can just move on quickly if it was only squeezed 455 00:24:07,800 --> 00:24:09,040 Speaker 1: between your other errands.
456 00:24:09,080 --> 00:24:12,280 Speaker 2: Anyway, it's true. I mean, I think it's definitely mimicking 457 00:24:12,440 --> 00:24:14,719 Speaker 2: the way that we do everything on our phones, and 458 00:24:14,800 --> 00:24:18,480 Speaker 2: dating is no exception. But if you've gone on a 459 00:24:18,520 --> 00:24:21,520 Speaker 2: thirty minute date, if you've done the stack dating, and 460 00:24:21,560 --> 00:24:24,280 Speaker 2: you've gotten bad vibes from the stack date that you 461 00:24:24,320 --> 00:24:27,920 Speaker 2: went on, you might start speed dumping. 462 00:24:29,520 --> 00:24:31,640 Speaker 1: I mean, I think I can guess what this means, 463 00:24:31,640 --> 00:24:32,480 Speaker 1: but please elaborate. 464 00:24:32,880 --> 00:24:36,840 Speaker 2: So apparently it's a reaction to another online phenomenon, ghosting, 465 00:24:37,119 --> 00:24:40,439 Speaker 2: where people are sick of being left in limbo between dates. 466 00:24:40,800 --> 00:24:44,080 Speaker 2: So sometimes they're just cutting straight to the chase and 467 00:24:44,080 --> 00:24:47,880 Speaker 2: saying, it's not you, it's me, hours after the first date. 468 00:24:48,160 --> 00:24:51,359 Speaker 1: So what you're saying is, maybe I have the bad 469 00:24:51,440 --> 00:24:55,080 Speaker 1: experience of having been ghosted or being left in limbo. 470 00:24:55,600 --> 00:24:58,199 Speaker 1: So in order to save you, my thirty minute, my 471 00:24:58,240 --> 00:25:02,000 Speaker 1: thirty minute stack date, from that painful fate, I should 472 00:25:02,040 --> 00:25:04,400 Speaker 1: follow up with you right away and say, PS, 473 00:25:04,600 --> 00:25:06,680 Speaker 1: I'm not interested whatsoever, good luck with your life. 474 00:25:06,760 --> 00:25:09,199 Speaker 2: It's like overcompensating one oh one. It's like, I'm not 475 00:25:09,200 --> 00:25:11,639 Speaker 2: even giving you time to think what you thought of me. 476 00:25:12,160 --> 00:25:15,080 Speaker 2: I'm just going to overcommunicate here and speed dump you. 477 00:25:15,520 --> 00:25:17,439 Speaker 2: There was a story in the Wall Street Journal, and 478 00:25:17,520 --> 00:25:20,400 Speaker 2: it was interesting because some of the people interviewed felt 479 00:25:20,480 --> 00:25:24,560 Speaker 2: like speed dumping was a great antidote to ghosting. Others 480 00:25:24,560 --> 00:25:27,800 Speaker 2: felt it was kind of like this competitive race to 481 00:25:27,840 --> 00:25:30,320 Speaker 2: tell the other person you're not interested, I think, to 482 00:25:30,440 --> 00:25:32,760 Speaker 2: avoid the feeling of ever being dumped. 483 00:25:32,840 --> 00:25:34,520 Speaker 1: You can't get rejected if you do it first. 484 00:25:34,800 --> 00:25:35,760 Speaker 2: That's exactly right. 485 00:25:36,400 --> 00:25:39,080 Speaker 1: Well, the wheel comes full circle on my headlines. You know, 486 00:25:39,520 --> 00:25:42,880 Speaker 1: I feel a little bit seen, at least by myself. 487 00:25:43,119 --> 00:25:46,720 Speaker 2: Whenever you say you feel seen before an article, I'm like, 488 00:25:46,800 --> 00:25:47,400 Speaker 2: oh God. 489 00:25:47,320 --> 00:25:51,000 Speaker 1: I feel seen because I've been attracted both to the 490 00:25:51,040 --> 00:25:55,439 Speaker 1: toilet situation in the Tesla cyber cafe and 491 00:25:55,480 --> 00:25:57,920 Speaker 1: also to a story in the Wall Street Journal about 492 00:25:58,000 --> 00:25:59,040 Speaker 1: public bathrooms.
493 00:25:59,440 --> 00:26:02,199 Speaker 2: Everybody has their public toilet resource. I'm more of a 494 00:26:02,240 --> 00:26:03,880 Speaker 2: Barnes and Noble girl myself. 495 00:26:04,640 --> 00:26:06,840 Speaker 1: I love to read. Well, funny, there was this line 496 00:26:06,880 --> 00:26:08,840 Speaker 1: in the story that really got me, which is: more 497 00:26:08,880 --> 00:26:11,880 Speaker 1: Americans could soon have a place that answers nature's call 498 00:26:12,359 --> 00:26:15,439 Speaker 1: without first buying a drink at Starbucks or a book. 499 00:26:15,600 --> 00:26:18,760 Speaker 1: Or a book. But I read this story in the 500 00:26:18,840 --> 00:26:21,679 Speaker 1: Journal about a tech company called Throne Labs. 501 00:26:21,760 --> 00:26:23,160 Speaker 2: Brilliant name. Yeah, it is a good name. 502 00:26:23,240 --> 00:26:26,520 Speaker 1: They're trying to revolutionize public bathrooms. You may have already 503 00:26:26,560 --> 00:26:28,960 Speaker 1: seen self-cleaning bathrooms, but they're about to get a 504 00:26:29,000 --> 00:26:33,040 Speaker 1: lot smarter. Your toilet is about to get to know you. 505 00:26:33,320 --> 00:26:35,680 Speaker 2: Does someone go, oh, there's a toilet, that toilet needs 506 00:26:35,720 --> 00:26:36,240 Speaker 2: to be smarter? 507 00:26:36,600 --> 00:26:38,880 Speaker 1: Well, I think there is a real problem, right, which 508 00:26:38,880 --> 00:26:42,480 Speaker 1: is public restrooms are not available in the US. The 509 00:26:42,600 --> 00:26:46,320 Speaker 1: US is tied with Botswana in thirtieth place for the 510 00:26:46,400 --> 00:26:51,560 Speaker 1: lowest number of public restrooms per capita, and Throne has 511 00:26:51,600 --> 00:26:54,200 Speaker 1: realized that part of the problem is public restrooms get 512 00:26:54,240 --> 00:26:55,879 Speaker 1: destroyed by the public. 513 00:26:55,880 --> 00:26:58,000 Speaker 2: And we are going too deep today, right? 514 00:27:00,160 --> 00:27:03,280 Speaker 1: Part of the solution is, of course, a rating system, 515 00:27:03,720 --> 00:27:06,760 Speaker 1: think about Uber, but for a toilet. If you don't 516 00:27:06,800 --> 00:27:08,879 Speaker 1: take care of the throne, if you don't keep that 517 00:27:08,920 --> 00:27:11,119 Speaker 1: throne polished, you're not gonna be able to sit on 518 00:27:11,160 --> 00:27:14,520 Speaker 1: it again. So Throne bathrooms are free, but if you 519 00:27:14,520 --> 00:27:16,440 Speaker 1: don't have a phone to create an account, you can actually 520 00:27:16,440 --> 00:27:19,200 Speaker 1: get a key card for entry. But in all cases 521 00:27:19,600 --> 00:27:22,199 Speaker 1: it's linked to you as a user, and when you 522 00:27:22,320 --> 00:27:25,320 Speaker 1: enter the bathroom you're expected to rate its cleanliness. There 523 00:27:25,320 --> 00:27:29,919 Speaker 1: are smoke detectors (no smoking), occupancy sensors (one at a time), 524 00:27:30,480 --> 00:27:33,960 Speaker 1: and sessions limited to ten minutes, so no doomscrolling, 525 00:27:35,280 --> 00:27:37,760 Speaker 1: which is the best thing to do on the toilet. Absolutely. 526 00:27:38,000 --> 00:27:41,159 Speaker 1: So far, there are over one hundred Thrones throughout the US, 527 00:27:41,320 --> 00:27:44,280 Speaker 1: and the company's working on adding additional features, like a 528 00:27:44,320 --> 00:27:48,120 Speaker 1: smell sensor. The smell sensor is to alert the care 529 00:27:48,200 --> 00:27:50,920 Speaker 1: and maintenance teams that they need to come and pay a visit.
530 00:27:51,640 --> 00:27:53,600 Speaker 1: I like that line about Starbucks and not having to 531 00:27:53,600 --> 00:27:55,439 Speaker 1: buy a drink just to go to the toilet. But 532 00:27:55,480 --> 00:27:57,320 Speaker 1: I also like the way the article summed it up, 533 00:27:57,400 --> 00:28:01,320 Speaker 1: quote: the brains behind Throne start by getting real about 534 00:28:01,359 --> 00:28:05,560 Speaker 1: why Americans usually can't have nice things. They assume a 535 00:28:05,640 --> 00:28:09,919 Speaker 1: cultural inability to protect and maintain shared assets and design 536 00:28:10,000 --> 00:28:14,440 Speaker 1: their system with software and just enough internet connected sensors 537 00:28:14,680 --> 00:28:18,720 Speaker 1: to monitor facilities without violating our expectation of privacy. 538 00:28:23,240 --> 00:28:26,280 Speaker 2: And now it's time for Chat and Me. This week, I 539 00:28:26,320 --> 00:28:29,959 Speaker 2: have a story from me. This is my story. 540 00:28:30,040 --> 00:28:31,640 Speaker 1: Yes, Cara, you go. 541 00:28:32,200 --> 00:28:38,080 Speaker 2: So this week I ignored the ye olde hypochondria staple, WebMD, 542 00:28:38,720 --> 00:28:41,760 Speaker 2: and my friend and I used ChatGPT on the 543 00:28:41,800 --> 00:28:44,640 Speaker 2: couch to help diagnose myself. 544 00:28:44,960 --> 00:28:47,880 Speaker 1: Well, I'm obviously sorry to hear that you're sick. What 545 00:28:48,680 --> 00:28:53,400 Speaker 1: were your symptoms, and how did you decide to eschew WebMD 546 00:28:53,560 --> 00:28:54,960 Speaker 1: in favor of the future? 547 00:28:55,200 --> 00:28:56,960 Speaker 2: Well, I always make a promise to myself not to 548 00:28:57,040 --> 00:28:59,680 Speaker 2: use WebMD, because that's what hypochondriacs do, and I don't 549 00:28:59,680 --> 00:29:02,640 Speaker 2: want to self identify. I don't want to be in 550 00:29:02,640 --> 00:29:03,840 Speaker 2: the club that would have me as a member. 551 00:29:04,520 --> 00:29:05,880 Speaker 1: Chat was your fig leaf here. 552 00:29:05,840 --> 00:29:10,880 Speaker 2: That's correct. However, two weeks ago, I started feeling extremely tired, 553 00:29:10,960 --> 00:29:12,920 Speaker 2: and I was lamenting to a lot of my friends 554 00:29:13,440 --> 00:29:15,960 Speaker 2: that I was sleeping like a college freshman, all the, 555 00:29:16,240 --> 00:29:18,520 Speaker 2: all the time. And I'll be honest with you, today it was 556 00:29:18,840 --> 00:29:20,600 Speaker 2: very hard for me to get out of bed. Interesting. 557 00:29:20,880 --> 00:29:24,760 Speaker 2: And last Thursday I got out of bed, and, you know, 558 00:29:24,840 --> 00:29:26,840 Speaker 2: I looked at myself in the mirror, as one does, 559 00:29:27,080 --> 00:29:30,360 Speaker 2: and I saw this really nasty rash on my leg. 560 00:29:32,080 --> 00:29:34,760 Speaker 2: But I thought, you know what, Cara, don't get upset 561 00:29:34,760 --> 00:29:37,000 Speaker 2: about this. It'll go away. 562 00:29:37,080 --> 00:29:39,000 Speaker 1: So you didn't, you didn't WebMD? Nothing? 563 00:29:39,040 --> 00:29:41,040 Speaker 2: No, no, I was. I was really trying to be good. 564 00:29:41,040 --> 00:29:43,160 Speaker 2: And you know they always say, give it four days. 565 00:29:43,160 --> 00:29:45,320 Speaker 2: If it gets worse, do something about it. So I 566 00:29:45,360 --> 00:29:49,720 Speaker 2: gave it four days. It got much worse, and people 567 00:29:49,720 --> 00:29:51,520 Speaker 2: had opinions about what it was.
People were like, it's 568 00:29:51,560 --> 00:29:56,120 Speaker 2: contact dermatitis, it's poison ivy. People. 569 00:29:56,160 --> 00:29:57,560 Speaker 1: People? You mean friends? 570 00:29:57,600 --> 00:30:00,080 Speaker 2: Yes, well, so even a nurse practitioner told me that 571 00:30:00,120 --> 00:30:02,840 Speaker 2: I had contact dermatitis, which is an important thing to note. 572 00:30:04,120 --> 00:30:07,760 Speaker 2: My friend was like, you're being a moron. Use ChatGPT. 573 00:30:08,800 --> 00:30:11,640 Speaker 1: A moron? And you should go to the doctor, not use chat. 574 00:30:13,080 --> 00:30:15,000 Speaker 2: She was like, you're being an idiot. This is so easy. 575 00:30:15,000 --> 00:30:18,560 Speaker 2: So she took a photo of this rash and uploaded it. 576 00:30:18,600 --> 00:30:20,840 Speaker 2: I don't use ChatGPT in this way. She uses ChatGPT 577 00:30:20,880 --> 00:30:21,040 Speaker 2: in this 578 00:30:21,040 --> 00:30:23,200 Speaker 1: way. And you don't use it for images, just normal text, yet. 579 00:30:23,280 --> 00:30:28,240 Speaker 2: So she uploaded this photo, and just like that, the 580 00:30:28,320 --> 00:30:31,520 Speaker 2: doctor was in. She started looking at me sort of 581 00:30:33,040 --> 00:30:34,640 Speaker 2: with a funny face, and I was like, I really 582 00:30:34,640 --> 00:30:36,200 Speaker 2: feel like I'm in a doctor's office right now. 583 00:30:36,840 --> 00:30:38,160 Speaker 1: How's that bedside manner? 584 00:30:38,560 --> 00:30:41,400 Speaker 2: Terrible, because she was very, she was wrapped up in 585 00:30:41,480 --> 00:30:44,280 Speaker 2: ChatGPT, and she goes, when did this start? Do you 586 00:30:44,280 --> 00:30:46,840 Speaker 2: have any symptoms? And I'd say, you know, I'm extremely tired, 587 00:30:46,840 --> 00:30:49,400 Speaker 2: and she's asking me this because ChatGPT is prompting her. 588 00:30:50,080 --> 00:30:53,320 Speaker 2: She asked me about four questions that got more and 589 00:30:53,360 --> 00:30:57,760 Speaker 2: more detailed. By the end of it, I remembered that 590 00:30:57,800 --> 00:31:01,320 Speaker 2: I had gotten a tick bite a few weeks ago, 591 00:31:01,720 --> 00:31:04,680 Speaker 2: which I just, I literally forgot about because I scraped 592 00:31:04,720 --> 00:31:05,520 Speaker 2: it right off my leg. 593 00:31:05,800 --> 00:31:07,640 Speaker 1: It wasn't on for any length of time? Wow. 594 00:31:07,760 --> 00:31:11,080 Speaker 2: No. And so she gives me this look when we 595 00:31:11,120 --> 00:31:13,240 Speaker 2: get to the end of her questions, and she says, 596 00:31:13,720 --> 00:31:17,400 Speaker 2: that's Lyme disease. And I was like, no, it's not. 597 00:31:17,480 --> 00:31:19,240 Speaker 2: It's not the traditional bullseye. And she's like, well, 598 00:31:19,760 --> 00:31:22,560 Speaker 2: ChatGPT says it's Lyme disease. It's Lyme disease. I go 599 00:31:22,640 --> 00:31:24,640 Speaker 2: to the doctor the next day. I was planning on 600 00:31:24,720 --> 00:31:26,640 Speaker 2: going to the doctor because the rash had gotten worse. 601 00:31:27,840 --> 00:31:31,120 Speaker 2: I have my ChatGPT diagnosis in hand. I go to 602 00:31:31,200 --> 00:31:34,200 Speaker 2: the doctor, and before she says anything to me, I say, 603 00:31:34,280 --> 00:31:35,880 Speaker 2: ChatGPT says I have Lyme disease. 604 00:31:35,880 --> 00:31:37,360 Speaker 1: You seriously... wow.
605 00:31:37,680 --> 00:31:40,400 Speaker 2: She looks at the thing on my leg and she goes, 606 00:31:41,040 --> 00:31:43,200 Speaker 2: I'm glad you came into the doctor's office. This is 607 00:31:43,320 --> 00:31:46,560 Speaker 2: Lyme disease. And what I thought was so interesting, and 608 00:31:46,600 --> 00:31:51,040 Speaker 2: why large language models are so interesting in comparison 609 00:31:51,080 --> 00:31:53,600 Speaker 2: to friends who have seen Lyme disease maybe once or never, 610 00:31:54,560 --> 00:31:57,000 Speaker 2: is that she said, you know, you don't have the 611 00:31:57,000 --> 00:32:01,600 Speaker 2: traditional bullseye that indicates Lyme disease. But because I'm a dermatologist, 612 00:32:02,080 --> 00:32:04,719 Speaker 2: I know that this rash is in a pattern that 613 00:32:04,760 --> 00:32:08,560 Speaker 2: denotes Lyme disease. So this is just amazing to me, 614 00:32:08,680 --> 00:32:13,560 Speaker 2: because I basically was diagnosed by ChatGPT before the dermatologist. 615 00:32:13,640 --> 00:32:16,040 Speaker 2: It didn't keep me from going to the dermatologist, but 616 00:32:17,280 --> 00:32:20,760 Speaker 2: I was able to get this answer that nobody else 617 00:32:20,760 --> 00:32:24,360 Speaker 2: had given me, that was ultimately right, before I went 618 00:32:24,400 --> 00:32:24,959 Speaker 2: to the doctor. 619 00:32:25,200 --> 00:32:26,959 Speaker 1: I live in fear of getting Lyme disease. 620 00:32:27,360 --> 00:32:30,160 Speaker 2: Do you really? Well, as you can see today. 621 00:32:30,360 --> 00:32:32,200 Speaker 1: You seem, I mean, you seem, you seem fine. 622 00:32:32,800 --> 00:32:33,840 Speaker 2: Yeah, I'm okay, I'm okay. 623 00:32:34,240 --> 00:32:34,960 Speaker 1: Tired? 624 00:32:34,960 --> 00:32:35,920 Speaker 2: It's just exhausting. 625 00:32:36,000 --> 00:32:38,120 Speaker 1: So do you have to take IV antibiotics? What do you do? 626 00:32:38,240 --> 00:32:41,400 Speaker 2: Uh, no, I'm taking doxycycline for twenty-one days. 627 00:32:42,040 --> 00:32:45,800 Speaker 2: Which, okay, so the interesting part about that. Here's what AI, 628 00:32:45,880 --> 00:32:50,480 Speaker 2: here's what ChatGPT can't do. ChatGPT cannot yet write me 629 00:32:50,560 --> 00:32:55,560 Speaker 2: a prescription. ChatGPT doesn't tell me it likes the 630 00:32:55,600 --> 00:32:58,480 Speaker 2: show that I work on. It doesn't compliment me, but 631 00:32:58,520 --> 00:33:01,120 Speaker 2: it can. My dermatologist was a bit more sycophantic 632 00:33:01,120 --> 00:33:04,160 Speaker 2: than ChatGPT this time. But I mean, I just think 633 00:33:04,200 --> 00:33:08,680 Speaker 2: it's an interesting moment, because there is what I feel 634 00:33:08,880 --> 00:33:12,080 Speaker 2: to be a definitiveness about ChatGPT that I never get 635 00:33:12,120 --> 00:33:16,040 Speaker 2: looking on WebMD. Once I uploaded the picture of my rash, 636 00:33:16,160 --> 00:33:19,720 Speaker 2: my friend walked me through ChatGPT and I had results. 637 00:33:20,000 --> 00:33:22,760 Speaker 2: I felt comforted in a weird way, that I probably 638 00:33:22,760 --> 00:33:26,400 Speaker 2: shouldn't feel comforted by. Like, this is an LLM basically 639 00:33:26,960 --> 00:33:30,600 Speaker 2: spitting out an answer that I didn't trust from my friends, 640 00:33:30,600 --> 00:33:32,240 Speaker 2: which was just, it made me
641 00:33:32,200 --> 00:33:35,280 Speaker 1: think. Yeah, we actually have coming up as a guest 642 00:33:35,320 --> 00:33:37,960 Speaker 1: in the next few weeks one of the chief research 643 00:33:38,040 --> 00:33:40,600 Speaker 1: scientists at Microsoft, who worked on a paper that went 644 00:33:40,680 --> 00:33:44,560 Speaker 1: viral about how AI is outperforming doctors at diagnosis in 645 00:33:44,640 --> 00:33:47,640 Speaker 1: certain fields. So I'm looking forward to that conversation. I 646 00:33:47,640 --> 00:33:50,000 Speaker 1: guess the thing which comes to mind for me is 647 00:33:50,840 --> 00:33:54,760 Speaker 1: this is around something narrow and ultimately not the most serious 648 00:33:54,800 --> 00:33:57,640 Speaker 1: in most cases. No, it's Lyme disease, and it's not 649 00:33:57,880 --> 00:34:01,880 Speaker 1: so debatable there. Can you imagine if you were talking 650 00:34:01,880 --> 00:34:09,239 Speaker 1: to Chat about treatment options for a chronic and maybe 651 00:34:09,640 --> 00:34:13,360 Speaker 1: fatal disease, and your doctor is on one side, and you 652 00:34:13,400 --> 00:34:16,440 Speaker 1: get different information from Chat? Like, you can see how 653 00:34:16,440 --> 00:34:19,520 Speaker 1: this is very efficient in this context. But also, 654 00:34:20,239 --> 00:34:24,000 Speaker 1: we know about WebMD and, you know, hypochondria, what this could drive. 655 00:34:25,440 --> 00:34:29,440 Speaker 2: Steroidal. I mean, we've talked about ChatGPT psychosis. I 656 00:34:29,480 --> 00:34:33,160 Speaker 2: would imagine people have brought in printouts from WebMD 657 00:34:33,239 --> 00:34:36,960 Speaker 2: and Google. This is that on steroids. And I would 658 00:34:37,000 --> 00:34:39,960 Speaker 2: like to talk maybe just to, like, a general practitioner, 659 00:34:40,000 --> 00:34:43,759 Speaker 2: interesting, who would know firsthand, like, how much people are 660 00:34:43,760 --> 00:34:45,600 Speaker 2: coming in and saying, well, I talked to ChatGPT 661 00:34:45,680 --> 00:34:47,000 Speaker 2: about this and it said something different. 662 00:34:47,120 --> 00:34:49,520 Speaker 1: In fact, I want your doctor to talk directly to 663 00:34:49,600 --> 00:34:50,560 Speaker 1: Chat, and I'm going to listen. 664 00:34:50,680 --> 00:34:51,040 Speaker 2: Go ahead. 665 00:34:52,640 --> 00:34:55,440 Speaker 1: Well, I loved hearing the story, Kara, but listeners, we 666 00:34:55,480 --> 00:34:58,840 Speaker 1: want to hear yours. Please share the peculiar or useful 667 00:34:58,880 --> 00:35:02,480 Speaker 1: ways you're using Chat, Grok, Claude, Gemini, or any chatbot, 668 00:35,02,840 --> 00:35:04,799 Speaker 1: and send us a one-to-two-minute voice note 669 00:35:04,880 --> 00:35:28,880 Speaker 1: to tech stuff podcast at gmail dot com. 670 00:35:28,920 --> 00:35:30,680 Speaker 2: That's it for this week for Tech Stuff. 671 00:35:30,680 --> 00:35:33,560 Speaker 1: I'm Cara Price. I'm Oz Woloshyn. This episode was produced 672 00:35:33,600 --> 00:35:36,680 Speaker 1: by Eliza Dennis. It was executive produced by me, Cara 673 00:35:36,760 --> 00:35:40,000 Speaker 1: Price, and Kate Osborne for Kaleidoscope, and Katrina Norvell for 674 00:35:40,080 --> 00:35:44,480 Speaker 1: iHeart Podcasts. The engineer is Piheid Fraser. Jack Insley mixed 675 00:35:44,520 --> 00:35:47,120 Speaker 1: this episode, and Kyle Murdoch wrote our theme song.
676 00:35:47,680 --> 00:35:50,560 Speaker 2: Join us next Wednesday for Tech Stuff: The Story, when 677 00:35:50,560 --> 00:35:53,840 Speaker 2: we will examine the lives of kidfluencers and their families. 678 00:35:54,200 --> 00:35:56,879 Speaker 1: Please rate, review, and reach out to us at tech 679 00:35:56,920 --> 00:35:59,160 Speaker 1: stuff podcast at gmail dot com. We want to hear 680 00:35:59,200 --> 00:36:00,000 Speaker 1: from you.