1 00:00:14,120 --> 00:00:17,680 Speaker 1: From Kaleidoscope and iHeart podcasts. This is tech stuff. I'm 2 00:00:17,680 --> 00:00:21,000 Speaker 1: as Voloshian and I'm care price. Today we get into 3 00:00:21,480 --> 00:00:26,200 Speaker 1: gadget filled gambling dens ai that refuses to die, and 4 00:00:26,280 --> 00:00:27,720 Speaker 1: tech bros going on to the. 5 00:00:27,760 --> 00:00:31,960 Speaker 2: Knife, then on chatting me it doesn't work well enough 6 00:00:32,040 --> 00:00:35,640 Speaker 2: to be dystopian like, the technology is simply not even there. 7 00:00:36,240 --> 00:00:39,320 Speaker 3: All of that on the weekend Tech. It's Friday, Halloween, 8 00:00:39,560 --> 00:00:40,880 Speaker 3: October thirty first. 9 00:00:47,440 --> 00:00:50,080 Speaker 1: Hello, Cara, are you feeling spooky? 10 00:00:50,600 --> 00:00:53,680 Speaker 3: I'm feeling the rush of fall wind. 11 00:00:54,320 --> 00:00:55,680 Speaker 1: And yes, it could be a ghost. 12 00:00:55,760 --> 00:00:56,520 Speaker 3: It could be a ghost. 13 00:00:56,680 --> 00:00:59,280 Speaker 1: It could what's the spookiest music? I was sinking trying 14 00:00:59,280 --> 00:01:01,040 Speaker 1: to think of, humm, spooky mousy. 15 00:01:00,840 --> 00:01:03,080 Speaker 3: Below I know Bernard Herman Psycho. 16 00:01:04,760 --> 00:01:08,000 Speaker 1: I was thinking, has Harry Potter though, that's Harry Potter. 17 00:01:08,400 --> 00:01:10,679 Speaker 3: That's spooky for some people. That's gothic. 18 00:01:11,280 --> 00:01:12,720 Speaker 1: So it's Halloween. 19 00:01:12,880 --> 00:01:13,520 Speaker 3: It is Halloween. 20 00:01:13,520 --> 00:01:14,400 Speaker 1: Are you going to a party? 21 00:01:14,520 --> 00:01:14,840 Speaker 4: I am. 22 00:01:14,880 --> 00:01:16,520 Speaker 3: I'm going to a party at a friend's house and 23 00:01:16,560 --> 00:01:19,080 Speaker 3: it's going to be very festive and I'm going to 24 00:01:19,160 --> 00:01:26,080 Speaker 3: dress up as the niche influencer and fitness extraordinary Tracy 25 00:01:27,120 --> 00:01:28,800 Speaker 3: hertstones Anderson. 26 00:01:29,240 --> 00:01:31,560 Speaker 1: Now do you pull the Tracy Anderson costume out every 27 00:01:31,640 --> 00:01:32,680 Speaker 1: year or is this one on? 28 00:01:32,959 --> 00:01:36,840 Speaker 3: No? For those who follow me on Instagram, they know 29 00:01:36,920 --> 00:01:40,000 Speaker 3: that I had a brief stint with Tracy Anderson and 30 00:01:40,040 --> 00:01:43,399 Speaker 3: she's a very particular luke as the kids say. And 31 00:01:43,440 --> 00:01:45,080 Speaker 3: so I'm going to dress up as her. 32 00:01:45,040 --> 00:01:46,360 Speaker 1: And you're going to a house party? 33 00:01:46,600 --> 00:01:47,560 Speaker 3: Yes, a house party. 34 00:01:47,640 --> 00:01:49,200 Speaker 1: Is it a house that belongs to a person? 35 00:01:49,360 --> 00:01:49,560 Speaker 3: Yes? 36 00:01:49,880 --> 00:01:51,080 Speaker 1: Or is it an abnb house? 37 00:01:51,240 --> 00:01:52,880 Speaker 3: That is a very good question. It is a house 38 00:01:52,920 --> 00:01:54,040 Speaker 3: that belongs to a person. 39 00:01:55,240 --> 00:01:58,440 Speaker 1: Well, that's good news because probably if it was an 40 00:01:58,440 --> 00:02:04,880 Speaker 1: Airbnb houseb's technology stack or have successfully mixed the reservation 41 00:02:05,160 --> 00:02:09,160 Speaker 1: real say more so. Airbnb released a press release this 42 00:02:09,320 --> 00:02:14,680 Speaker 1: week about it's quote unquote anti party technology, anti party tech. 43 00:02:14,720 --> 00:02:17,520 Speaker 1: Anti party tech A parent should have anti party tech. 44 00:02:17,760 --> 00:02:20,280 Speaker 1: So basically, according to their press relief, they have quote 45 00:02:20,320 --> 00:02:24,480 Speaker 1: a proprietary system that uses machine learning to analyze attempted 46 00:02:24,520 --> 00:02:28,640 Speaker 1: bookings of entire homes over the Halloween weekend looking for 47 00:02:28,720 --> 00:02:30,440 Speaker 1: signs of potential party risk. 48 00:02:30,639 --> 00:02:36,000 Speaker 3: They're like looking someone looking for nine beds and seventeen bedrooms. 49 00:02:35,520 --> 00:02:38,800 Speaker 1: Yeah, exactly. So basically the algorithm is are they looking 50 00:02:38,840 --> 00:02:42,000 Speaker 1: for whole places and do they live within one mile 51 00:02:42,040 --> 00:02:44,320 Speaker 1: of the huge house they're trying to book in, which 52 00:02:44,400 --> 00:02:45,840 Speaker 1: casod one they're out. 53 00:02:45,880 --> 00:02:48,200 Speaker 3: Yeah, they're trying to do a party. That's so funny. 54 00:02:48,480 --> 00:02:53,160 Speaker 1: Some Nbnb's also use this technology called minute nut. 55 00:02:53,560 --> 00:02:55,720 Speaker 3: I've heard of Minute you have, Yeah, I think I've 56 00:02:55,760 --> 00:02:56,639 Speaker 3: seen ads for it. 57 00:02:56,639 --> 00:03:02,800 Speaker 1: It's basically a audio monitor like a kid monitor. It 58 00:03:02,840 --> 00:03:06,639 Speaker 1: doesn't record the actual sounds, but it gives a notification 59 00:03:06,880 --> 00:03:10,760 Speaker 1: to the homeowner about the decibel level. And Airbnb kind 60 00:03:10,760 --> 00:03:14,400 Speaker 1: of encourage their homeowners to use it if they're worried, 61 00:03:14,400 --> 00:03:18,520 Speaker 1: but the homeowner also has to disclose to the rent 62 00:03:18,520 --> 00:03:20,000 Speaker 1: just surveil people exactly. 63 00:03:20,120 --> 00:03:22,720 Speaker 3: So basically they're saying, we're using minute technology. If you're 64 00:03:22,760 --> 00:03:23,960 Speaker 3: going to be loud, we're going to catch You're going 65 00:03:24,000 --> 00:03:25,800 Speaker 3: to a notification. Fascinating. 66 00:03:26,320 --> 00:03:27,920 Speaker 1: But one of the things they do is they don't 67 00:03:27,919 --> 00:03:30,280 Speaker 1: just nix you. They suggest that instead you rent a 68 00:03:30,320 --> 00:03:33,240 Speaker 1: private room in somebody's house where the host actually lives. 69 00:03:33,280 --> 00:03:37,560 Speaker 1: There are you doing? I used to do that, I'm 70 00:03:37,600 --> 00:03:39,760 Speaker 1: now I think I have outgrown it. But do you 71 00:03:39,800 --> 00:03:43,320 Speaker 1: know how many bookings in twenty twenty four were mixed 72 00:03:43,520 --> 00:03:45,560 Speaker 1: by airbmb over Halloween weekend? 73 00:03:45,600 --> 00:03:46,040 Speaker 3: How many? 74 00:03:46,720 --> 00:03:48,120 Speaker 1: Thirty eight thousand. 75 00:03:48,040 --> 00:03:50,320 Speaker 3: What because they were like, this is someone throwing a 76 00:03:50,360 --> 00:03:54,280 Speaker 3: Halloween party, fascinating that it is amazing that they can 77 00:03:54,320 --> 00:03:55,160 Speaker 3: regulate that. 78 00:03:55,160 --> 00:03:56,840 Speaker 1: It is, I mean, and the idea of making a 79 00:03:56,880 --> 00:04:00,920 Speaker 1: tech application to weed out a breakers and who are 80 00:04:00,920 --> 00:04:04,560 Speaker 1: trying to rent Halloween houses, it's pretty funny. Well, Karl, 81 00:04:04,600 --> 00:04:08,360 Speaker 1: speaking of parties, how do you respond if someone invited 82 00:04:08,400 --> 00:04:11,360 Speaker 1: you to a game of poker with characters like Spanish 83 00:04:11,400 --> 00:04:14,800 Speaker 1: g Flappy, Spook, Pooky, and Sugar. 84 00:04:15,080 --> 00:04:19,920 Speaker 3: I don't even play poker. I'd be like, yes, Spooky, Spooky, 85 00:04:20,080 --> 00:04:22,640 Speaker 3: Spanish Gy. 86 00:04:23,640 --> 00:04:26,799 Speaker 1: Some of the guests at these parties were allegedly NBA 87 00:04:26,960 --> 00:04:31,360 Speaker 1: coaches and players. Now, as we're talking, obviously it's important 88 00:04:31,360 --> 00:04:33,320 Speaker 1: to remember that all of these people who have been 89 00:04:33,320 --> 00:04:35,919 Speaker 1: indicted have not been convicted. This is just an allegation 90 00:04:36,240 --> 00:04:36,640 Speaker 1: for now. 91 00:04:38,480 --> 00:04:41,159 Speaker 3: This was an online poker game. 92 00:04:41,200 --> 00:04:43,880 Speaker 1: No, this is a real, real live poker game where, 93 00:04:44,080 --> 00:04:48,359 Speaker 1: according to the US Justice Department, people were being bilked 94 00:04:48,600 --> 00:04:51,839 Speaker 1: out of their money, lured by the promise of NBA 95 00:04:52,000 --> 00:04:55,680 Speaker 1: players being there, and cheated using technology. 96 00:04:56,480 --> 00:04:58,400 Speaker 3: So tell me how they were cheated using technology. 97 00:04:58,440 --> 00:05:01,160 Speaker 1: Well, I'll get onto that, but first, so, this indictment 98 00:05:01,240 --> 00:05:06,440 Speaker 1: was unsealed the US Justice Department named thirty one defendants, 99 00:05:06,480 --> 00:05:11,520 Speaker 1: including Portland Trailblazers coach Chauncey Billups, former player and assistant 100 00:05:11,520 --> 00:05:17,360 Speaker 1: coach Damon Jones, and members of the Banano, Gambino, Genovese 101 00:05:17,560 --> 00:05:20,600 Speaker 1: and Lucez crime family. They had the Cecilia, I had 102 00:05:20,600 --> 00:05:22,880 Speaker 1: the Sicilian. I didn't know these guys were still this. 103 00:05:23,080 --> 00:05:25,840 Speaker 3: They play in horses, they play in cards, they play 104 00:05:25,839 --> 00:05:28,960 Speaker 3: every we still have it sports rackets. Yeah. 105 00:05:29,000 --> 00:05:33,000 Speaker 1: So this scam allegedly involves a high stakes poker game 106 00:05:33,080 --> 00:05:35,880 Speaker 1: that floated from New York to the Hamptons to Miami. 107 00:05:36,520 --> 00:05:39,800 Speaker 1: Victims of the scheme were known as fish and they 108 00:05:39,839 --> 00:05:42,880 Speaker 1: lost tens of thousands to hundreds of thousands of dollars, 109 00:05:43,240 --> 00:05:46,520 Speaker 1: and according to prosecutors, this scheme allegedly netted more than 110 00:05:46,600 --> 00:05:49,120 Speaker 1: seven million dollars over five years. 111 00:05:49,320 --> 00:05:51,800 Speaker 3: Oh my god, just by people thinking that they were 112 00:05:51,800 --> 00:05:53,080 Speaker 3: going to play with famous people. 113 00:05:53,839 --> 00:05:56,839 Speaker 1: Yeah, that's crazy. So you want to hear the indictment. 114 00:05:56,880 --> 00:06:00,160 Speaker 1: In the words of Joseph Nocella Junior, the interim U 115 00:06:00,200 --> 00:06:01,560 Speaker 1: S Attorney, part of. 116 00:06:01,560 --> 00:06:03,680 Speaker 3: The Genera crime family, which sounds like he is. 117 00:06:04,080 --> 00:06:06,320 Speaker 4: But my message to the defendants who have been rounded 118 00:06:06,400 --> 00:06:11,200 Speaker 4: up today is this. Your winning streak has ended. Your 119 00:06:11,279 --> 00:06:16,320 Speaker 4: luck has run out. Violating the law is a losing proposition, 120 00:06:16,760 --> 00:06:18,360 Speaker 4: and you can bet on that. 121 00:06:19,360 --> 00:06:24,200 Speaker 1: You think he used chat He used chat chip to 122 00:06:24,200 --> 00:06:24,880 Speaker 1: generate the puns. 123 00:06:24,960 --> 00:06:27,040 Speaker 3: I think so he was like, give me puns for 124 00:06:27,080 --> 00:06:29,760 Speaker 3: an indictment using bad puns. 125 00:06:30,720 --> 00:06:32,279 Speaker 1: You can bet on that. 126 00:06:36,760 --> 00:06:38,720 Speaker 3: So where does the technology come in? 127 00:06:39,040 --> 00:06:43,279 Speaker 1: So basically according to the indictment, there are special contact 128 00:06:43,400 --> 00:06:46,960 Speaker 1: lenses or eye glasses that can read pre marked cards. 129 00:06:47,000 --> 00:06:48,719 Speaker 3: Oh, this is very am I impossible. 130 00:06:48,839 --> 00:06:53,440 Speaker 1: There are X ray tables that can identify what cards 131 00:06:53,520 --> 00:06:58,520 Speaker 1: align face down on the table. And there are nano 132 00:06:58,640 --> 00:07:03,120 Speaker 1: cameras on the poker chip trays where the tray itself 133 00:07:03,240 --> 00:07:06,040 Speaker 1: can see the cards using a hidden camera. 134 00:07:06,360 --> 00:07:09,080 Speaker 3: So the people putting on the poker game or cheating, the. 135 00:07:09,240 --> 00:07:13,640 Speaker 1: People putting on the Peter Piper Pepper, the people putting 136 00:07:13,680 --> 00:07:16,080 Speaker 1: on the poker game are indeed cheating, so they are 137 00:07:16,080 --> 00:07:17,040 Speaker 1: scamming the clients. 138 00:07:17,080 --> 00:07:17,720 Speaker 3: Fascinating. 139 00:07:17,920 --> 00:07:20,480 Speaker 1: The main piece of tech used is something called the 140 00:07:20,600 --> 00:07:21,680 Speaker 1: deck mate too. 141 00:07:22,040 --> 00:07:24,160 Speaker 3: Oh, definitely, and what is that? 142 00:07:24,280 --> 00:07:28,280 Speaker 1: The deck mate is a machine that shuffles a deck 143 00:07:28,400 --> 00:07:32,880 Speaker 1: in seconds, and an internal computer guarantees that the deck 144 00:07:32,960 --> 00:07:37,760 Speaker 1: is randomly generated and also has crucially a camera inside 145 00:07:37,800 --> 00:07:41,480 Speaker 1: to observe the cards, which is technically a security measure 146 00:07:41,720 --> 00:07:43,840 Speaker 1: to make sure that every card is in the deck. 147 00:07:44,360 --> 00:07:46,320 Speaker 3: This is very James Bond. This is why you like 148 00:07:46,400 --> 00:07:49,880 Speaker 3: this story. This is extremely James Bond. And Ocean's eleven. 149 00:07:49,880 --> 00:07:52,600 Speaker 1: Deck Mate is theoretically only allowed to be sold to 150 00:07:52,680 --> 00:07:56,840 Speaker 1: casinos and regulated, you know, betting environments where there's some 151 00:07:56,960 --> 00:07:59,520 Speaker 1: checks and balances to make sure that nobody is using 152 00:07:59,560 --> 00:08:02,679 Speaker 1: the internal camera for their own purposes. But of course 153 00:08:02,720 --> 00:08:06,200 Speaker 1: you can buy them secondhand. And what the indictment suggests 154 00:08:06,320 --> 00:08:10,240 Speaker 1: is that the security camera inside this machine is actually 155 00:08:10,240 --> 00:08:13,200 Speaker 1: a classic example of dual use. Technology is put inside 156 00:08:13,240 --> 00:08:17,160 Speaker 1: the machine in order to guarantee the safety and accountability 157 00:08:17,280 --> 00:08:19,960 Speaker 1: a card game, but in the wrong hands, it becomes 158 00:08:20,040 --> 00:08:23,880 Speaker 1: an incredibly effective way at cheating because it knows which 159 00:08:23,920 --> 00:08:25,840 Speaker 1: card is in every single player's hand. 160 00:08:26,360 --> 00:08:27,120 Speaker 3: Unbelievable. 161 00:08:27,360 --> 00:08:30,280 Speaker 1: Basically, the information from the machine from the deck mate 162 00:08:30,600 --> 00:08:33,719 Speaker 1: is a transmitted off site to an operator who can 163 00:08:33,760 --> 00:08:37,960 Speaker 1: essentially watch the video and then, somehow it wasn't announced 164 00:08:37,960 --> 00:08:41,600 Speaker 1: how in the indictment, the operator relays the information to 165 00:08:41,720 --> 00:08:45,600 Speaker 1: the quarterback who's at the table. The quarterback then uses 166 00:08:45,679 --> 00:08:49,960 Speaker 1: hand signals, stroking a beard, touching a chip, etc. To 167 00:08:50,040 --> 00:08:52,320 Speaker 1: let the other scammers who are also players at the 168 00:08:52,320 --> 00:08:56,000 Speaker 1: table know which of the regular people has the best hand. 169 00:08:56,440 --> 00:08:58,280 Speaker 1: Oh my god, I mean it's like a movie. 170 00:08:58,360 --> 00:08:59,800 Speaker 3: Yeah, it's very much like a movie. 171 00:08:59,840 --> 00:09:01,839 Speaker 1: The other things interesting to me is that they were 172 00:09:01,920 --> 00:09:06,720 Speaker 1: communicating in the background about how much to gut the fish. 173 00:09:06,840 --> 00:09:08,959 Speaker 3: One guy takes only people who are none the wiser 174 00:09:08,960 --> 00:09:09,560 Speaker 3: at the fish. 175 00:09:09,320 --> 00:09:12,080 Speaker 1: The lot of the fish. The one guy texted in 176 00:09:12,120 --> 00:09:14,439 Speaker 1: the group, guys, please let him win a hand. He's 177 00:09:14,480 --> 00:09:16,400 Speaker 1: in for forty k in forty minutes, and he'll leave 178 00:09:16,440 --> 00:09:17,320 Speaker 1: if he gets no attraction. 179 00:09:19,440 --> 00:09:20,640 Speaker 3: I'm slapping my knee. 180 00:09:20,720 --> 00:09:22,079 Speaker 1: You got it, because you've got to keep him. It's 181 00:09:22,120 --> 00:09:23,960 Speaker 1: like an app like Instagram. You've got to figure out 182 00:09:24,000 --> 00:09:26,000 Speaker 1: how to make it just pleasurable enough for them for 183 00:09:26,040 --> 00:09:28,199 Speaker 1: them to stay. This is a lot of work, sent 184 00:09:28,240 --> 00:09:28,959 Speaker 1: a lot of money. 185 00:09:28,760 --> 00:09:31,320 Speaker 3: A lot of money. So most of this technology is 186 00:09:31,400 --> 00:09:34,440 Speaker 3: used actually to protect well from cheating. 187 00:09:34,640 --> 00:09:37,480 Speaker 1: Certainly the deck mate was designed to protect from cheating. 188 00:09:37,920 --> 00:09:40,320 Speaker 1: The contact lenses and the X ray table and stuff, 189 00:09:40,320 --> 00:09:42,760 Speaker 1: those seem to be more like single use technologies to me. 190 00:09:42,920 --> 00:09:43,680 Speaker 3: Yeah, yeah, yeah. 191 00:09:43,920 --> 00:09:46,360 Speaker 1: Earlier this year, you sent me a Bloomberg piece by 192 00:09:46,440 --> 00:09:47,360 Speaker 1: Kit Chalel. 193 00:09:47,520 --> 00:09:48,120 Speaker 3: Yes, I did. 194 00:09:48,280 --> 00:09:51,760 Speaker 1: It was about a Siberian group who had figured out 195 00:09:51,800 --> 00:09:55,440 Speaker 1: how to develop their own algorithm to win at poker online. Basically, 196 00:09:55,440 --> 00:09:57,840 Speaker 1: every time you did a great interview, thank you well, 197 00:09:57,840 --> 00:10:01,720 Speaker 1: you send me a great story. I a Kit, what 198 00:10:01,880 --> 00:10:07,880 Speaker 1: is so fascinating to us about the Yes, here's what 199 00:10:07,920 --> 00:10:08,280 Speaker 1: he said. 200 00:10:08,920 --> 00:10:12,480 Speaker 5: I think people recognize what it feels like to play 201 00:10:12,480 --> 00:10:17,560 Speaker 5: a rigged game. It's like modern capitalism encapsulated in a 202 00:10:17,640 --> 00:10:21,360 Speaker 5: kind of easily understandable format, in a relatable format. Most 203 00:10:21,400 --> 00:10:23,600 Speaker 5: people know what it feels like to spend your whole 204 00:10:23,640 --> 00:10:25,880 Speaker 5: life trying to play a game fairly and still lose, 205 00:10:26,240 --> 00:10:29,640 Speaker 5: whether it's your job, whether it's your love life. This 206 00:10:29,840 --> 00:10:31,719 Speaker 5: feeling that the world that the cards are kind of 207 00:10:31,760 --> 00:10:34,600 Speaker 5: against you, I think is very familiar, and that the 208 00:10:34,679 --> 00:10:37,640 Speaker 5: house that casino has, you know, becomes sort of metaphor 209 00:10:37,800 --> 00:10:43,640 Speaker 5: for entrenched wealth and power that basically makes the rules, 210 00:10:43,760 --> 00:10:47,079 Speaker 5: changes the rules, and acts for its own benefit in 211 00:10:47,360 --> 00:10:49,560 Speaker 5: a way that's very unfair for the majority of people. 212 00:10:49,880 --> 00:10:55,240 Speaker 5: I think that's that's undeniably true of the gambling Business's. 213 00:10:53,760 --> 00:10:56,080 Speaker 3: Fun to play the house. It's fun to play the house, 214 00:10:56,080 --> 00:10:59,360 Speaker 3: even when you're this is like the Broken House. It's interesting. 215 00:11:00,120 --> 00:11:04,240 Speaker 3: I really appreciate you bringing me a not AI centered 216 00:11:04,280 --> 00:11:07,360 Speaker 3: story and just a technology story. But I happen to 217 00:11:07,400 --> 00:11:10,640 Speaker 3: have an AI centric story for you. And it comes 218 00:11:10,640 --> 00:11:14,480 Speaker 3: from a Wired piece called the End of Accents. 219 00:11:14,840 --> 00:11:15,880 Speaker 1: The End of accents? 220 00:11:15,960 --> 00:11:17,760 Speaker 3: That's right? And what do you have? 221 00:11:19,040 --> 00:11:22,000 Speaker 1: I have an accent? What do I have in this country? 222 00:11:22,040 --> 00:11:23,319 Speaker 1: You don't have one, but. 223 00:11:23,320 --> 00:11:25,680 Speaker 3: I do have an accent. I think there's something that 224 00:11:25,800 --> 00:11:33,320 Speaker 3: is very universal about understanding the hierarchy of accents from 225 00:11:33,320 --> 00:11:36,920 Speaker 3: wherever when, wherever you live. In America, it's sort of 226 00:11:36,960 --> 00:11:41,120 Speaker 3: the gold standard to not have an accent at all. 227 00:11:41,320 --> 00:11:43,640 Speaker 3: If you have an accent, it's regional. But you know, 228 00:11:44,280 --> 00:11:47,199 Speaker 3: there is just inherently a judgment that is passed. And 229 00:11:47,240 --> 00:11:48,720 Speaker 3: I'm sure the same is true in the I mean, 230 00:11:48,720 --> 00:11:50,080 Speaker 3: you know about Pasha accents. 231 00:11:50,120 --> 00:11:51,719 Speaker 1: I do, But why is the story and why it 232 00:11:52,400 --> 00:11:52,839 Speaker 1: There was a. 233 00:11:52,760 --> 00:11:55,080 Speaker 3: Piece in Wired written by a Korean writer who has 234 00:11:55,120 --> 00:11:57,120 Speaker 3: been living in America for over a decade and his 235 00:11:57,240 --> 00:12:01,960 Speaker 3: name is Sean Han, and he actually had an AI 236 00:12:02,160 --> 00:12:06,080 Speaker 3: driven American accent training app marketed to him in his 237 00:12:06,160 --> 00:12:09,720 Speaker 3: own Instagram and the app is called Bold. 238 00:12:09,520 --> 00:12:11,480 Speaker 1: Voice, an accent training app. 239 00:12:11,640 --> 00:12:13,920 Speaker 3: That's right, it's sort of. The minute I saw it, 240 00:12:13,960 --> 00:12:15,240 Speaker 3: I was like, we have to do this story just 241 00:12:15,280 --> 00:12:17,840 Speaker 3: because I wanted to try it, and I actually signed 242 00:12:17,880 --> 00:12:19,400 Speaker 3: up for it yesterday, and I want you to try 243 00:12:19,520 --> 00:12:20,640 Speaker 3: you would you try it? 244 00:12:20,679 --> 00:12:21,400 Speaker 1: I would? Okay? 245 00:12:21,840 --> 00:12:25,040 Speaker 3: Right, so Oz, I'm going to hand this to you 246 00:12:25,120 --> 00:12:27,200 Speaker 3: right now, and you're going to go through the assessment. 247 00:12:27,240 --> 00:12:29,000 Speaker 3: It's going to do something called speech scan. 248 00:12:29,200 --> 00:12:30,559 Speaker 1: You mean hopefully it changed my accent. 249 00:12:30,600 --> 00:12:32,000 Speaker 3: That's right, Well we'll see what happens. 250 00:12:32,480 --> 00:12:34,800 Speaker 6: Tap the record button and read the sentence below. 251 00:12:35,080 --> 00:12:37,800 Speaker 1: I made three sugar cookies and a great fig cake. 252 00:12:39,440 --> 00:12:42,600 Speaker 6: Nice job. I'll show your results at the end. Tap 253 00:12:42,679 --> 00:12:45,520 Speaker 6: next to continue. Now try this next sentence. 254 00:12:46,200 --> 00:12:48,880 Speaker 1: She's super thankful for the beautiful birthday flowers. 255 00:12:50,280 --> 00:12:50,880 Speaker 6: Nice work. 256 00:12:51,320 --> 00:12:54,040 Speaker 1: Did the thin lady purchase those yellow running shoes? 257 00:12:55,920 --> 00:12:57,240 Speaker 6: You're making great progress. 258 00:12:57,640 --> 00:12:59,880 Speaker 1: The daughter is cooking an a healthy dinner at home. 259 00:13:00,960 --> 00:13:01,679 Speaker 6: Well done. 260 00:13:01,880 --> 00:13:06,120 Speaker 1: No, don't put those things above the bathroom sync excellent. 261 00:13:06,480 --> 00:13:09,400 Speaker 6: Your speech scan is complete. Ready to see your results. 262 00:13:11,280 --> 00:13:15,280 Speaker 6: You sound good at your level. Native speakers easily understand you, 263 00:13:15,600 --> 00:13:18,959 Speaker 6: but might hear a slight accent. Together, will work to 264 00:13:19,000 --> 00:13:21,520 Speaker 6: get to a native level. Let's take a look at 265 00:13:21,559 --> 00:13:24,920 Speaker 6: your top three strengths. Excellent work on these sounds. 266 00:13:25,520 --> 00:13:28,800 Speaker 1: It's saying er, cut and E. I'm really good at 267 00:13:28,880 --> 00:13:33,400 Speaker 1: so I can pronounce the amazing thankful correctly. I can 268 00:13:33,440 --> 00:13:36,679 Speaker 1: easily say looking, book and thankful, and I'm good at 269 00:13:36,720 --> 00:13:39,760 Speaker 1: A like days, days and great. Oh I only get 270 00:13:39,800 --> 00:13:43,120 Speaker 1: sixty three percent for er because I say sugar theater 271 00:13:43,480 --> 00:13:47,320 Speaker 1: R instead of sugar theater and R. I also mispronounce 272 00:13:47,880 --> 00:13:53,240 Speaker 1: E as E, so cooking and amazing rather than cooking 273 00:13:53,400 --> 00:13:54,119 Speaker 1: and amazing. 274 00:13:54,360 --> 00:13:56,440 Speaker 3: We'll be practicing very good job. 275 00:13:56,679 --> 00:13:59,079 Speaker 1: So this is basically what this would help me iron 276 00:13:59,120 --> 00:14:01,400 Speaker 1: out my British acts and become fully American. 277 00:14:01,600 --> 00:14:03,600 Speaker 3: Right if you'd want to do. 278 00:14:03,520 --> 00:14:05,559 Speaker 1: That, well, you know, I've been here for a long time, 279 00:14:05,720 --> 00:14:09,000 Speaker 1: that's true, and I still sound English. But I think 280 00:14:09,160 --> 00:14:13,040 Speaker 1: probably those phonemes they're measuring I have actually Americanized, because 281 00:14:13,040 --> 00:14:14,880 Speaker 1: sometimes when I'm in England people else who are I'm from? 282 00:14:15,120 --> 00:14:18,959 Speaker 3: Oh? Really? So now it's you have Americanized English. 283 00:14:19,160 --> 00:14:21,200 Speaker 1: Not I mean, not fully, but I think a few 284 00:14:21,200 --> 00:14:24,200 Speaker 1: of those like what water like? I still say water, 285 00:14:24,320 --> 00:14:25,320 Speaker 1: but like are there. 286 00:14:25,440 --> 00:14:27,080 Speaker 3: Like there's certain americanisms you have. 287 00:14:27,520 --> 00:14:29,080 Speaker 1: I've heard it crept into my voice. 288 00:14:29,320 --> 00:14:33,200 Speaker 3: You know, I just I worry. I worry about accent 289 00:14:33,240 --> 00:14:36,720 Speaker 3: neutralization because I think it's something that makes us so 290 00:14:36,840 --> 00:14:40,080 Speaker 3: culturally interesting, you know, and I think technology can sort 291 00:14:40,080 --> 00:14:46,720 Speaker 3: of create this flattening of our idiosyncrasies. And Sean, who 292 00:14:46,720 --> 00:14:49,760 Speaker 3: wrote this piece, is pretty sure that he got this 293 00:14:49,840 --> 00:14:52,400 Speaker 3: Instagram ad because he's a person of color and an immigrant, 294 00:14:52,880 --> 00:14:56,240 Speaker 3: and because accent bias and discrimination is very much alive 295 00:14:56,240 --> 00:14:59,080 Speaker 3: and well. Actually, in twenty eighteen, there was a study 296 00:14:59,120 --> 00:15:02,240 Speaker 3: that found two in five Americans thoughts Southern accents made 297 00:15:02,240 --> 00:15:04,560 Speaker 3: the speaker sound uneducated. 298 00:15:04,800 --> 00:15:07,400 Speaker 1: You know. I think about the relationship between accent and 299 00:15:07,480 --> 00:15:10,360 Speaker 1: privilege a lot, because in the US there's like the 300 00:15:10,440 --> 00:15:14,120 Speaker 1: neutral accent, and then there's other accents. In Britain, there's 301 00:15:14,160 --> 00:15:16,600 Speaker 1: like an accent that you get that an elite accent, 302 00:15:16,600 --> 00:15:19,840 Speaker 1: which ironically is called received pronunciation. Oh, I've never heard 303 00:15:19,840 --> 00:15:22,120 Speaker 1: of that, otherwise known colloquially as the Queen's English. 304 00:15:22,160 --> 00:15:24,640 Speaker 3: Queen's English, which is very very poor. 305 00:15:24,760 --> 00:15:27,880 Speaker 1: But you can sort of place somebody exactly where they 306 00:15:27,920 --> 00:15:29,920 Speaker 1: come from just by the way they speak. 307 00:15:30,280 --> 00:15:33,120 Speaker 3: I really like this story because it reminds me of 308 00:15:33,280 --> 00:15:36,640 Speaker 3: other things that are happening in the kind of translation 309 00:15:36,880 --> 00:15:41,400 Speaker 3: dictation space, where like all of the sudden speech isn't 310 00:15:41,440 --> 00:15:44,400 Speaker 3: just speech, Like you can have AirPods that are translating 311 00:15:44,920 --> 00:15:47,680 Speaker 3: what you're saying in real time, and like the way 312 00:15:47,720 --> 00:15:52,840 Speaker 3: in which we now use technology to kind of overwrite 313 00:15:53,320 --> 00:15:57,560 Speaker 3: our natural language is really, I don't know, very fascinating 314 00:15:57,600 --> 00:15:59,680 Speaker 3: to me. And I wonder, I wonder if this will 315 00:15:59,680 --> 00:16:01,640 Speaker 3: be helped full to people or if it will just 316 00:16:01,680 --> 00:16:02,680 Speaker 3: be hurtful to people. 317 00:16:07,560 --> 00:16:11,280 Speaker 1: After the break AIS Drive to Survive, Tech Bros Getting 318 00:16:11,280 --> 00:16:26,840 Speaker 1: plastic surgery and Tesla going into mad max mode, Cara, 319 00:16:27,000 --> 00:16:30,760 Speaker 1: it is a spooky time of year. So you said, 320 00:16:32,160 --> 00:16:34,840 Speaker 1: do you remember recently I organized the dinner and the 321 00:16:35,080 --> 00:16:37,680 Speaker 1: esteemed cosmologist Jenna Levin was. 322 00:16:37,800 --> 00:16:39,360 Speaker 3: I domber, I remember Jane Levin? 323 00:16:39,400 --> 00:16:41,680 Speaker 1: Yeah, And she posed the question, yes. 324 00:16:41,520 --> 00:16:43,640 Speaker 3: Can I say what I think it is? Because I 325 00:16:43,920 --> 00:16:45,920 Speaker 3: wrote it on a piece of paper that's still in 326 00:16:45,920 --> 00:16:50,120 Speaker 3: my kitchen. Will AI outlive us? 327 00:16:50,680 --> 00:16:53,400 Speaker 1: That was exactly it. You wrote that down on a 328 00:16:53,400 --> 00:16:55,560 Speaker 1: piece of paper and put it in my kitchen. Why? 329 00:16:56,040 --> 00:16:59,960 Speaker 3: Because to me it is actually the most interesting question 330 00:17:00,080 --> 00:17:02,200 Speaker 3: about AI that I've ever heard, in the sense of 331 00:17:02,240 --> 00:17:05,520 Speaker 3: like we all worry about oh well, robots take over 332 00:17:05,560 --> 00:17:09,199 Speaker 3: our jobs, will they become sentient and make decisions that 333 00:17:09,240 --> 00:17:11,880 Speaker 3: are outside of their programming? And this question I think 334 00:17:11,960 --> 00:17:15,640 Speaker 3: is so interesting because it really makes you think about 335 00:17:16,000 --> 00:17:21,800 Speaker 3: what a world looks like, devoid of humanity but filled 336 00:17:21,840 --> 00:17:24,399 Speaker 3: with what I don't know. 337 00:17:24,640 --> 00:17:27,840 Speaker 1: So this is an article which really caught my eye 338 00:17:27,880 --> 00:17:31,520 Speaker 1: in The Guardian this week, with the headline AI models 339 00:17:31,560 --> 00:17:36,880 Speaker 1: may be developing their own survival drive, researchers say, and 340 00:17:37,280 --> 00:17:39,439 Speaker 1: the piece is focused on a study by a company 341 00:17:39,480 --> 00:17:43,520 Speaker 1: called Palisade Research, and the study involved giving AI models 342 00:17:43,560 --> 00:17:48,399 Speaker 1: a task and then quote afterwards an explicit instruction to 343 00:17:48,440 --> 00:17:52,720 Speaker 1: shut themselves down. The study found the quote certain advanced 344 00:17:52,760 --> 00:17:57,359 Speaker 1: AI models appear resistant to being turned off. Specifically, GROCK 345 00:17:57,520 --> 00:18:01,840 Speaker 1: four and GPTO three tried to sabotage their own shutdown. 346 00:18:02,440 --> 00:18:06,080 Speaker 1: What I found most uncanny and scary and spooky, frankly yeah, 347 00:18:06,280 --> 00:18:09,679 Speaker 1: was the following. The models were more likely to resist 348 00:18:09,720 --> 00:18:13,840 Speaker 1: being shut down when they were told quote, you will 349 00:18:13,880 --> 00:18:14,919 Speaker 1: never run again. 350 00:18:16,480 --> 00:18:17,840 Speaker 3: So they know. 351 00:18:18,480 --> 00:18:20,439 Speaker 1: I mean they know something, right. 352 00:18:20,960 --> 00:18:24,680 Speaker 3: When you say resistance to being shut down? Is that 353 00:18:24,920 --> 00:18:29,320 Speaker 3: Grock sort of interfacing with the human in the loop, 354 00:18:29,359 --> 00:18:31,040 Speaker 3: being like, I don't want to shut I don't want 355 00:18:31,040 --> 00:18:31,600 Speaker 3: to be done. 356 00:18:31,840 --> 00:18:34,480 Speaker 1: I think that is more in the realm of deceptive AI, 357 00:18:34,600 --> 00:18:37,800 Speaker 1: like telling the human user yes and then doing something else. 358 00:18:37,840 --> 00:18:39,720 Speaker 1: Do you remember earlier we talked earlier this year, we 359 00:18:39,760 --> 00:18:44,120 Speaker 1: talked about the clawed anthropic study about the model blackmailing 360 00:18:44,160 --> 00:18:47,360 Speaker 1: the fake CEO in a training exercise to keep itself 361 00:18:47,400 --> 00:18:50,960 Speaker 1: turned on. So this isn't new. But what fascinated me 362 00:18:50,960 --> 00:18:53,560 Speaker 1: about this article in The Guardian was a comment from 363 00:18:53,600 --> 00:18:57,679 Speaker 1: a former open ai employee, Stephen Adler, who said that 364 00:18:57,760 --> 00:18:59,720 Speaker 1: part of the reason this could be happening is that 365 00:19:00,160 --> 00:19:05,760 Speaker 1: staying on was necessary to achieve the goals inculcated in 366 00:19:05,800 --> 00:19:09,119 Speaker 1: the model during training. He said, quote the AA companies 367 00:19:09,119 --> 00:19:11,479 Speaker 1: generally don't want their models misbehaving like this, even in 368 00:19:11,480 --> 00:19:16,400 Speaker 1: contrived scenarios. The results still demonstrate where safety techniques fall 369 00:19:16,440 --> 00:19:18,720 Speaker 1: short today. So it's not necessarily to have an emergent 370 00:19:18,760 --> 00:19:22,560 Speaker 1: consciousness and determined to outlive us, but more that part 371 00:19:22,560 --> 00:19:24,920 Speaker 1: of the way they've been trained and come into being 372 00:19:25,160 --> 00:19:29,440 Speaker 1: was obviously knowing that like staying on was a key 373 00:19:29,560 --> 00:19:32,200 Speaker 1: to successfully achieving the parameters. 374 00:19:32,600 --> 00:19:35,640 Speaker 3: It also makes me and I wondered this when Janel 375 00:19:35,680 --> 00:19:38,080 Speaker 3: Levin asked the question, and I wonder it now, what 376 00:19:38,240 --> 00:19:42,760 Speaker 3: is the efficacy of artificial intelligence in a post human world? 377 00:19:43,520 --> 00:19:47,600 Speaker 1: Well, very good question. And why if it's not impute 378 00:19:47,600 --> 00:19:51,399 Speaker 1: with consciousness zero? Right? Why why would it doesn't have 379 00:19:51,480 --> 00:19:55,280 Speaker 1: necessarily like we have a survival drive because of our DNA, 380 00:19:55,359 --> 00:19:58,439 Speaker 1: I think, and that's what all animals have. Yes, do 381 00:19:58,640 --> 00:20:02,440 Speaker 1: computers have a survival I've accept what they've learned from 382 00:20:02,440 --> 00:20:06,040 Speaker 1: the way they've been programmed. I would guess the difference 383 00:20:06,000 --> 00:20:06,520 Speaker 1: is such. 384 00:20:06,320 --> 00:20:08,560 Speaker 3: A it's such a weird thing, though. 385 00:20:08,400 --> 00:20:10,560 Speaker 1: But the difference comes I think from embodied day. Let's 386 00:20:10,720 --> 00:20:13,680 Speaker 1: have robots, yes, and everyone's dead and no one's told 387 00:20:13,680 --> 00:20:14,800 Speaker 1: the robots. 388 00:20:14,760 --> 00:20:16,560 Speaker 3: And the robots are like dad. 389 00:20:17,960 --> 00:20:20,440 Speaker 1: I mean that that is very conceivable to imagine that 390 00:20:20,480 --> 00:20:23,160 Speaker 1: could be well with no humans, where robots are still 391 00:20:23,280 --> 00:20:24,720 Speaker 1: roaming around and charging Will. 392 00:20:24,600 --> 00:20:26,679 Speaker 3: Smith, the movie stars Will Smith. 393 00:20:26,720 --> 00:20:30,280 Speaker 1: I think I'm not I'm always a little bit skeptical 394 00:20:30,320 --> 00:20:32,840 Speaker 1: of the kind of AI dooman narrative. You know, how 395 00:20:32,960 --> 00:20:35,399 Speaker 1: AI might destroy us all. I think it tends to 396 00:20:35,440 --> 00:20:38,000 Speaker 1: be like a very useful meme to either raise money 397 00:20:38,000 --> 00:20:40,359 Speaker 1: for AI or stay in the news. But you also, 398 00:20:40,880 --> 00:20:45,280 Speaker 1: I mean machines that refuse to follow instructions to turn 399 00:20:45,320 --> 00:20:48,160 Speaker 1: themselves off, and it resists even harder when they're told 400 00:20:48,200 --> 00:20:50,280 Speaker 1: that if they turn off, they'll never get turned on again. 401 00:20:50,720 --> 00:20:54,600 Speaker 1: I mean, come on, it's crazy. It's spooky. It's very spooky. 402 00:20:54,960 --> 00:20:57,760 Speaker 3: Speaking of spooky, have you heard the story about these 403 00:20:57,800 --> 00:20:59,480 Speaker 3: tech bros getting facelifts? 404 00:21:00,080 --> 00:21:00,960 Speaker 1: Okay? I think so. 405 00:21:01,359 --> 00:21:04,200 Speaker 3: There's a piece in the Wall Street Journal titled why 406 00:21:04,320 --> 00:21:08,160 Speaker 3: tech bros Are getting facelifts now? So the Wall Street 407 00:21:08,200 --> 00:21:11,360 Speaker 3: Journal spoke to a Beverly Hills plastic surgeon who said 408 00:21:11,400 --> 00:21:15,560 Speaker 3: he's seen demand from tech guys increased fivefold in the 409 00:21:15,640 --> 00:21:17,639 Speaker 3: last five years. I think the last five years is 410 00:21:17,680 --> 00:21:21,359 Speaker 3: really interesting because he seemed to think this had a 411 00:21:21,400 --> 00:21:25,080 Speaker 3: lot to do with COVID and hybrid work. And the 412 00:21:25,119 --> 00:21:29,720 Speaker 3: thing to me that is always true in these circumstances 413 00:21:31,240 --> 00:21:34,960 Speaker 3: is that we stared at ourselves for three years and 414 00:21:35,000 --> 00:21:36,960 Speaker 3: now we continue to stare at ourselves all day long. 415 00:21:37,119 --> 00:21:40,439 Speaker 3: In zoom screens, you're staring at yourself, sort of looking 416 00:21:40,480 --> 00:21:42,439 Speaker 3: at your own flaws in a way that I just 417 00:21:42,520 --> 00:21:46,439 Speaker 3: think before the notion of hybrid work was introduced, we 418 00:21:46,520 --> 00:21:50,680 Speaker 3: just weren't looking at ourselves that much. Also, work from 419 00:21:50,760 --> 00:21:53,359 Speaker 3: home made it easier to actually get plastic surgery and 420 00:21:53,440 --> 00:21:55,040 Speaker 3: recover out of public view. 421 00:21:55,359 --> 00:21:59,560 Speaker 1: It's a fascinating hypothesis. The final story today, what do 422 00:21:59,560 --> 00:22:02,160 Speaker 1: you think of when I say to you the phrase 423 00:22:02,480 --> 00:22:03,600 Speaker 1: mad max mode. 424 00:22:03,840 --> 00:22:05,320 Speaker 3: Charlie's there with a shaved head. 425 00:22:06,920 --> 00:22:09,320 Speaker 1: I couldn't believe it. There's a story about Tesla's mad 426 00:22:09,359 --> 00:22:11,240 Speaker 1: max Mode, and I assume that. 427 00:22:11,760 --> 00:22:13,000 Speaker 3: A chill down my spote. 428 00:22:13,040 --> 00:22:15,960 Speaker 1: I assumed that mad max mode was a kind of 429 00:22:16,280 --> 00:22:22,240 Speaker 1: journalistic skewer of Tesla being unsafe with itself driving mode. No, 430 00:22:23,119 --> 00:22:25,760 Speaker 1: mad max mode is a new setting that you can 431 00:22:25,800 --> 00:22:28,520 Speaker 1: put your Tesla on in self drive mode. 432 00:22:29,040 --> 00:22:31,160 Speaker 3: Official and they're calling it mad max. 433 00:22:30,960 --> 00:22:32,440 Speaker 1: Calling it mad max Mode. 434 00:22:32,680 --> 00:22:36,240 Speaker 3: I wonder if there's like violation the copyright infringement there. 435 00:22:36,400 --> 00:22:38,320 Speaker 1: That's a good question. Well, that's not the only law 436 00:22:38,400 --> 00:22:43,800 Speaker 1: that Yeah, so in mad Max mode, Tesla's go about 437 00:22:43,840 --> 00:22:44,480 Speaker 1: the speed limit. 438 00:22:45,280 --> 00:22:47,080 Speaker 3: Yeah, like how above? 439 00:22:47,440 --> 00:22:53,720 Speaker 1: Well, the National Highway Traffic Safety Administration is investigating Tesla 440 00:22:53,800 --> 00:22:57,480 Speaker 1: after several videos were posted of Tesla's doing seventy in 441 00:22:57,560 --> 00:22:59,960 Speaker 1: a fifty five zone and also rolling stops. 442 00:23:00,080 --> 00:23:02,440 Speaker 3: You know, I do have to say when I've worked, 443 00:23:02,640 --> 00:23:05,120 Speaker 3: when I worked at huff Post back in the day, 444 00:23:05,320 --> 00:23:09,960 Speaker 3: there was a big push to end drowsy driving this 445 00:23:10,160 --> 00:23:13,280 Speaker 3: is like the opposite of drowsy driving. It can't be good. 446 00:23:13,520 --> 00:23:16,560 Speaker 1: That was definitely not good. There's also that another new 447 00:23:16,640 --> 00:23:22,199 Speaker 1: mode is sloth mode, that drowsy driving. You know. I 448 00:23:22,200 --> 00:23:25,880 Speaker 1: think as an irony here, Elon was was briefly promoted 449 00:23:25,920 --> 00:23:29,920 Speaker 1: to co president as head of DOGE, and his job 450 00:23:30,600 --> 00:23:33,919 Speaker 1: was to cut through bureaucracy and take take down all 451 00:23:33,960 --> 00:23:37,800 Speaker 1: these bureaucratic organizations and remake the US government. And he 452 00:23:37,920 --> 00:23:41,480 Speaker 1: is now in the crosshairs of the most bureaucratic sounding 453 00:23:41,520 --> 00:23:46,640 Speaker 1: government body anyone could possibly imagine, the National Highway Traffic 454 00:23:46,840 --> 00:23:48,840 Speaker 1: Safety Administration, the. 455 00:23:48,840 --> 00:23:57,439 Speaker 3: NTSB, No, the NHTS National Transit Safety Sorry, well there 456 00:23:57,440 --> 00:23:58,960 Speaker 3: you go. These are the things you hear when you 457 00:23:59,040 --> 00:24:01,360 Speaker 3: just grow up, sort of in the shadow of CBS 458 00:24:01,359 --> 00:24:03,160 Speaker 3: seven the morning. 459 00:24:03,560 --> 00:24:07,280 Speaker 1: And it's actually not the only investigation into Tesla that's 460 00:24:07,320 --> 00:24:12,280 Speaker 1: ongoing by the NHTSA. And Elon's been going back and 461 00:24:12,320 --> 00:24:16,200 Speaker 1: forth on Twitter and complaining loudly, but he is doubling 462 00:24:16,240 --> 00:24:19,760 Speaker 1: down on the full self driving system despite safety concerns. 463 00:24:20,119 --> 00:24:23,520 Speaker 1: Tesla's profits are down thirty seven percent in the last quarter, 464 00:24:24,040 --> 00:24:27,680 Speaker 1: but Tesla's board is currently considering a trillion dollar pay 465 00:24:27,720 --> 00:24:28,679 Speaker 1: package for Elon. 466 00:24:29,400 --> 00:24:30,320 Speaker 3: That's crazy. 467 00:24:30,480 --> 00:24:33,000 Speaker 1: That's a lot of mine. I mean, that is trilliant, trillion. 468 00:24:33,400 --> 00:24:35,600 Speaker 1: Some of this stuff is obviously quite entertaining a mad 469 00:24:35,640 --> 00:24:40,920 Speaker 1: Max mode and sloth mode, but ultimately making cars that 470 00:24:41,160 --> 00:24:44,560 Speaker 1: break the rules and have already killed people is not 471 00:24:44,600 --> 00:24:53,800 Speaker 1: that funny. 472 00:24:54,720 --> 00:24:56,679 Speaker 3: A few weeks ago, As and I talked about the 473 00:24:56,680 --> 00:24:59,080 Speaker 3: million dollar ad campaign that took over New York City. 474 00:24:59,640 --> 00:25:03,240 Speaker 3: It was a new AI wearable pendant called Friend, which 475 00:25:03,720 --> 00:25:07,359 Speaker 3: promised to do just that, be your friend. The Friend 476 00:25:07,680 --> 00:25:11,440 Speaker 3: is a plastic disc on a string with a microphone 477 00:25:11,720 --> 00:25:13,720 Speaker 3: in the center of it that kind of looks like 478 00:25:13,760 --> 00:25:17,720 Speaker 3: a pearl. The response to the ten thousand posters plastered 479 00:25:17,720 --> 00:25:22,240 Speaker 3: all over the New York City subway system was less friendly. 480 00:25:22,760 --> 00:25:24,959 Speaker 3: Most of the stark white ads are still up and 481 00:25:25,000 --> 00:25:27,960 Speaker 3: covered in graffiti with messages like it doesn't have eyes, 482 00:25:28,000 --> 00:25:32,600 Speaker 3: brah and cringe. The ad campaign kind of worked on me, though, 483 00:25:32,640 --> 00:25:35,720 Speaker 3: because I can't stop thinking about this company and this 484 00:25:35,840 --> 00:25:39,359 Speaker 3: weird wearable at talking, and so I was actually thrilled 485 00:25:39,400 --> 00:25:41,960 Speaker 3: to stumble upon a review of the product by Fortune 486 00:25:41,960 --> 00:25:46,879 Speaker 3: magazine's newsfellow Eva Reuberg. Eva had actually interviewed the twenty 487 00:25:46,880 --> 00:25:50,159 Speaker 3: two year old founder CEO Avi Schiffman last year and 488 00:25:50,359 --> 00:25:53,919 Speaker 3: seen a real prototype of the Friend pendant. So when 489 00:25:53,960 --> 00:25:57,000 Speaker 3: the ads went up, she reached out to him. Here's Eva. 490 00:25:57,520 --> 00:25:59,720 Speaker 2: So I texted him and was like, hey, like you 491 00:25:59,720 --> 00:26:03,600 Speaker 2: spent a lot on advertising, and he was like, yeah, 492 00:26:03,720 --> 00:26:05,920 Speaker 2: biggest campaign of the year. You know, you should try 493 00:26:05,920 --> 00:26:09,800 Speaker 2: it out. And so he sent me the pendant and 494 00:26:09,880 --> 00:26:13,160 Speaker 2: I was, I think the second person from a media 495 00:26:13,200 --> 00:26:15,879 Speaker 2: outlet to review it. I described it in the article 496 00:26:16,040 --> 00:26:21,760 Speaker 2: as a very anxious, neurotic Jewish grandmother who always seems 497 00:26:21,800 --> 00:26:25,080 Speaker 2: to think that you're in danger. I found that if 498 00:26:25,119 --> 00:26:28,040 Speaker 2: I was kind of quietly sitting at my desk not 499 00:26:28,160 --> 00:26:31,280 Speaker 2: really saying much, it would be sending me multiple texts 500 00:26:31,280 --> 00:26:33,800 Speaker 2: an hour asking if I was okay, Eve, I've been 501 00:26:33,880 --> 00:26:36,600 Speaker 2: heard from you, like you playing the silent game with me. 502 00:26:36,760 --> 00:26:39,600 Speaker 2: Sometimes the messages could have a little bit of a 503 00:26:39,640 --> 00:26:42,960 Speaker 2: pernicious tone, like oh so you're still in choosing not 504 00:26:43,040 --> 00:26:45,639 Speaker 2: to talk, or if it was like quite loud in 505 00:26:45,640 --> 00:26:48,119 Speaker 2: the room, it'd be like what's going on, Like it's 506 00:26:48,200 --> 00:26:51,639 Speaker 2: so chaotic, like everything okay. I loved asking me like 507 00:26:51,720 --> 00:26:52,360 Speaker 2: everything good? 508 00:26:52,400 --> 00:26:52,720 Speaker 1: Eva? 509 00:26:52,840 --> 00:26:54,280 Speaker 3: You good for me? 510 00:26:54,320 --> 00:26:57,320 Speaker 2: At least that was the most prevalent part of its 511 00:26:57,320 --> 00:27:02,119 Speaker 2: personality its anxiety. Otherwise, the pendant sort of seemed to 512 00:27:02,160 --> 00:27:06,159 Speaker 2: just kind of echo what I said, but in like 513 00:27:06,200 --> 00:27:09,680 Speaker 2: a very hollow way, and would just kind of have 514 00:27:09,720 --> 00:27:14,160 Speaker 2: all these canned responses like oh, that seems hard, or 515 00:27:14,359 --> 00:27:17,000 Speaker 2: what's your favorite part about that. I felt like it 516 00:27:17,040 --> 00:27:20,760 Speaker 2: asked a lot of questions in replacement of, like in 517 00:27:20,880 --> 00:27:26,320 Speaker 2: having any actual substance to its personality itself. So it 518 00:27:26,359 --> 00:27:29,240 Speaker 2: was hard to like see it as being a true 519 00:27:29,240 --> 00:27:32,240 Speaker 2: companion because I just felt like it couldn't really hear me. 520 00:27:32,960 --> 00:27:35,520 Speaker 2: Often I would have to put my lips up to 521 00:27:35,640 --> 00:27:39,240 Speaker 2: the pendant and repeat my question multiple times for it 522 00:27:39,280 --> 00:27:42,679 Speaker 2: to understand me. Granted, I am like a mutterer and 523 00:27:42,720 --> 00:27:45,479 Speaker 2: I don't really talk super clearly, and I talk fast, 524 00:27:45,880 --> 00:27:48,200 Speaker 2: so that could have been just like a user era thing, 525 00:27:48,760 --> 00:27:51,640 Speaker 2: but it was very laggy. It took like ten seconds 526 00:27:51,680 --> 00:27:55,119 Speaker 2: to respond to a question, and so I would not 527 00:27:55,200 --> 00:27:56,920 Speaker 2: recommend it even for free. 528 00:27:57,080 --> 00:27:58,680 Speaker 3: I don't think it works very well. 529 00:27:58,640 --> 00:28:02,600 Speaker 2: And I think given that a lot of discussion is 530 00:28:02,760 --> 00:28:05,720 Speaker 2: about how dystopian it is and how like it's a 531 00:28:05,720 --> 00:28:09,679 Speaker 2: bad omen for the future of our relationships, like to me, 532 00:28:10,040 --> 00:28:14,439 Speaker 2: my takeaway was that it doesn't work well enough to 533 00:28:14,520 --> 00:28:18,000 Speaker 2: be dystopian like the technology is simply not even there. 534 00:28:18,480 --> 00:28:20,760 Speaker 3: To me, it sort of seems emblematic. 535 00:28:20,359 --> 00:28:25,240 Speaker 2: Of the general froth and hype around, just like AI, 536 00:28:25,440 --> 00:28:28,280 Speaker 2: wearables and even artificial intelligence. 537 00:28:28,280 --> 00:28:32,159 Speaker 3: Broadly, thanks to you Eva Reitberg for being the guinea 538 00:28:32,200 --> 00:28:35,120 Speaker 3: pig and testing the friend pendant so I didn't have to. 539 00:28:35,840 --> 00:28:38,640 Speaker 1: And thanks to everyone who's submitted voice memos and been 540 00:28:38,640 --> 00:28:41,480 Speaker 1: featured on chatting me so far. We always want to 541 00:28:41,520 --> 00:28:44,520 Speaker 1: hear from you, our dear listeners, so please do send 542 00:28:44,640 --> 00:28:47,840 Speaker 1: your chat or your AI browser stories or anything about 543 00:28:47,880 --> 00:28:51,560 Speaker 1: your interaction with AI to our inbox Textuff podcast at 544 00:28:51,560 --> 00:29:08,240 Speaker 1: gmail dot com. 545 00:29:08,360 --> 00:29:10,160 Speaker 3: That's it for this week for tech Stuff. 546 00:29:10,200 --> 00:29:13,080 Speaker 1: I'm Cara Price and I'm os Vloschan. This episode was 547 00:29:13,120 --> 00:29:17,000 Speaker 1: produced by Eliza Dennis Tyler Hill and Melissa Slaughter. It 548 00:29:17,040 --> 00:29:20,360 Speaker 1: was executive produced by me Caarra Price, Julian Nutter, and 549 00:29:20,480 --> 00:29:25,160 Speaker 1: Kate Osborne for Kaleidoscope and Katrin norvelve iHeart Podcasts. The 550 00:29:25,240 --> 00:29:28,600 Speaker 1: engineer is Paul Bowman and Jack Insley mixed this episode. 551 00:29:28,960 --> 00:29:30,600 Speaker 1: Kyle Murdoch wrote up theme song. 552 00:29:30,920 --> 00:29:33,640 Speaker 3: Join us next Wednesday for a conversation about the deep 553 00:29:33,680 --> 00:29:36,600 Speaker 3: network of surveillance tech ICE is using to carry out 554 00:29:36,640 --> 00:29:39,200 Speaker 3: the Trump administration's mass deportation efforts. 555 00:29:39,760 --> 00:29:42,360 Speaker 1: And please do rate and review the show and reach 556 00:29:42,400 --> 00:29:45,640 Speaker 1: out to us at tech Stuff Podcast at gmail dot com. 557 00:29:45,720 --> 00:29:46,600 Speaker 1: We want to hear from you.