1 00:00:12,880 --> 00:00:16,000 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is TechStuff. 2 00:00:16,120 --> 00:00:18,200 Speaker 2: I'm Oz Woloshyn, and I'm Karah Preiss. 3 00:00:18,360 --> 00:00:21,119 Speaker 1: Today we're going to get into why gen Z is 4 00:00:21,200 --> 00:00:24,720 Speaker 1: nostalgic for an Internet-free world, a world they've never known. 5 00:00:25,360 --> 00:00:28,040 Speaker 1: An update on what's happening to TikTok in the US. 6 00:00:28,720 --> 00:00:32,240 Speaker 1: Then, are you using chatbots to organize your closet or 7 00:00:32,360 --> 00:00:35,840 Speaker 1: write your wedding vows? We investigate how people are really 8 00:00:35,960 --> 00:00:36,840 Speaker 1: using chatbots. 9 00:00:37,159 --> 00:00:40,040 Speaker 3: This is The Week in Tech. It's Friday, July eleventh. 10 00:00:41,880 --> 00:00:44,400 Speaker 3: Hello, Karah. Hi, Oz. 11 00:00:44,680 --> 00:00:48,760 Speaker 1: It's that time between July Fourth and Labor Day, two 12 00:00:48,800 --> 00:00:51,360 Speaker 1: holidays that I had no conception of before I moved 13 00:00:51,240 --> 00:00:53,240 Speaker 3: to the US. One of which you spell wrong. 14 00:00:54,200 --> 00:00:55,920 Speaker 1: How's that? L-a-b-o-u-r? 15 00:00:55,960 --> 00:00:56,080 Speaker 4: Oh? 16 00:00:56,120 --> 00:00:58,880 Speaker 1: Yeah, there's a U in the script. I'm sorry about that. 17 00:00:59,640 --> 00:01:03,080 Speaker 1: But it's a time known around the world as summer. 18 00:01:03,200 --> 00:01:05,280 Speaker 1: Although looking out of the window today at that dark, 19 00:01:05,319 --> 00:01:07,000 Speaker 1: gray sky, you wouldn't really know it. 20 00:01:07,440 --> 00:01:09,720 Speaker 3: You know, not to be ungrateful about living in the 21 00:01:09,760 --> 00:01:15,760 Speaker 3: greatest city on Earth, but New York has been disgustingly 22 00:01:14,880 --> 00:01:19,880 Speaker 1: hot. Although I haven't been here, so I've been following 23 00:01:19,920 --> 00:01:21,960 Speaker 1: the weather and gloating about having been out of town. 24 00:01:22,040 --> 00:01:25,480 Speaker 3: It's swamp-like. I wanted to bring up a story 25 00:01:25,720 --> 00:01:28,600 Speaker 3: that reminded me that being hot in New York City 26 00:01:29,080 --> 00:01:30,959 Speaker 3: is sometimes the most important thing you can be. 27 00:01:32,040 --> 00:01:34,000 Speaker 1: I have the sense you're not talking about the weather here. 28 00:01:34,160 --> 00:01:38,440 Speaker 3: No, I'm talking about a website called LooksMapping that 29 00:01:38,680 --> 00:01:41,959 Speaker 3: is like a digital heat map of which New York 30 00:01:41,959 --> 00:01:45,160 Speaker 3: City restaurants have the hottest patrons. So, I mean, I 31 00:01:45,200 --> 00:01:49,600 Speaker 3: might be dating myself here, but it's like Zagat for 32 00:01:49,720 --> 00:01:55,680 Speaker 3: people, I thought. I'm not sure. Listeners, I urge you 33 00:01:55,720 --> 00:01:58,520 Speaker 3: to go to looksmapping dot com, because for me, 34 00:01:58,560 --> 00:02:02,120 Speaker 3: and I think for you, the interface is so 35 00:02:02,360 --> 00:02:06,560 Speaker 3: beta, so GeoCities early-Internet, that there's something sort of 36 00:02:06,600 --> 00:02:10,119 Speaker 3: comforting about it. I don't know, just the look 37 00:02:10,120 --> 00:02:10,360 Speaker 3: of it.
38 00:02:10,440 --> 00:02:12,639 Speaker 1: Do you think if the Zagat, or zay-Gat, guide 39 00:02:12,720 --> 00:02:14,799 Speaker 1: was still around today, they would — it had food, 40 00:02:15,440 --> 00:02:19,040 Speaker 1: value for money, and ambiance — have a fourth category, hotness, with 41 00:02:19,000 --> 00:02:22,720 Speaker 3: a little chili emoji. Actually, I've avoided looking at the 42 00:02:22,760 --> 00:02:25,880 Speaker 3: map, because I don't care about eating around hot 43 00:02:25,680 --> 00:02:26,440 Speaker 1: people. What's wrong with you? 44 00:02:26,600 --> 00:02:29,080 Speaker 3: I'm not like... oh, I go... I think other people 45 00:02:29,120 --> 00:02:31,600 Speaker 3: obviously are looking, when they go to a restaurant, to 46 00:02:31,639 --> 00:02:34,640 Speaker 3: find people to date. I'm looking to eat, which is 47 00:02:34,680 --> 00:02:39,119 Speaker 3: crazy. And that is dating myself. I do, as I said, 48 00:02:39,200 --> 00:02:40,840 Speaker 3: want to look at this with you so that we 49 00:02:40,880 --> 00:02:43,600 Speaker 3: can laugh about it. So I'm going to pull up 50 00:02:43,840 --> 00:02:45,000 Speaker 3: your favorite restaurant. 51 00:02:45,240 --> 00:02:47,560 Speaker 1: Okay, well, you know what? We both know what that is. 52 00:02:47,840 --> 00:02:48,920 Speaker 3: What is your favorite restaurant? 53 00:02:48,960 --> 00:02:50,800 Speaker 1: Well, you know very well, because you live next door 54 00:02:50,840 --> 00:02:52,239 Speaker 1: and you often see me on my way there on 55 00:02:52,280 --> 00:02:52,919 Speaker 1: a Friday evening. 56 00:02:53,000 --> 00:02:54,239 Speaker 3: I'm typing it in right now. 57 00:02:54,200 --> 00:02:57,200 Speaker 1: Knickerbocker Bar and Grill. In fact, one of the many 58 00:02:57,200 --> 00:03:00,679 Speaker 1: moments that have solidified our friendship was when you knew 59 00:03:00,720 --> 00:03:03,440 Speaker 1: I was celebrating my birthday there and you called in 60 00:03:03,480 --> 00:03:06,000 Speaker 1: a Cosmopolitan for me, delivered to my table, when you 61 00:03:06,080 --> 00:03:09,800 Speaker 1: weren't in town. Yeah. So, how does the Knickerbocker Bar 62 00:03:09,919 --> 00:03:10,760 Speaker 1: and Grill score? 63 00:03:10,600 --> 00:03:11,840 Speaker 3: It's not... it's not found. 64 00:03:11,919 --> 00:03:14,680 Speaker 1: It's not found on looksmapping dot com. I can't 65 00:03:14,760 --> 00:03:15,440 Speaker 1: believe it. 66 00:03:15,160 --> 00:03:16,240 Speaker 3: It's not here. 67 00:03:16,280 --> 00:03:19,800 Speaker 1: Just for the listeners' benefit, the Knickerbocker Bar and Grill is 68 00:03:19,840 --> 00:03:23,880 Speaker 1: an extremely dated sort of quasi-steakhouse. 69 00:03:24,000 --> 00:03:25,680 Speaker 3: It's what I would call Frasier-core. 70 00:03:25,639 --> 00:03:29,079 Speaker 1: Frasier-core: thick carpet. Nobody else under sixty in there 71 00:03:29,120 --> 00:03:31,880 Speaker 1: apart from me, and sometimes you. So oddly enough, it's 72 00:03:31,919 --> 00:03:34,200 Speaker 1: not featured on LooksMapping. 73 00:03:34,160 --> 00:03:37,320 Speaker 3: Shocking, because they were like, no, they were like, this 74 00:03:37,360 --> 00:03:38,000 Speaker 3: will not appear. 75 00:03:38,080 --> 00:03:40,040 Speaker 1: What about my second favorite restaurant? Do you know what 76 00:03:40,040 --> 00:03:40,240 Speaker 1: that is? 77 00:03:40,240 --> 00:03:43,839 Speaker 3: Peking Duck House. Midtown or downtown one? Because you've 78 00:03:43,840 --> 00:03:45,040 Speaker 3: taken me to the downtown one.
79 00:03:45,600 --> 00:03:47,440 Speaker 1: True. Reluctantly. I do actually like the downtown one. I 80 00:03:47,640 --> 00:03:49,800 Speaker 1: like the Midtown one. Which... also, the downtown one doesn't 81 00:03:49,800 --> 00:03:51,080 Speaker 1: have a carpet. The Midtown one does. 82 00:03:51,120 --> 00:03:56,720 Speaker 3: That's disgusting. Disgusting, yep. Oh, oh, we have... wait. I 83 00:03:56,760 --> 00:03:59,440 Speaker 3: just need to explain to people who are listening: the 84 00:03:59,520 --> 00:04:02,400 Speaker 3: avatar that's standing on the compass on LooksMapping is 85 00:04:02,400 --> 00:04:07,520 Speaker 3: like a yassified, like, lip-injected, hair-flipped 86 00:04:07,600 --> 00:04:09,400 Speaker 1: guy. I'm on the website myself. 87 00:04:09,440 --> 00:04:12,400 Speaker 3: It's so good. He's like, you know, doing duck lips. 88 00:04:12,400 --> 00:04:13,280 Speaker 1: Oh, I see it. 89 00:04:14,720 --> 00:04:18,839 Speaker 3: I'm sorry to say, this is Peking Duck House downtown. 90 00:04:17,480 --> 00:04:22,880 Speaker 3: They don't have... no, it's too ugly. The score for 91 00:04:22,960 --> 00:04:28,680 Speaker 3: Peking Duck House, Chinese, downtown: five points. Down. 92 00:04:29,760 --> 00:04:31,800 Speaker 1: By the way, Peking Duck House downtown is a lot 93 00:04:31,839 --> 00:04:34,760 Speaker 1: hotter than Peking Duck House in Midtown. Negative. 94 00:04:35,120 --> 00:04:36,039 Speaker 2: It would have been a negative. 95 00:04:36,160 --> 00:04:38,760 Speaker 1: Literally. Okay, five points, it said. Well, so how does this... 96 00:04:38,920 --> 00:04:41,120 Speaker 1: I mean, how does this work? How does, how does 97 00:04:41,160 --> 00:04:44,080 Speaker 1: looksmapping dot com assign a hotness score to — 98 00:04:44,760 --> 00:04:46,479 Speaker 1: by the way, not to all New York restaurants, but 99 00:04:46,520 --> 00:04:48,839 Speaker 1: to almost all New York restaurants, evidently. 100 00:04:48,640 --> 00:04:50,680 Speaker 3: I don't think that'd go over too well. So here's what it 101 00:04:50,720 --> 00:04:54,599 Speaker 3: says on the website. Quote — I scraped millions — and this 102 00:04:54,680 --> 00:04:57,200 Speaker 3: is from the creator of the website — I scraped 103 00:04:57,279 --> 00:05:00,560 Speaker 3: millions of Google Maps restaurant reviews, and we gave each 104 00:05:00,600 --> 00:05:05,800 Speaker 3: reviewer's profile picture to an AI model that rates how 105 00:05:05,880 --> 00:05:08,960 Speaker 3: hot they are out of ten. Very smart. I mean, 106 00:05:08,960 --> 00:05:12,160 Speaker 3: that an AI model like that exists. This map shows 107 00:05:12,279 --> 00:05:15,880 Speaker 3: how attractive each restaurant's clientele is, and the model is 108 00:05:15,920 --> 00:05:19,680 Speaker 3: certainly biased. It's certainly flawed. But we judge places — and 109 00:05:19,720 --> 00:05:22,640 Speaker 3: I'm still quoting the creator — we judge places by the 110 00:05:22,680 --> 00:05:25,560 Speaker 3: people who go there. We always have. And are we 111 00:05:25,640 --> 00:05:29,440 Speaker 3: not also flawed? This website just puts reductive numbers on 112 00:05:29,480 --> 00:05:32,880 Speaker 3: the superficial calculations we make every day, a mirror held 113 00:05:33,000 --> 00:05:34,440 Speaker 3: up to our collective vanity. 114 00:05:34,920 --> 00:05:37,480 Speaker 1: End quote. So, you're telling me this website is 115 00:05:37,640 --> 00:05:41,400 Speaker 1: essentially a Balzacian commentary on New York dining.
116 00:05:41,600 --> 00:05:44,480 Speaker 3: I mean, the website itself does look like the website 117 00:05:44,480 --> 00:05:47,160 Speaker 3: of a philosophy department in a small liberal arts college. 118 00:05:47,400 --> 00:05:50,920 Speaker 3: So, yes. I actually read about this in the New 119 00:05:51,000 --> 00:05:52,200 Speaker 3: York Times. 120 00:05:52,000 --> 00:05:53,160 Speaker 2: This weekend. 121 00:05:53,200 --> 00:05:56,040 Speaker 3: And the guy who created it is... he's barely 122 00:05:56,080 --> 00:05:58,440 Speaker 3: a man, he's a boy. He's twenty-two. His name's 123 00:05:58,480 --> 00:06:01,160 Speaker 3: Riley Walz. He's a programmer, and this actually 124 00:06:01,200 --> 00:06:02,279 Speaker 3: isn't his first prank. 125 00:06:02,440 --> 00:06:02,640 Speaker 1: You know. 126 00:06:02,680 --> 00:06:05,400 Speaker 3: A few years ago he created a fake restaurant with 127 00:06:05,440 --> 00:06:10,200 Speaker 3: a near-perfect Google rating called Mehran's Steak House. Cool. And 128 00:06:10,320 --> 00:06:13,120 Speaker 3: so a ton of people signed up to be on 129 00:06:13,160 --> 00:06:16,280 Speaker 3: the wait list to get into a fake restaurant. And 130 00:06:16,320 --> 00:06:19,200 Speaker 3: then actually it was so popular that they opened the 131 00:06:19,240 --> 00:06:21,080 Speaker 3: steakhouse for one night in twenty twenty three. 132 00:06:21,200 --> 00:06:23,080 Speaker 1: I absolutely love this type of story. There was another 133 00:06:23,080 --> 00:06:24,920 Speaker 1: guy who did it in London with a fake restaurant 134 00:06:24,960 --> 00:06:28,000 Speaker 1: called The Shed, and then he served people this disgusting food. 135 00:06:28,440 --> 00:06:31,800 Speaker 1: I think data poisoning to trick thirsty souls into going 136 00:06:31,800 --> 00:06:34,920 Speaker 1: to hyped places is one of the true joys of 137 00:06:34,960 --> 00:06:36,400 Speaker 1: the Internet. This is balm to my soul. 138 00:06:36,720 --> 00:06:40,080 Speaker 3: Yeah, and Walz actually had an AI model scrape two 139 00:06:40,160 --> 00:06:44,440 Speaker 3: point eight million Google reviews from one point five million 140 00:06:44,520 --> 00:06:47,839 Speaker 3: unique accounts. The model identified five hundred and eighty-seven 141 00:06:47,880 --> 00:06:51,920 Speaker 3: thousand profile images with distinguishable faces, and then Walz had 142 00:06:51,920 --> 00:06:54,320 Speaker 3: that model determine if the users were young or old, 143 00:06:54,839 --> 00:06:59,760 Speaker 3: male or female, and hot or not. Very binary, very binary. 144 00:07:00,040 --> 00:07:03,120 Speaker 1: How did the model define hot? 145 00:07:03,240 --> 00:07:06,760 Speaker 3: Well, Walz told The Times that the attractiveness score was, quote, 146 00:07:07,560 --> 00:07:08,799 Speaker 3: admittedly a bit janky. 147 00:07:09,840 --> 00:07:11,600 Speaker 1: I mean, that sounds like a bit of an understatement. 148 00:07:11,960 --> 00:07:14,000 Speaker 1: I do love the idea of sitting down to set 149 00:07:14,000 --> 00:07:17,120 Speaker 1: the parameters for an AI model about hot or not. 150 00:07:19,360 --> 00:07:20,720 Speaker 1: How did Mister Walz go about it? 151 00:07:20,920 --> 00:07:23,280 Speaker 3: So, yeah, he actually gave a few examples.
If someone 152 00:07:23,360 --> 00:07:26,040 Speaker 3: was wearing a wedding dress in their profile image, it 153 00:07:26,080 --> 00:07:28,160 Speaker 3: meant they were hot, because obviously someone wanted to sleep 154 00:07:28,160 --> 00:07:32,040 Speaker 3: with them. And if the photo was blurry, they were 155 00:07:32,080 --> 00:07:34,840 Speaker 3: not, because obviously nobody wanted to take a good photo 156 00:07:34,840 --> 00:07:35,120 Speaker 3: of them. 157 00:07:35,640 --> 00:07:36,240 Speaker 2: So they're ugly. 158 00:07:36,760 --> 00:07:38,800 Speaker 1: I mean, these parameters are not the ones... this is 159 00:07:38,840 --> 00:07:42,480 Speaker 1: not the sort of Leonardo da Vinci facial-symmetry mode 160 00:07:42,560 --> 00:07:43,800 Speaker 1: of assessing beauty. 161 00:07:43,840 --> 00:07:46,840 Speaker 3: I guess. No, it's, like, seemingly quite arbitrary. And he 162 00:07:46,920 --> 00:07:51,320 Speaker 3: actually, unsurprisingly, did not build this for people who are 163 00:07:51,400 --> 00:07:54,400 Speaker 3: using it as a legitimate tool. He, I think, like 164 00:07:54,480 --> 00:07:57,720 Speaker 3: I said, is making a social commentary on how diners 165 00:07:57,760 --> 00:08:01,280 Speaker 3: prioritize whether a restaurant is a scene over the food 166 00:08:01,320 --> 00:08:05,160 Speaker 3: and the atmosphere. And if you go on TikTok, a 167 00:08:05,200 --> 00:08:08,239 Speaker 3: lot of influencers actually talk about if there are hot 168 00:08:08,280 --> 00:08:11,400 Speaker 3: guys or rich guys — like, the rich-guy restaurant is, 169 00:08:11,440 --> 00:08:12,720 Speaker 3: I think, the most important thing. 170 00:08:12,960 --> 00:08:14,680 Speaker 1: It turns out that New York is not the only 171 00:08:14,680 --> 00:08:17,920 Speaker 1: city where the heat map exists. It's also available for 172 00:08:18,200 --> 00:08:20,440 Speaker 1: Los Angeles and San Francisco. 173 00:08:20,080 --> 00:08:22,000 Speaker 3: The only three places where there are hot people. 174 00:08:22,280 --> 00:08:24,640 Speaker 1: But I'm looking at the map right now, and it 175 00:08:24,720 --> 00:08:29,600 Speaker 1: seems like there's more blue, meaning less hotness, up in Harlem. 176 00:08:30,280 --> 00:08:33,120 Speaker 3: Twitter users did pick up on this quickly, and most 177 00:08:33,160 --> 00:08:35,560 Speaker 3: of the denser red pockets of the map were in 178 00:08:35,760 --> 00:08:39,400 Speaker 3: largely white, affluent neighborhoods in all three cities: Los Angeles, 179 00:08:39,400 --> 00:08:40,600 Speaker 3: San Francisco, and New York. 180 00:08:40,760 --> 00:08:43,080 Speaker 1: So, like most AI models, this one has a certain 181 00:08:43,120 --> 00:08:44,920 Speaker 1: amount of racial bias baked in. 182 00:08:45,320 --> 00:08:48,880 Speaker 3: Correct, which is unsurprising. You know, AI is trained on data, 183 00:08:48,920 --> 00:08:51,920 Speaker 3: so its answers reflect the biases of the humans who 184 00:08:51,920 --> 00:08:54,360 Speaker 3: created the data in the first place. And Walz actually 185 00:08:54,360 --> 00:08:56,560 Speaker 3: got criticized for this, but he responded that that is 186 00:08:56,600 --> 00:08:59,480 Speaker 3: part of the point. Again, he says LooksMapping is 187 00:08:59,559 --> 00:09:04,120 Speaker 3: designed to make fun of AI, and he also added 188 00:09:04,120 --> 00:09:06,720 Speaker 3: that one of the ugliest restaurants was actually a country club.
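For readers who want the mechanics spelled out, here is a minimal sketch, in Python, of the kind of pipeline described in the episode: scrape reviews, keep the profile photos with a distinguishable face, score each face, and average the scores per restaurant. The helper names and the two scoring heuristics are illustrative stand-ins based on the examples Walz gave in the interview — this is not his actual code or model.

```python
# Hypothetical sketch of a LooksMapping-style scoring pipeline.
# rate_attractiveness() and score_restaurants() are our own illustrative
# stand-ins, not Walz's real scraper or vision model.
from collections import defaultdict
from statistics import mean

def rate_attractiveness(photo: dict) -> float:
    """Toy stand-in for the 0-10 'hotness' model, using the two heuristics
    Walz mentioned: wedding dress -> hot, blurry photo -> not."""
    score = 5.0
    if photo.get("wedding_dress"):
        score += 3.0  # "obviously someone wanted to sleep with them"
    if photo.get("blurry"):
        score -= 3.0  # "nobody wanted to take a good photo of them"
    return max(0.0, min(10.0, score))

def score_restaurants(reviews: list[dict]) -> dict[str, float]:
    """Average per-reviewer scores into a per-restaurant hotness score."""
    scores = defaultdict(list)
    for review in reviews:
        photo = review.get("profile_photo")
        # Only profiles with a distinguishable face count (587k of 1.5M accounts).
        if photo and photo.get("has_face"):
            scores[review["restaurant"]].append(rate_attractiveness(photo))
    return {name: mean(vals) for name, vals in scores.items()}

# Example: three scraped reviews across two (real) restaurants.
reviews = [
    {"restaurant": "Peking Duck House", "profile_photo": {"has_face": True, "blurry": True}},
    {"restaurant": "Peking Duck House", "profile_photo": {"has_face": True}},
    {"restaurant": "Balthazar", "profile_photo": {"has_face": True, "wedding_dress": True}},
]
print(score_restaurants(reviews))  # {'Peking Duck House': 3.5, 'Balthazar': 8.0}
```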
189 00:09:06,920 --> 00:09:08,280 Speaker 1: I like the sound of this Walz character. 190 00:09:08,280 --> 00:09:11,560 Speaker 3: We should ask him on. Twenty-two years old. Cheeky. So, 191 00:09:11,520 --> 00:09:14,040 Speaker 1: I mentioned I was on vacation last week out of 192 00:09:14,040 --> 00:09:16,959 Speaker 1: the city. I was actually in Greece, and every time 193 00:09:17,040 --> 00:09:21,319 Speaker 1: the sun set, a horde of tourists would run towards the 194 00:09:21,400 --> 00:09:25,640 Speaker 1: most Instagrammable location with their cameras. I was actually fighting 195 00:09:25,640 --> 00:09:29,040 Speaker 1: for space in the courtyard in front of 196 00:09:29,080 --> 00:09:32,000 Speaker 1: a church, to actually look at the sunset, and somebody 197 00:09:32,320 --> 00:09:34,319 Speaker 1: asked, will you take my photo? I said, sure. He said, I'll 198 00:09:34,320 --> 00:09:37,200 Speaker 1: take your photo afterwards. And I said, oh, that's okay, 199 00:09:37,240 --> 00:09:39,280 Speaker 1: I just want to look at the sunset. And my wife 200 00:09:39,760 --> 00:09:43,319 Speaker 1: said, you are so self-righteous. Why do you 201 00:09:43,400 --> 00:09:45,839 Speaker 1: need to shame that poor man? The man was like, 202 00:09:45,920 --> 00:09:49,560 Speaker 1: I meant nothing by it. Exactly. But it kind of raised this 203 00:09:49,600 --> 00:09:52,880 Speaker 1: interesting question about whether the photos were designed to capture 204 00:09:52,880 --> 00:09:55,200 Speaker 1: the trip, as in the past, or whether the trip was 205 00:09:55,240 --> 00:09:57,800 Speaker 1: designed to create an opportunity to take photographs and share 206 00:09:57,840 --> 00:09:59,840 Speaker 1: them on social media. And it was a bit like 207 00:10:00,040 --> 00:10:02,400 Speaker 1: everyone was in their own version of The Truman Show, where 208 00:10:02,400 --> 00:10:04,480 Speaker 1: they were at once the actor, the producer, and the director, 209 00:10:04,520 --> 00:10:06,120 Speaker 1: and it didn't look that fun. 210 00:10:06,720 --> 00:10:09,360 Speaker 3: I think the way we're living in the Truman Show 211 00:10:09,440 --> 00:10:14,040 Speaker 3: now is that we are all living in a world 212 00:10:14,120 --> 00:10:16,800 Speaker 3: that has really been predetermined by what we see on 213 00:10:16,800 --> 00:10:19,960 Speaker 3: social media. So I think LooksMapping was probably trying 214 00:10:20,000 --> 00:10:22,320 Speaker 3: to shine a light on this particular fact, which is 215 00:10:22,320 --> 00:10:25,600 Speaker 3: that we're going to places right now with expectations that 216 00:10:25,640 --> 00:10:28,640 Speaker 3: have been pre-constructed on the Internet. And it's because, 217 00:10:29,679 --> 00:10:33,800 Speaker 3: to steal a line from the French philosopher Louis Althusser, 218 00:10:35,120 --> 00:10:39,480 Speaker 3: we are always already aware of what we're getting, which 219 00:10:39,480 --> 00:10:41,000 Speaker 3: is sort of sad, you know. 220 00:10:41,160 --> 00:10:44,680 Speaker 1: Yeah, I think that's so true. My stepfather, Ricardo, owns a 221 00:10:44,720 --> 00:10:48,160 Speaker 1: restaurant in London called Ricardo's. Shout out — head there — and 222 00:10:48,200 --> 00:10:52,080 Speaker 1: also congratulations: happy thirtieth anniversary.
223 00:10:51,480 --> 00:10:55,880 Speaker 5: Wow, happy anniversary, Ricardo. And Ricardo's is a great 224 00:10:55,880 --> 00:11:00,800 Speaker 5: restaurant that is not built around social media moments. So I 225 00:11:00,840 --> 00:11:05,120 Speaker 5: was encouraging Ricardo to look at a brand refresh where 226 00:11:05,160 --> 00:11:08,920 Speaker 5: he really leaned into being one of the oldest continually 227 00:11:08,960 --> 00:11:11,400 Speaker 5: operated, family-owned Italian restaurants in London. 228 00:11:11,440 --> 00:11:14,920 Speaker 1: He was like, no, I'm good. But it is... I mean, 229 00:11:14,960 --> 00:11:18,200 Speaker 1: this whole thing you're talking about, where real life becomes 230 00:11:18,200 --> 00:11:22,160 Speaker 1: optimized for social media moments rather than social media capturing 231 00:11:22,240 --> 00:11:25,640 Speaker 1: real life, is a little disturbing. 232 00:11:26,400 --> 00:11:30,160 Speaker 3: I think, as elder millennials, which we are, we grew 233 00:11:30,240 --> 00:11:34,520 Speaker 3: up before the I share, therefore I am — to quote 234 00:11:34,520 --> 00:11:37,520 Speaker 3: Sherry Turkle — era began. We live in it, but we 235 00:11:38,040 --> 00:11:41,240 Speaker 3: had life before it, and I think the pressures of 236 00:11:41,280 --> 00:11:43,599 Speaker 3: a life lived online are much harder on gen Z. 237 00:11:44,760 --> 00:11:47,600 Speaker 3: I actually found a survey that backs this up. So 238 00:11:48,559 --> 00:11:53,559 Speaker 3: I'm sure you know about the BSI, the British Standards Institution. 239 00:11:53,280 --> 00:11:56,080 Speaker 1: Of course. Yeah, actually, I just renewed my membership. 240 00:11:56,679 --> 00:12:01,040 Speaker 3: Well, perfect. They've 241 00:12:01,080 --> 00:12:04,959 Speaker 3: actually asked thirteen hundred young Brits, ages sixteen to twenty- 242 00:12:04,960 --> 00:12:07,520 Speaker 3: one, whether they would prefer to be young in a 243 00:12:07,520 --> 00:12:12,680 Speaker 3: world without the Internet. Forty-seven percent of them responded yes. 244 00:12:13,320 --> 00:12:16,240 Speaker 1: That is a classic case of better the devil you 245 00:12:16,280 --> 00:12:18,400 Speaker 1: don't know, in this case, right? Because none of these 246 00:12:18,520 --> 00:12:20,640 Speaker 1: youths have ever lived without the Internet. 247 00:12:21,200 --> 00:12:23,600 Speaker 3: I think there's something kind of sexy to people about 248 00:12:23,640 --> 00:12:27,560 Speaker 3: a time they didn't live in, you know. Like, I'm very 249 00:12:27,559 --> 00:12:31,760 Speaker 3: interested in taking a horse across the country. And I 250 00:12:31,760 --> 00:12:34,600 Speaker 3: think the most interesting thing about this survey is that 251 00:12:34,679 --> 00:12:37,920 Speaker 3: fifty percent of all of those surveyed said that a 252 00:12:37,960 --> 00:12:40,520 Speaker 3: social media curfew would improve their lives. 253 00:12:40,600 --> 00:12:42,559 Speaker 1: What do you mean by social media curfew? 254 00:12:42,600 --> 00:12:46,199 Speaker 3: So actually the UK, which has a technology secretary, has 255 00:12:46,360 --> 00:12:50,880 Speaker 3: hinted that the government is considering — I love this — mandatory 256 00:12:50,920 --> 00:12:53,800 Speaker 3: cutoff times for certain apps, like TikTok and Instagram, for 257 00:12:53,920 --> 00:12:55,040 Speaker 3: children under a certain age.
258 00:12:55,040 --> 00:12:58,440 Speaker 1: It's like during the Blitz, when everyone just, like... 259 00:12:59,320 --> 00:13:02,760 Speaker 1: it's just like the... well, funny enough, on my trip, 260 00:13:02,800 --> 00:13:06,040 Speaker 1: which I'm sorry to keep boasting about, I gave myself 261 00:13:06,120 --> 00:13:08,199 Speaker 1: a week-long blackout, or curfew. 262 00:13:08,320 --> 00:13:08,800 Speaker 3: It was weird. 263 00:13:09,080 --> 00:13:12,000 Speaker 1: Yeah. So I removed Gmail and Slack from my phone 264 00:13:12,400 --> 00:13:14,720 Speaker 1: so that I wouldn't be tempted to glance at work 265 00:13:14,720 --> 00:13:17,840 Speaker 1: stuff and stress myself out. I kept texts, obviously, and 266 00:13:17,880 --> 00:13:20,240 Speaker 1: you and I texted a fair amount about tech stuff, 267 00:13:20,559 --> 00:13:23,440 Speaker 1: but other than that, I was offline. I can't tell 268 00:13:23,520 --> 00:13:26,400 Speaker 1: you how good it felt to be not looking at 269 00:13:26,520 --> 00:13:29,600 Speaker 1: Slack and work Gmail for a week. And how bad 270 00:13:29,640 --> 00:13:31,319 Speaker 1: it felt to start again on Monday. 271 00:13:32,000 --> 00:13:35,599 Speaker 3: You know, as you and maybe our audience have recognized — 272 00:13:35,640 --> 00:13:38,240 Speaker 3: you've handled it so gracefully — I have been in and 273 00:13:38,240 --> 00:13:41,040 Speaker 3: out of the show this year, and some of the 274 00:13:41,080 --> 00:13:42,839 Speaker 3: reason that I've been taking time off is on account of 275 00:13:42,880 --> 00:13:47,440 Speaker 3: some personal mental health struggles, and as such, I was 276 00:13:47,440 --> 00:13:50,120 Speaker 3: actually forced into a bit of a curfew myself. It 277 00:13:50,160 --> 00:13:53,640 Speaker 3: was more of a sensory deprivation thing, but there were some 278 00:13:53,720 --> 00:13:55,880 Speaker 3: days where I actually only had my phone for one 279 00:13:55,960 --> 00:13:59,280 Speaker 3: hour a day. And the thing that it made me 280 00:13:59,360 --> 00:14:03,600 Speaker 3: realize is that the iPhone is a rapacious creditor. 281 00:14:04,040 --> 00:14:04,680 Speaker 1: What does that mean? 282 00:14:04,920 --> 00:14:10,160 Speaker 3: It means that the more you use it, the more 283 00:14:10,200 --> 00:14:16,040 Speaker 3: you need to use it. And yeah, it is like sugar. 284 00:14:16,400 --> 00:14:19,120 Speaker 3: It is like cigarettes, where it's like we like to 285 00:14:19,640 --> 00:14:26,640 Speaker 3: brag when we have abstained. And my family actually refers 286 00:14:26,680 --> 00:14:29,200 Speaker 3: to my cell phone addiction as a legitimate addiction. And 287 00:14:29,240 --> 00:14:30,360 Speaker 3: I've been intervened on. 288 00:14:30,640 --> 00:14:33,520 Speaker 1: What was your experience, then, of having twenty-three hours 289 00:14:33,520 --> 00:14:34,040 Speaker 1: a day off? 290 00:14:34,320 --> 00:14:37,280 Speaker 3: I was so jacked up. I was so high from 291 00:14:37,360 --> 00:14:39,600 Speaker 3: using my phone for that one hour that, like, I 292 00:14:39,640 --> 00:14:43,000 Speaker 3: would give the phone back and I felt like my 293 00:14:43,120 --> 00:14:46,320 Speaker 3: brain was on fire. And I think when you have 294 00:14:46,400 --> 00:14:49,200 Speaker 3: access to a phone all day long, it sort of 295 00:14:49,200 --> 00:14:51,600 Speaker 3: spreads itself out.
You don't feel like you're at a 296 00:14:51,640 --> 00:14:54,000 Speaker 3: loss because you don't have it, but when you don't 297 00:14:54,040 --> 00:14:56,320 Speaker 3: have it, and then it's something you get, you realize 298 00:14:56,400 --> 00:14:58,080 Speaker 3: how jacked up this thing makes you. 299 00:14:58,720 --> 00:15:01,240 Speaker 1: It was interesting turning my phone back on on Monday 300 00:15:01,280 --> 00:15:05,040 Speaker 1: morning and reinstalling Gmail and Slack. I literally reinstalled them, 301 00:15:05,400 --> 00:15:07,800 Speaker 1: and four hours went by where I didn't look up 302 00:15:07,800 --> 00:15:10,920 Speaker 1: from my phone. But I felt so broken by the 303 00:15:11,040 --> 00:15:11,360 Speaker 1: end of that. 304 00:15:11,520 --> 00:15:13,920 Speaker 3: Yeah. Coming back to the survey, the thing that really 305 00:15:13,960 --> 00:15:17,280 Speaker 3: stood out to me was actually how often young people, 306 00:15:17,920 --> 00:15:21,720 Speaker 3: especially young women, are comparing their appearance or lifestyle to others. 307 00:15:22,280 --> 00:15:25,560 Speaker 3: In the survey, eighty-five percent of female respondents said 308 00:15:25,600 --> 00:15:29,280 Speaker 3: they do this at least sometimes, and nearly half of 309 00:15:29,360 --> 00:15:32,400 Speaker 3: young women are doing this often or very often. 310 00:15:32,560 --> 00:15:36,000 Speaker 1: Yeah, but this idea of comparing yourself and your appearance 311 00:15:36,000 --> 00:15:38,320 Speaker 1: and your lifestyle to others — this is what the social 312 00:15:38,320 --> 00:15:40,320 Speaker 1: commentary of LooksMapping is all about. 313 00:15:40,720 --> 00:15:44,600 Speaker 3: That's very true. I actually originally found these survey results 314 00:15:44,680 --> 00:15:47,760 Speaker 3: buried in an opinion piece from the Guardian. The author 315 00:15:48,000 --> 00:15:51,000 Speaker 3: is Isabel Brooks. She's a bit older than the people 316 00:15:51,000 --> 00:15:53,960 Speaker 3: who were surveyed — she's twenty-six — but she admits to 317 00:15:54,000 --> 00:15:56,960 Speaker 3: being entranced by an old video that went viral recently 318 00:15:57,480 --> 00:16:02,720 Speaker 3: of Wesleyan alumni, the band MGMT, performing for their peers 319 00:16:02,720 --> 00:16:03,720 Speaker 3: at Wesleyan University. 320 00:16:03,760 --> 00:16:09,080 Speaker 1: I remember seeing MGMT performing extremely reluctantly at a concert, 321 00:16:09,120 --> 00:16:12,160 Speaker 1: a graduation concert at Yale, and they just looked so 322 00:16:12,280 --> 00:16:13,280 Speaker 1: bummed out to be there. 323 00:16:13,480 --> 00:16:17,880 Speaker 3: They were like, we miss Wesleyan. Yeah. And it's 324 00:16:17,920 --> 00:16:20,240 Speaker 3: not the music that she's drawn to, it's the crowd. 325 00:16:21,040 --> 00:16:24,880 Speaker 3: She observes, quote, no one is dressed that well. Judgy. 326 00:16:25,040 --> 00:16:29,680 Speaker 3: The camera zooms unsteadily to capture the crowd's awkwardness, slumped 327 00:16:29,760 --> 00:16:34,280 Speaker 3: shoulders and arrhythmic bopping. Beyond the footage we're watching, 328 00:16:34,960 --> 00:16:37,000 Speaker 3: no one seems to be filming. 329 00:16:37,480 --> 00:16:38,160 Speaker 1: That's interesting. 330 00:16:38,360 --> 00:16:40,400 Speaker 3: Yeah, and that part was key for her.
No one 331 00:16:40,440 --> 00:16:44,720 Speaker 3: in the video is on their phone or filming the experience, except, 332 00:16:44,760 --> 00:16:46,480 Speaker 3: of course, the person who captured the video. But 333 00:16:47,240 --> 00:16:50,280 Speaker 3: Isabel goes on to say, quote, I was only four 334 00:16:50,400 --> 00:16:53,760 Speaker 3: when the video was filmed, so why does watching it 335 00:16:53,800 --> 00:16:57,360 Speaker 3: make me feel as if I've lost a whole world? So, 336 00:16:57,720 --> 00:17:00,960 Speaker 3: you know, she feels like she's missed out on this 337 00:17:01,200 --> 00:17:05,840 Speaker 3: pre-social Internet, where people were more authentic and unique. 338 00:17:05,960 --> 00:17:09,879 Speaker 3: And I think, you know, I was in college before 339 00:17:09,920 --> 00:17:11,760 Speaker 3: you would film something on a phone. We would take 340 00:17:11,760 --> 00:17:15,280 Speaker 3: photos on our BlackBerrys maybe, or you would take camera pictures. 341 00:17:15,320 --> 00:17:17,080 Speaker 3: And now there are some concerts that I've actually been 342 00:17:17,119 --> 00:17:20,119 Speaker 3: to where the artists will say, don't take out your 343 00:17:20,160 --> 00:17:22,080 Speaker 3: phone for this, you know — just, everybody, put your phone 344 00:17:22,080 --> 00:17:26,000 Speaker 3: down and watch. The saddest part about it for me, 345 00:17:26,440 --> 00:17:32,080 Speaker 3: and this I actually feel guilty of, is how difficult 346 00:17:32,119 --> 00:17:33,720 Speaker 3: it is for people to just 347 00:17:33,880 --> 00:17:37,320 Speaker 1: be. Well, it's not just us. You mentioned the UK 348 00:17:37,440 --> 00:17:41,800 Speaker 1: government is looking at this TikTok curfew, and I'm very 349 00:17:41,840 --> 00:17:46,960 Speaker 1: interested to know whether regulation — government regulation — will address the 350 00:17:46,960 --> 00:17:50,679 Speaker 1: harms of social media before the next generation become teenagers. 351 00:17:50,920 --> 00:17:52,359 Speaker 1: And I do wonder if we'll look back on this 352 00:17:52,400 --> 00:17:54,480 Speaker 1: time that we're living through now as kind of the 353 00:17:54,520 --> 00:17:59,119 Speaker 1: time where doctors were prescribing cigarettes to people with a cough, 354 00:17:59,240 --> 00:18:02,000 Speaker 1: and whether we'll be like, it was so crazy that 355 00:18:02,040 --> 00:18:04,800 Speaker 1: we allowed a whole generation to have their minds ruined 356 00:18:04,800 --> 00:18:08,800 Speaker 1: by trillion-dollar corporations. Who knows? But the policy thing 357 00:18:08,840 --> 00:18:11,720 Speaker 1: I'm interested in: here in the US, a TikTok ban 358 00:18:11,800 --> 00:18:14,640 Speaker 1: has been mooted, not because of children's health, of course, 359 00:18:14,680 --> 00:18:18,359 Speaker 1: but because of geopolitics. China. China, exactly. So in April 360 00:18:18,440 --> 00:18:21,680 Speaker 1: last year, President Biden signed a bill requiring that TikTok 361 00:18:21,760 --> 00:18:24,360 Speaker 1: be sold to a US buyer or be shut down. 362 00:18:24,880 --> 00:18:27,480 Speaker 1: The ban very briefly took effect the day before President 363 00:18:27,480 --> 00:18:30,640 Speaker 1: Trump was inaugurated in January.
I'll never forget: TikTok only 364 00:18:30,680 --> 00:18:34,600 Speaker 1: went dark for a single evening, because upon taking office, 365 00:18:34,760 --> 00:18:37,720 Speaker 1: Trump immediately signed an executive order for a so-called 366 00:18:38,119 --> 00:18:41,840 Speaker 1: enforcement delay, and has signed two more of these since then. 367 00:18:42,080 --> 00:18:44,320 Speaker 1: So the deadline keeps getting pushed back, because the Trump 368 00:18:44,359 --> 00:18:47,280 Speaker 1: administration is working on a deal for a consortium of 369 00:18:47,359 --> 00:18:52,120 Speaker 1: non-Chinese investors to take over TikTok's US operations. Details 370 00:18:52,119 --> 00:18:55,600 Speaker 1: about the deal remain unannounced, but The Information broke some 371 00:18:55,720 --> 00:18:58,080 Speaker 1: news this week that makes the future of TikTok in 372 00:18:58,160 --> 00:19:01,760 Speaker 1: America appear secure. Apparently, there's a new version of the 373 00:19:01,840 --> 00:19:06,160 Speaker 1: app being built specifically for US users. The US-only 374 00:19:06,200 --> 00:19:09,080 Speaker 1: TikTok app is set to launch on September fifth, and 375 00:19:09,240 --> 00:19:12,600 Speaker 1: US residents will apparently have until March twenty twenty-six 376 00:19:12,920 --> 00:19:15,520 Speaker 1: to download it and get off the old app. There 377 00:19:15,600 --> 00:19:18,640 Speaker 1: is one wrinkle, which is that the Chinese government will 378 00:19:18,640 --> 00:19:21,399 Speaker 1: also have to approve this structure, and apparently a 379 00:19:21,440 --> 00:19:24,840 Speaker 1: deal was actually close, and then the tariff announcements earlier 380 00:19:24,840 --> 00:19:26,960 Speaker 1: this year derailed it. You know that I have this 381 00:19:27,040 --> 00:19:30,119 Speaker 1: kind of geopolitics bent. So while we're at it, there 382 00:19:30,160 --> 00:19:32,439 Speaker 1: was another story this week. It was in 404 383 00:19:32,440 --> 00:19:38,879 Speaker 1: Media, and it brought together diplomacy, cryptocurrency, personal appearance, and 384 00:19:38,920 --> 00:19:43,639 Speaker 1: the Internet making collective decisions, à la LooksMapping. Tell me, 385 00:19:44,520 --> 00:19:45,680 Speaker 1: have you heard of Polymarket? 386 00:19:46,720 --> 00:19:49,120 Speaker 3: I mean, it does sound like a dating app for polyamory. 387 00:19:49,880 --> 00:19:53,960 Speaker 1: Polymarket is not a dating market for the polyamorous. 388 00:19:54,200 --> 00:19:58,119 Speaker 1: It is a sort of online betting marketplace where you 389 00:19:58,160 --> 00:20:01,360 Speaker 1: can basically lay down a wager on almost anything other 390 00:20:01,440 --> 00:20:03,760 Speaker 1: users can come up with. You basically bet on yes- 391 00:20:03,840 --> 00:20:07,840 Speaker 1: no questions using crypto, against other users rather than against 392 00:20:07,880 --> 00:20:11,959 Speaker 1: a bookmaker, and then this decentralized system facilitates the payouts. 393 00:20:12,280 --> 00:20:14,919 Speaker 1: It got very big during the election last year, and 394 00:20:15,000 --> 00:20:17,520 Speaker 1: right now on the site, some of the bets going 395 00:20:17,600 --> 00:20:21,040 Speaker 1: run the gamut from Wimbledon results to will a hurricane 396 00:20:21,040 --> 00:20:24,040 Speaker 1: make landfall in the US before August?
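Since the next part of the episode turns on how one of these markets settled, here is a minimal sketch of how a yes/no prediction market of the kind described can work: shares trade between zero and one dollar against other users (the price reading as the crowd's implied probability), and an oracle's verdict decides which side's shares pay out one dollar each. This is an illustrative simplification under stated assumptions, not Polymarket's actual contract logic; all names are hypothetical.

```python
# Hypothetical sketch of a binary (yes/no) prediction market settlement.
# Not Polymarket's real mechanism — just the general share-based idea.
from dataclasses import dataclass, field

@dataclass
class BinaryMarket:
    question: str
    positions: dict = field(default_factory=dict)  # user -> (side, shares, price_paid)

    def buy(self, user: str, side: str, shares: float, price: float) -> None:
        """Buy YES or NO shares at the current market price (between $0 and $1)."""
        assert side in ("YES", "NO") and 0 < price < 1
        self.positions[user] = (side, shares, price)

    def resolve(self, oracle_verdict: str) -> dict:
        """An oracle settles the question: winning shares pay $1, losing shares $0."""
        payouts = {}
        for user, (side, shares, price) in self.positions.items():
            payout = shares * 1.0 if side == oracle_verdict else 0.0
            payouts[user] = payout - shares * price  # net profit or loss
        return payouts

# The suit bet: YES traded cheap, so the dispute over one outfit decided who got paid.
market = BinaryMarket("Will Zelensky wear a suit before July?")
market.buy("alice", "YES", shares=100, price=0.20)  # alice thinks a suit is likely
market.buy("bob", "NO", shares=100, price=0.80)
print(market.resolve("NO"))  # {'alice': -20.0, 'bob': 20.0}
```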
But a bet 397 00:20:24,080 --> 00:20:27,040 Speaker 1: that's really blown up, and has over two hundred million 398 00:20:27,119 --> 00:20:31,199 Speaker 1: dollars in crypto riding on it, is: will Ukrainian President 399 00:20:31,320 --> 00:20:35,560 Speaker 1: Zelensky wear a suit before July? Zelensky said he would 400 00:20:35,600 --> 00:20:39,359 Speaker 1: not wear a suit until Russia's war in Ukraine ended, 401 00:20:39,800 --> 00:20:42,840 Speaker 1: and so he pretty much always wears these military-style fatigues, 402 00:20:43,240 --> 00:20:44,720 Speaker 1: and he says that he does so to remind the 403 00:20:44,760 --> 00:20:48,399 Speaker 1: world that Ukraine is still in the midst of a war. So, 404 00:20:48,880 --> 00:20:52,680 Speaker 1: of course, the users of Polymarket turned Zelensky's dress into 405 00:20:52,720 --> 00:20:54,720 Speaker 1: a wager. I don't know if this was a proxy 406 00:20:54,720 --> 00:20:57,120 Speaker 1: wager for whether or not the war in Ukraine would end, 407 00:20:57,320 --> 00:21:01,720 Speaker 1: or a wager on how committed to his principles Zelensky is. 408 00:21:01,760 --> 00:21:04,280 Speaker 1: But here's what happened. The bet went live in May, 409 00:21:04,520 --> 00:21:07,520 Speaker 1: and then, towards the end of June, Zelensky showed up 410 00:21:07,520 --> 00:21:09,600 Speaker 1: at a NATO summit wearing something 411 00:21:09,600 --> 00:21:14,280 Speaker 3: I'm going to show you now. Oh, looking hot. 412 00:21:14,440 --> 00:21:15,119 Speaker 1: What's he wearing? 413 00:21:15,480 --> 00:21:18,879 Speaker 3: He's wearing... he's dressed in all black. He's sort of 414 00:21:18,960 --> 00:21:24,720 Speaker 3: dressed like a cater-waiter. All black. He's wearing a suit 415 00:21:24,960 --> 00:21:25,720 Speaker 3: and a black shirt. 416 00:21:25,760 --> 00:21:29,720 Speaker 1: You think it's a suit? Yeah. Well, you've waded into 417 00:21:29,760 --> 00:21:36,280 Speaker 1: a huge controversy. Polymarket right now has been roiled by 418 00:21:36,320 --> 00:21:38,960 Speaker 1: whether or not this is a suit. Now, to be fair, 419 00:21:39,560 --> 00:21:44,160 Speaker 1: he's sort of wearing, you know, cargo-style pants. He's 420 00:21:44,200 --> 00:21:46,920 Speaker 1: not wearing dress shoes, he's not wearing a shirt and tie. 421 00:21:46,960 --> 00:21:52,119 Speaker 1: He's wearing a kind of little mini-blazer, and the 422 00:21:52,320 --> 00:21:55,199 Speaker 1: jacket has four pockets on it, which kind of have 423 00:21:55,240 --> 00:21:58,399 Speaker 1: these over-the-top flaps. So you could argue that 424 00:21:58,400 --> 00:22:01,800 Speaker 1: it's more of a kind of formal military fatigue for 425 00:22:01,920 --> 00:22:03,639 Speaker 1: nighttime operations than 426 00:22:03,960 --> 00:22:05,640 Speaker 3: a pure suit. Black-tie fatigues. 427 00:22:05,920 --> 00:22:09,280 Speaker 1: And this is... this is roiling Polymarket right now. There 428 00:22:09,280 --> 00:22:11,520 Speaker 1: are hundreds of millions of dollars at stake on whether 429 00:22:11,600 --> 00:22:13,000 Speaker 1: or not this is a suit. 430 00:22:13,160 --> 00:22:15,160 Speaker 3: Literally, hundreds of millions of dollars has
431 00:22:15,119 --> 00:22:17,399 Speaker 1: been bet. And so there's now this formal dispute on 432 00:22:17,520 --> 00:22:22,000 Speaker 1: Polymarket, and that triggered an internal review, because it wasn't 433 00:22:22,040 --> 00:22:24,400 Speaker 1: clear if this was a yes or a no, suit or no suit, 434 00:22:24,840 --> 00:22:28,640 Speaker 1: and therefore the site's so-called oracles were called in. 435 00:22:29,040 --> 00:22:32,040 Speaker 1: The oracles get to debate facts and come up with a verdict, 436 00:22:32,240 --> 00:22:36,760 Speaker 1: and after deliberating live on Discord, the oracles determined that 437 00:22:36,840 --> 00:22:41,200 Speaker 1: this was not a suit. However, there were some accusations 438 00:22:41,200 --> 00:22:46,520 Speaker 1: that the oracles were also bettors, and so Polymarket 439 00:22:46,560 --> 00:22:49,480 Speaker 1: took the dispute back a second time, and on Tuesday 440 00:22:49,520 --> 00:22:51,800 Speaker 1: this week they announced their final verdict. 441 00:22:53,440 --> 00:22:54,639 Speaker 3: And what was the final verdict? 442 00:22:54,840 --> 00:22:58,639 Speaker 1: No suit. Yeah, you'll be out of pocket. 443 00:22:58,840 --> 00:22:59,920 Speaker 3: I would be very out of pocket. 444 00:23:00,520 --> 00:23:03,160 Speaker 1: What's interesting to me about this story, beyond the fact 445 00:23:03,200 --> 00:23:06,560 Speaker 1: that it's about crypto and how the Internet makes collective 446 00:23:06,600 --> 00:23:11,600 Speaker 1: decisions, is this quote from Polymarket's founder Shayne Coplan, and 447 00:23:11,640 --> 00:23:14,000 Speaker 1: he basically said these bets are about more than making 448 00:23:14,040 --> 00:23:17,919 Speaker 1: a quick buck. He called Polymarket, quote, the future of 449 00:23:18,000 --> 00:23:21,000 Speaker 1: news, and said that the next information age, also a 450 00:23:21,000 --> 00:23:24,760 Speaker 1: quote, won't be driven by the twentieth century's media monoliths. 451 00:23:25,080 --> 00:23:26,320 Speaker 1: It will be driven by markets. 452 00:23:26,720 --> 00:23:29,720 Speaker 3: To be fair, the story of recent election cycles has 453 00:23:29,800 --> 00:23:33,439 Speaker 3: been that the betting markets are more accurate than the polls, 454 00:23:33,440 --> 00:23:36,159 Speaker 3: which is crazy. And I'm pulling up — what is it, 455 00:23:36,240 --> 00:23:37,560 Speaker 3: polymarket dot com? 456 00:23:37,600 --> 00:23:38,639 Speaker 1: Polymarket dot com. 457 00:23:38,760 --> 00:23:41,080 Speaker 3: Number one is the New York City mayoral election, which 458 00:23:41,119 --> 00:23:44,439 Speaker 3: is interesting. There's three million dollars on the Fed decision. 459 00:23:44,480 --> 00:23:47,480 Speaker 3: I guess... oh, the way that this is laid 460 00:23:47,520 --> 00:23:51,359 Speaker 3: out on Polymarket is that the first thing is 461 00:23:51,359 --> 00:23:54,960 Speaker 3: the highest bet — what has the largest wager volume. Yeah, yeah, 462 00:23:55,040 --> 00:23:57,040 Speaker 3: yeah, yeah. And that's the New York City mayoral election, 463 00:23:57,080 --> 00:23:58,080 Speaker 3: which is really interesting. 464 00:23:58,280 --> 00:24:01,359 Speaker 1: There's also stuff here about whether the Trump Epstein files 465 00:24:01,359 --> 00:24:04,440 Speaker 1: will get released in twenty twenty-five, which, surprisingly, only 466 00:24:04,480 --> 00:24:05,600 Speaker 1: thirteen percent 467 00:24:05,600 --> 00:24:08,480 Speaker 3: say yes. Trump says that there are no files.
468 00:24:08,560 --> 00:24:11,480 Speaker 1: Yeah, there are no files. So, you know, this kind 469 00:24:11,520 --> 00:24:15,240 Speaker 1: of allows you to be an active participant in consuming 470 00:24:15,280 --> 00:24:17,399 Speaker 1: the news, in fact, because you're betting on the news 471 00:24:17,680 --> 00:24:20,359 Speaker 1: at all times, which is pretty interesting. We're going to 472 00:24:20,359 --> 00:24:22,560 Speaker 1: take a quick break now, but when we come back, 473 00:24:23,240 --> 00:24:26,560 Speaker 1: a key question: are you the only human in the 474 00:24:26,640 --> 00:24:44,240 Speaker 1: Zoom meeting? Stay with us. Welcome back. We've got a 475 00:24:44,240 --> 00:24:46,840 Speaker 1: few more stories that have caught our eye to share, 476 00:24:47,440 --> 00:24:50,320 Speaker 1: and then we'll be starting something new through the summer 477 00:24:50,520 --> 00:24:52,639 Speaker 1: in place of the tech support interviews. 478 00:24:53,119 --> 00:24:56,479 Speaker 3: It's a new segment called, drum roll... no, it's not 479 00:24:56,480 --> 00:24:58,919 Speaker 3: called Drum Roll. It's called Chat and 480 00:24:58,800 --> 00:25:00,959 Speaker 1: Me. Chat and Me. Stick around, don't miss it. 481 00:25:01,320 --> 00:25:03,760 Speaker 3: In the meantime, I have an observation, which is that 482 00:25:04,040 --> 00:25:08,720 Speaker 3: you spend most of your days in meetings, in Zoom calls, 483 00:25:09,040 --> 00:25:12,879 Speaker 3: and those meetings and Zoom calls perhaps could be just 484 00:25:12,920 --> 00:25:16,920 Speaker 3: a Slack message or an email or even a text message. 485 00:25:16,880 --> 00:25:20,720 Speaker 1: My therapist, my executive coach, my best friend, my wife, 486 00:25:21,080 --> 00:25:21,800 Speaker 1: or just my co- 487 00:25:21,760 --> 00:25:23,800 Speaker 3: host? No, that's what a chatbot can actually be. 488 00:25:24,840 --> 00:25:27,560 Speaker 1: Yes. I mean, of course. Remember, during the pandemic it 489 00:25:27,640 --> 00:25:29,840 Speaker 1: was like, oh great, Zoom calls, we can work from anywhere. 490 00:25:30,160 --> 00:25:31,720 Speaker 1: And now it's a bit like cell phones: first, 491 00:25:31,720 --> 00:25:33,480 Speaker 1: oh great, I can always be in contact; then, gosh, 492 00:25:33,520 --> 00:25:35,800 Speaker 1: now I have to do Zoom calls just all the time. 493 00:25:36,000 --> 00:25:39,320 Speaker 1: And it went from feeling like liberation to feeling like 494 00:25:39,400 --> 00:25:40,840 Speaker 1: kind of a digital jail cell. 495 00:25:41,200 --> 00:25:43,800 Speaker 3: I think many people feel that way. But now people 496 00:25:43,880 --> 00:25:46,560 Speaker 3: are starting to think, what if I can skip all 497 00:25:46,560 --> 00:25:49,400 Speaker 3: those Zoom meetings and still get all the relevant information 498 00:25:49,600 --> 00:25:52,280 Speaker 3: that I need? So the Washington Post actually 499 00:25:52,320 --> 00:25:55,440 Speaker 3: wrote about the increased presence of AI note-takers in meetings — 500 00:25:55,480 --> 00:25:57,520 Speaker 3: like, oh, I can't come to this meeting, I'll send 501 00:25:57,520 --> 00:26:00,000 Speaker 3: my AI note-taker. And they even talked to 502 00:26:00,000 --> 00:26:04,040 Speaker 3: people who have been in meetings where robots, or AI 503 00:26:04,160 --> 00:26:05,920 Speaker 3: note-takers, have outnumbered humans.
504 00:26:06,000 --> 00:26:07,840 Speaker 1: You know, I always get a little bit annoyed when 505 00:26:07,880 --> 00:26:10,960 Speaker 1: there's an AI note-taker in the room and no 506 00:26:11,000 --> 00:26:13,560 Speaker 1: one's asked my consent. Like, this is... you know, I 507 00:26:13,560 --> 00:26:16,080 Speaker 1: think it would be a courtesy to say, do you 508 00:26:16,119 --> 00:26:19,080 Speaker 1: mind if I transcribe everything you say and keep it forever? 509 00:26:19,320 --> 00:26:21,160 Speaker 1: It's this thing about using the Internet where you're kind 510 00:26:21,160 --> 00:26:25,280 Speaker 1: of creating this constant chain of data that follows you 511 00:26:25,320 --> 00:26:28,120 Speaker 1: around forever. And I don't really want everything I've ever 512 00:26:28,160 --> 00:26:31,359 Speaker 1: said in a Zoom call to be permanently memorialized and 513 00:26:31,359 --> 00:26:34,480 Speaker 1: then owned by God knows who, but it's becoming the standard. 514 00:26:34,520 --> 00:26:38,120 Speaker 1: I mean, Zoom, Microsoft Teams, Google Meet — they all offer 515 00:26:38,160 --> 00:26:41,120 Speaker 1: these note-taking features that can record, transcribe, and use 516 00:26:41,160 --> 00:26:45,080 Speaker 1: AI to summarize meetings. That said, I haven't yet encountered 517 00:26:45,160 --> 00:26:48,480 Speaker 1: somebody who has the gall and the cheek to send 518 00:26:48,520 --> 00:26:51,840 Speaker 1: an AI note-taker to a meeting in their place, to 519 00:26:52,240 --> 00:26:54,320 Speaker 1: excuse them from going to the meeting. That would really 520 00:26:54,359 --> 00:26:55,000 Speaker 1: drive me crazy. 521 00:26:55,080 --> 00:26:57,240 Speaker 3: Well, get ready for my new digital twin. 522 00:26:57,680 --> 00:27:00,760 Speaker 2: Karah, right? Karah Preiss: always on time. 523 00:27:01,240 --> 00:27:03,480 Speaker 3: Oh my god, she's prompt. That would be a good... 524 00:27:03,640 --> 00:27:05,960 Speaker 3: that, for me, would be important. But yeah, in addition 525 00:27:06,200 --> 00:27:09,760 Speaker 3: to the big players — you know, Google and Microsoft and Zoom — 526 00:27:10,000 --> 00:27:15,280 Speaker 3: there are actually smaller companies, like Otter.ai and Granola — love 527 00:27:15,320 --> 00:27:20,600 Speaker 3: these names — that transcribe calls across platforms. So even 528 00:27:20,640 --> 00:27:23,560 Speaker 3: if you decide to actually log onto the meeting, you 529 00:27:23,560 --> 00:27:26,199 Speaker 3: can ask your bot to take notes for you, and 530 00:27:27,040 --> 00:27:30,120 Speaker 3: it is a great tool for multitasking or zoning out. 531 00:27:30,520 --> 00:27:33,520 Speaker 3: Of course, the downside is that technology is recording more 532 00:27:33,560 --> 00:27:36,399 Speaker 3: and more of our daily lives. And, echoing what you said, 533 00:27:37,160 --> 00:27:40,120 Speaker 3: one AI exec in the Washington Post piece said we're 534 00:27:40,160 --> 00:27:43,160 Speaker 3: moving into a world where nothing will be forgotten. 535 00:27:43,640 --> 00:27:46,520 Speaker 2: It is Wimbledon this week. It is. And such 536 00:27:46,520 --> 00:27:47,560 Speaker 2: a beautiful match. 537 00:27:47,720 --> 00:27:51,639 Speaker 1: All of this grousing aside, having an AI avatar in 538 00:27:51,680 --> 00:27:53,560 Speaker 1: my Zoom calls so that I could watch tennis all 539 00:27:53,680 --> 00:27:55,560 Speaker 1: day would be my dream come true. 540 00:27:55,720 --> 00:27:58,640 Speaker 3: The thing about you is, you would never trust an AI.
541 00:28:00,920 --> 00:28:03,200 Speaker 3: You would micromanage the AI. Yeah, I feel like it would say, I quit. 542 00:28:05,040 --> 00:28:07,520 Speaker 1: There's a bit of a discrepancy at minute fourteen. 543 00:28:09,119 --> 00:28:10,840 Speaker 3: I was watching. That's not what happened. But I know 544 00:28:10,880 --> 00:28:12,560 Speaker 3: you love your tennis. So do you? I do. 545 00:28:12,600 --> 00:28:15,000 Speaker 1: I love tennis. And you won't be surprised to know 546 00:28:15,119 --> 00:28:17,720 Speaker 1: that this was not a casual reference to my favorite sport. 547 00:28:18,119 --> 00:28:19,800 Speaker 1: There is a tech angle here. Do you know what 548 00:28:19,800 --> 00:28:20,120 Speaker 1: it is? 549 00:28:20,359 --> 00:28:24,239 Speaker 3: Yeah. Wimbledon fully replacing line judges with AI cameras, right? 550 00:28:24,960 --> 00:28:29,000 Speaker 3: And the reason I hate this is because it just 551 00:28:29,800 --> 00:28:32,720 Speaker 3: isn't fun to watch players verbally abuse AI. 552 00:28:33,160 --> 00:28:36,000 Speaker 1: It is quite fun to watch players shrugging their shoulders 553 00:28:36,040 --> 00:28:38,000 Speaker 1: and running around looking for someone to yell at, 554 00:28:38,000 --> 00:28:38,160 Speaker 5: though. 555 00:28:38,240 --> 00:28:40,600 Speaker 1: It's tennis, but there are no line judges 556 00:28:40,640 --> 00:28:43,400 Speaker 1: anymore, for the first time in the one hundred and 557 00:28:43,440 --> 00:28:46,840 Speaker 1: forty-eight-year history of Wimbledon. Instead of a number 558 00:28:46,920 --> 00:28:50,720 Speaker 1: of humans carefully bending over to watch each line during 559 00:28:50,760 --> 00:28:53,680 Speaker 1: every point, to see if a ball falls inside or 560 00:28:53,680 --> 00:28:56,920 Speaker 1: outside the line, it's all being done by AI — or, 561 00:28:57,000 --> 00:29:01,480 Speaker 1: another acronym, ELC: electronic line calling. Humans, of course, do 562 00:29:01,640 --> 00:29:04,880 Speaker 1: make mistakes more frequently than robots, but many sports fans 563 00:29:04,880 --> 00:29:07,480 Speaker 1: will agree that watching a match isn't just about accuracy, 564 00:29:07,880 --> 00:29:09,040 Speaker 1: it's about drama. 565 00:29:09,600 --> 00:29:12,440 Speaker 3: So just explain to me: does it mean that there 566 00:29:12,440 --> 00:29:15,440 Speaker 3: are no disputed calls, or reviews, or replays? 567 00:29:15,680 --> 00:29:20,520 Speaker 1: Yeah. So obviously phase one was line judges. Phase two 568 00:29:20,640 --> 00:29:24,400 Speaker 1: was line judges plus cameras, so that a player could 569 00:29:24,400 --> 00:29:26,400 Speaker 1: dispute the line call and you'd get a Hawk-Eye review, 570 00:29:26,440 --> 00:29:29,360 Speaker 1: and then you'd get the computer to basically correct or 571 00:29:29,840 --> 00:29:30,440 Speaker 1: endorse the call. 572 00:29:30,720 --> 00:29:33,280 Speaker 3: You have it here in America for the US Open. 573 00:29:33,360 --> 00:29:35,880 Speaker 1: Yeah, well, exactly. But we're now in phase three, which 574 00:29:35,920 --> 00:29:38,760 Speaker 1: is just the robots. I mean, this is the classic story, 575 00:29:38,840 --> 00:29:43,600 Speaker 1: isn't it? It's like: humans, then humans augmented by technology, then just technology. 576 00:29:43,880 --> 00:29:46,120 Speaker 1: Around two hundred line judges are out of a job, 577 00:29:46,280 --> 00:29:49,360 Speaker 1: but eighty have been retained in case there are issues 578 00:29:49,360 --> 00:29:52,680 Speaker 1: with the electronic system.
Always nice to be backup 579 00:29:53,040 --> 00:29:56,200 Speaker 1: to a robot. One of the judges didn't pull her punches. 580 00:29:56,520 --> 00:29:59,080 Speaker 1: She said that line judges are now there, quote, for 581 00:29:59,120 --> 00:30:01,880 Speaker 1: no other reason than to escort players on and off 582 00:30:01,880 --> 00:30:04,960 Speaker 1: the court. They were always dressed like butlers, and that's 583 00:30:05,000 --> 00:30:06,080 Speaker 1: basically what they are now. 584 00:30:06,200 --> 00:30:07,760 Speaker 3: Oh, shots fired. 585 00:30:08,600 --> 00:30:12,920 Speaker 1: Is it working? Not perfectly. An umpire had to stop 586 00:30:12,920 --> 00:30:15,720 Speaker 1: a match recently when the system failed to spot a 587 00:30:15,760 --> 00:30:18,880 Speaker 1: ball that had clearly landed out of bounds on, no 588 00:30:19,000 --> 00:30:21,360 Speaker 1: less, set point, and then the players had to 589 00:30:21,360 --> 00:30:25,600 Speaker 1: replay the point. A Wimbledon spokesman blamed the mistake on 590 00:30:25,720 --> 00:30:29,960 Speaker 1: operator error — human error — but the match was between a 591 00:30:30,000 --> 00:30:33,640 Speaker 1: Russian tennis player and a Brit, and the Russian blamed 592 00:30:34,040 --> 00:30:37,960 Speaker 1: hometown bias for the replay of the clutch point, rather 593 00:30:38,000 --> 00:30:40,840 Speaker 1: than it simply being awarded to her, given the ball was 594 00:30:40,880 --> 00:30:44,520 Speaker 1: clearly out. Here's what Pavlyuchenkova had to say. 595 00:30:45,080 --> 00:30:47,880 Speaker 4: I expected a different decision. I just thought also the chair 596 00:30:47,960 --> 00:30:51,920 Speaker 4: umpire could take an initiative — that's why he's there, 597 00:30:52,800 --> 00:30:55,320 Speaker 4: sitting on the chair — and he also saw it out. 598 00:30:55,520 --> 00:30:57,760 Speaker 4: He told me after the match. I think we're losing 599 00:30:57,760 --> 00:31:00,560 Speaker 4: a little bit of this charm of actually having human 600 00:31:00,600 --> 00:31:04,560 Speaker 4: beings, ball boys, and you know, it just becomes a 601 00:31:04,600 --> 00:31:08,240 Speaker 4: little bit weird and, like, robot sort of 602 00:31:08,200 --> 00:31:12,160 Speaker 1: orientated. It just becomes a little bit weird and robo-orientated. 603 00:31:12,440 --> 00:31:13,680 Speaker 1: That's the story of our era. 604 00:31:14,200 --> 00:31:15,480 Speaker 3: That's right, that's right. 605 00:31:15,480 --> 00:31:18,520 Speaker 1: But it is interesting. She was like, maybe the umpire 606 00:31:19,080 --> 00:31:21,680 Speaker 1: was scared to trust his own eyes and overrule the 607 00:31:21,720 --> 00:31:24,080 Speaker 1: computer, even when the ball was clearly out. That is automation bias one 608 00:31:24,080 --> 00:31:24,320 Speaker 1: oh one. 609 00:31:24,360 --> 00:31:26,040 Speaker 3: I was going to say automation bias. I mean, how 610 00:31:26,080 --> 00:31:28,680 Speaker 3: many times do I get in fights with my mother about 611 00:31:28,680 --> 00:31:32,080 Speaker 3: directions, because she knows the right way and I was 612 00:31:32,160 --> 00:31:37,280 Speaker 3: raised on Google Maps. So, you probably don't know this 613 00:31:37,400 --> 00:31:42,640 Speaker 3: because you're English, but this is the fiftieth-anniversary summer 614 00:31:43,040 --> 00:31:45,960 Speaker 3: of the release of Jaws, which I actually watched again 615 00:31:46,000 --> 00:31:49,000 Speaker 3: last weekend, and it haunts my nightmares.
616 00:31:49,080 --> 00:31:50,880 Speaker 1: My mother told me I wasn't allowed to watch it. 617 00:31:50,960 --> 00:31:51,480 Speaker 3: She's a good mom. 618 00:31:51,480 --> 00:31:52,400 Speaker 1: I would never swim again. 619 00:31:52,480 --> 00:31:53,120 Speaker 3: She's a good mom. 620 00:31:53,160 --> 00:31:54,200 Speaker 1: She wanted me to swim. 621 00:31:54,560 --> 00:31:59,880 Speaker 3: She wanted you to. Related to this, and I really hate 622 00:31:59,920 --> 00:32:02,400 Speaker 3: that there's any real news story that's related to Jaws, 623 00:32:02,400 --> 00:32:05,040 Speaker 3: but there was a story this week about city and 624 00:32:05,080 --> 00:32:08,920 Speaker 3: state officials in New York using drones to locate and 625 00:32:09,000 --> 00:32:11,640 Speaker 3: track sharks on beaches in Queens and on Long Island. 626 00:32:12,280 --> 00:32:15,920 Speaker 3: I go to Long Island, I swim on Long Island, 627 00:32:16,640 --> 00:32:20,960 Speaker 3: and around the July fourth weekend, this past weekend, there 628 00:32:21,000 --> 00:32:23,760 Speaker 3: were eight sightings of sharks in less than a week. 629 00:32:23,840 --> 00:32:26,760 Speaker 1: Despite not having watched Jaws, I'm still very scared of sharks. 630 00:32:27,000 --> 00:32:27,960 Speaker 1: This is not good to hear. 631 00:32:28,680 --> 00:32:31,920 Speaker 3: So every time a shark was spotted, swimmers were ordered 632 00:32:31,920 --> 00:32:34,560 Speaker 3: to evacuate the area for an hour. And this happened 633 00:32:34,760 --> 00:32:36,719 Speaker 3: a lot over the weekend. As you can imagine, it 634 00:32:36,720 --> 00:32:38,320 Speaker 3: really frustrated beachgoers. 635 00:32:38,640 --> 00:32:41,240 Speaker 1: Yeah. I was out of town, as I've mentioned multiple times, 636 00:32:41,480 --> 00:32:43,560 Speaker 1: but I got sent some Reddit comments about this story, 637 00:32:43,600 --> 00:32:46,760 Speaker 1: and my favorite was, quote, in true New York City fashion, 638 00:32:46,920 --> 00:32:49,920 Speaker 1: it's obviously a loan shark. If you don't pay up, 639 00:32:49,920 --> 00:32:50,959 Speaker 1: we'll nibble your toes. 640 00:32:51,080 --> 00:32:54,440 Speaker 3: I really, I mean, I do love the way that 641 00:32:54,800 --> 00:33:00,840 Speaker 3: Reddit makes lemonade out of lemons. There's actually speculation about 642 00:33:01,280 --> 00:33:03,840 Speaker 3: whether there are actually more sharks in the area, or 643 00:33:03,880 --> 00:33:06,400 Speaker 3: if there were always this many and we just didn't spot 644 00:33:06,440 --> 00:33:07,760 Speaker 3: them as easily without drones. 645 00:33:07,840 --> 00:33:09,800 Speaker 1: That was exactly going to be my question, because, like, 646 00:33:09,800 --> 00:33:11,760 Speaker 1: there was a story recently in The New Yorker about 647 00:33:11,800 --> 00:33:15,720 Speaker 1: cancer diagnostics, and it's like, is there a possibility of having 648 00:33:15,760 --> 00:33:18,440 Speaker 1: too much information? Yes. Like maybe the sharks were always there; 649 00:33:19,120 --> 00:33:19,920 Speaker 1: it's just the drones are new. 650 00:33:20,560 --> 00:33:23,400 Speaker 3: It's why they say don't get a full body scan, 651 00:33:23,480 --> 00:33:25,320 Speaker 3: because sometimes there's stuff you don't want to know. You 652 00:33:25,360 --> 00:33:26,800 Speaker 3: don't want to know what sharks are in the water. 653 00:33:26,880 --> 00:33:27,920 Speaker 1: Sometimes. Sometimes you don't.
654 00:33:28,240 --> 00:33:30,480 Speaker 3: But I do think the state is responding to reports 655 00:33:30,560 --> 00:33:33,320 Speaker 3: that a twenty year old woman was probably bitten by 656 00:33:33,320 --> 00:33:34,480 Speaker 3: a shark a few weeks back. 657 00:33:34,560 --> 00:33:36,880 Speaker 1: What on earth do you mean, probably bitten by a shark? 658 00:33:37,040 --> 00:33:40,680 Speaker 3: So this woman was bitten by something in the water, 659 00:33:41,080 --> 00:33:43,880 Speaker 3: and when biologists gave the bite marks a once over, 660 00:33:44,240 --> 00:33:48,040 Speaker 3: they concluded that the bite, quote, most likely involved a 661 00:33:48,160 --> 00:33:52,240 Speaker 3: juvenile sand tiger shark. How's she doing? She's okay, she's alive. 662 00:33:52,480 --> 00:33:54,920 Speaker 3: She walked away with minor cuts to her left foot 663 00:33:55,440 --> 00:33:58,480 Speaker 3: and leg. But why aren't we more upset about this? 664 00:33:58,920 --> 00:34:01,800 Speaker 3: About what? Well, probably being bitten by a shark. 665 00:34:01,920 --> 00:34:04,360 Speaker 1: Yeah, I was probably bitten by a shark would actually 666 00:34:04,360 --> 00:34:07,959 Speaker 1: be a great slogan T-shirt. That's the last headline today. 667 00:34:08,200 --> 00:34:11,520 Speaker 1: But I believe, Kara, you have a pitch for our listeners. 668 00:34:11,880 --> 00:34:14,680 Speaker 3: So since January, when we took over the Tech Stuff feed, 669 00:34:14,760 --> 00:34:17,719 Speaker 3: I come to pitch meetings and I'm always excited to 670 00:34:17,760 --> 00:34:20,360 Speaker 3: talk to you guys about just these out of pocket 671 00:34:20,400 --> 00:34:23,360 Speaker 3: ways that my friends are using ChatGPT, and not just 672 00:34:23,440 --> 00:34:28,080 Speaker 3: ChatGPT, you know, Claude, Gemini. I like seeing the ways 673 00:34:28,120 --> 00:34:32,160 Speaker 3: in which chatbots are, like, infiltrating daily life, right? 674 00:34:32,080 --> 00:34:33,880 Speaker 1: And you shared a very interesting one this week. 675 00:34:34,360 --> 00:34:38,000 Speaker 3: Yeah, you know, it wasn't even interesting as much as 676 00:34:38,000 --> 00:34:41,200 Speaker 3: it was banal, but also really functional, which is that 677 00:34:42,000 --> 00:34:49,280 Speaker 3: my sister and her girlfriend were supposed to be staying 678 00:34:49,400 --> 00:34:53,040 Speaker 3: at this home for the week on vacation, a sort of 679 00:34:53,080 --> 00:34:57,040 Speaker 3: Airbnb rental property, and it was, I will say, not 680 00:34:57,120 --> 00:35:00,239 Speaker 3: up to snuff. And my sister and her girlfriend are 681 00:35:00,239 --> 00:35:03,120 Speaker 3: both very good communicators, but it was late and they 682 00:35:03,120 --> 00:35:08,520 Speaker 3: were tired, and they wanted to vacate the premises. 683 00:35:08,960 --> 00:35:10,960 Speaker 2: And get their money back and get their money back. 684 00:35:10,800 --> 00:35:14,200 Speaker 3: And so they relied on ChatGPT to help them communicate. 685 00:35:14,360 --> 00:35:15,719 Speaker 3: And this was the important part. And I think this 686 00:35:15,760 --> 00:35:18,719 Speaker 3: is why people use Chat. They wanted to communicate in 687 00:35:18,760 --> 00:35:22,640 Speaker 3: a certain way, being kind and calm but also firm. 688 00:35:23,160 --> 00:35:25,840 Speaker 3: And I think what Chat helped them do was form 689 00:35:26,320 --> 00:35:32,000 Speaker 3: a message that was not accusatory or disrespectful about the space.
690 00:35:32,520 --> 00:35:35,440 Speaker 3: And I think it's just so interesting that, like, we 691 00:35:35,480 --> 00:35:39,440 Speaker 3: would rely on a large language model to make ourselves 692 00:35:39,440 --> 00:35:41,040 Speaker 3: seem more human or humane. 693 00:35:41,120 --> 00:35:43,560 Speaker 1: That's very well put, you know. And it's been fun 694 00:35:43,560 --> 00:35:45,759 Speaker 1: hearing these stories from you every week in our 695 00:35:45,760 --> 00:35:48,000 Speaker 1: pitch meetings, and so we thought it'd be fun to 696 00:35:48,040 --> 00:35:51,040 Speaker 1: bring more of them into the show, but not just 697 00:35:51,080 --> 00:35:52,920 Speaker 1: from you, also from our listeners. 698 00:35:53,320 --> 00:35:56,280 Speaker 3: Yeah, and you know, hear from our listeners about how 699 00:35:56,360 --> 00:35:57,720 Speaker 3: you're using AI. 700 00:35:58,239 --> 00:36:01,640 Speaker 1: So if and when you find yourself turning to ChatGPT 701 00:36:01,680 --> 00:36:04,560 Speaker 1: or Claude or Gemini or whichever one 702 00:36:04,600 --> 00:36:07,880 Speaker 1: you use for help with a particular task or to 703 00:36:07,960 --> 00:36:11,960 Speaker 1: navigate your real life, we want to hear about it. Ideally, 704 00:36:12,520 --> 00:36:15,120 Speaker 1: send us a one to two minute voice note to 705 00:36:15,360 --> 00:36:18,360 Speaker 1: tech stuff podcast at gmail dot com, or if you 706 00:36:18,360 --> 00:36:19,880 Speaker 1: want to write it down instead and just send it 707 00:36:19,920 --> 00:36:21,279 Speaker 1: as a normal email, that's fine too. 708 00:36:22,280 --> 00:36:24,720 Speaker 3: Yeah, tell us about how you're using it, what works, 709 00:36:24,760 --> 00:36:28,279 Speaker 3: what doesn't, why you like using AI for this task specifically. 710 00:36:28,760 --> 00:36:33,560 Speaker 3: We want to hear it all. And if you are 711 00:36:33,680 --> 00:36:36,879 Speaker 3: using a chatbot to write your wedding vows, we will 712 00:36:36,920 --> 00:36:39,680 Speaker 3: report this story until it is blue in the face. 713 00:36:40,360 --> 00:36:42,160 Speaker 3: I know what's happening and I want 714 00:36:42,040 --> 00:36:44,880 Speaker 1: to hear about it. So please send in your stories 715 00:36:44,960 --> 00:36:48,120 Speaker 1: to tech stuff podcast at gmail dot com. You may 716 00:36:48,120 --> 00:36:50,160 Speaker 1: get to hear your voice on the show, and if 717 00:36:50,160 --> 00:36:52,960 Speaker 1: you do, we'll send you a T-shirt with the 718 00:36:53,000 --> 00:36:56,160 Speaker 1: phrase I was probably bitten by a shark, or maybe 719 00:36:56,160 --> 00:36:58,440 Speaker 1: something more appropriate to tech stuff, but we want to 720 00:36:58,440 --> 00:37:00,400 Speaker 1: hear from you, so please do write in. We'll be 721 00:37:00,440 --> 00:37:03,160 Speaker 1: doing this every week. It's called Chat and Me. 722 00:37:16,520 --> 00:37:18,120 Speaker 3: That's it for this week for Tech Stuff. 723 00:37:18,160 --> 00:37:20,360 Speaker 2: I'm Kara Price and I'm Oz Woloshyn. 724 00:37:20,880 --> 00:37:24,840 Speaker 1: This episode was produced by Eliza Dennis and Adriana Tapia. 725 00:37:25,560 --> 00:37:28,120 Speaker 1: It was executive produced by me, Kara Price, and Kate 726 00:37:28,160 --> 00:37:33,680 Speaker 1: Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer 727 00:37:33,719 --> 00:37:36,959 Speaker 1: is Bihid Fraser, and Jack Insley mixed this episode.
Kyle 728 00:37:37,040 --> 00:37:38,239 Speaker 1: Murdoch wrote our theme song. 729 00:37:38,880 --> 00:37:41,920 Speaker 3: Join us next Wednesday for Tech Stuff: The Story, when we 730 00:37:41,960 --> 00:37:44,799 Speaker 3: will share an in depth conversation with David Webster, the 731 00:37:44,840 --> 00:37:47,440 Speaker 3: head of User Experience at Google Labs, about what it 732 00:37:47,520 --> 00:37:49,880 Speaker 3: means to design human centered tech. 733 00:37:50,040 --> 00:37:53,200 Speaker 1: And please do rate and review the show on Apple Podcasts, 734 00:37:53,280 --> 00:37:56,360 Speaker 1: on Spotify, or wherever you listen to your podcasts, and write 735 00:37:56,360 --> 00:37:59,160 Speaker 1: in to us at tech stuff podcast at gmail dot com 736 00:37:59,200 --> 00:38:02,640 Speaker 1: with your feedback and, of course, your stories about chatbots.