1 00:00:13,680 --> 00:00:17,000 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. 2 00:00:17,120 --> 00:00:19,080 Speaker 2: I'm Oz Voloshin and I'm Kara Price. 3 00:00:19,320 --> 00:00:23,040 Speaker 1: Today we'll get into the headlines this week, including Nepal's 4 00:00:23,280 --> 00:00:28,240 Speaker 1: failed social media ban turned government takeover, and a look 5 00:00:28,480 --> 00:00:32,919 Speaker 1: at the newest generation of tech founders. Then, on Chat and Me: 6 00:00:33,400 --> 00:00:37,480 Speaker 3: ChatGPT has become kind of like a partner that makes 7 00:00:37,560 --> 00:00:39,640 Speaker 3: managing health and lifestyle simple. 8 00:00:39,800 --> 00:00:42,599 Speaker 1: All of that on The Week in Tech. It's Friday, 9 00:00:42,880 --> 00:00:43,920 Speaker 1: September twenty-sixth. 10 00:00:46,880 --> 00:00:49,000 Speaker 2: Hello, Kara. Hi, Oz. 11 00:00:49,080 --> 00:00:50,840 Speaker 1: Nice to see you. It's been a couple of weeks 12 00:00:50,880 --> 00:00:55,920 Speaker 1: where we've been remote. You've abandoned me. It's good to 13 00:00:55,960 --> 00:00:56,960 Speaker 1: be back with you in person. 14 00:00:57,200 --> 00:00:58,600 Speaker 2: Dirty, gritty New York City. 15 00:00:59,000 --> 00:01:01,320 Speaker 4: Speaking of dirty, gritty New York City, how often 16 00:01:01,400 --> 00:01:02,200 Speaker 4: do you go to Bryant Park? 17 00:01:02,480 --> 00:01:04,240 Speaker 1: I don't tend to go inside Bryant Park, but I take 18 00:01:04,240 --> 00:01:06,920 Speaker 1: the subway to Bryant Park once a week on Thursdays 19 00:01:07,000 --> 00:01:09,640 Speaker 1: to go to therapy. But this week you're trying not to 20 00:01:09,680 --> 00:01:14,959 Speaker 1: go anywhere, because this UN week, UNGA, has completely snarled 21 00:01:15,400 --> 00:01:17,000 Speaker 1: traffic, and the... 22 00:01:16,880 --> 00:01:19,720 Speaker 4: Dirtiest word in the New Yorker dictionary: UNGA. 23 00:01:20,360 --> 00:01:22,760 Speaker 1: Why do you ask, though? 24 00:01:22,840 --> 00:01:25,560 Speaker 4: Because Bryant Park is not where the center of fashion is 25 00:01:25,560 --> 00:01:29,479 Speaker 4: in Manhattan. It is where I found my most beloved 26 00:01:29,640 --> 00:01:31,759 Speaker 4: new toy. Her name is Mowyoncé. 27 00:01:32,160 --> 00:01:32,720 Speaker 1: Mowyoncé. 28 00:01:33,120 --> 00:01:35,080 Speaker 4: She's actually not... she doesn't belong to me, but she 29 00:01:35,240 --> 00:01:37,880 Speaker 4: is the robot lawnmower of Bryant Park. 30 00:01:37,959 --> 00:01:39,000 Speaker 1: Well, she's got a good name. 31 00:01:39,240 --> 00:01:41,680 Speaker 4: I wish I came up with the name Mowyoncé. 32 00:01:41,800 --> 00:01:42,160 Speaker 1: I did not. 33 00:01:42,840 --> 00:01:44,560 Speaker 4: I read about it in The Washington Post and then 34 00:01:44,600 --> 00:01:46,720 Speaker 4: I actually went to the park to check it out. 35 00:01:46,760 --> 00:01:47,560 Speaker 1: You did some reporting. 36 00:01:47,640 --> 00:01:50,320 Speaker 2: She's gorgeous. In the flesh, she's gorgeous. 37 00:01:50,560 --> 00:01:54,640 Speaker 1: Is there anything significant about Mowyoncé other than her fabulous name? 38 00:01:55,200 --> 00:02:00,480 Speaker 4: Nothing is insignificant about Mowyoncé. She's a four-thousand-dollar Roomba, essentially, 39 00:02:00,640 --> 00:02:03,040 Speaker 4: and she's run via an app which tells her when 40 00:02:03,080 --> 00:02:05,600 Speaker 4: and where to mow.
So you tell her where to go, 41 00:02:05,760 --> 00:02:08,640 Speaker 4: she does her thing, and then very cutely, she rolls 42 00:02:08,639 --> 00:02:12,160 Speaker 4: herself back to her charger after a few hours. And 43 00:02:12,240 --> 00:02:14,040 Speaker 4: I think it goes without saying that all of these 44 00:02:14,080 --> 00:02:18,359 Speaker 4: electric robot lawnmowers are much, in quotes, greener than their 45 00:02:18,400 --> 00:02:20,560 Speaker 4: gas-guzzling, human-controlled counterparts. 46 00:02:20,960 --> 00:02:25,440 Speaker 1: My mother also has a semi-autonomous robot lawnmower at 47 00:02:25,440 --> 00:02:27,200 Speaker 1: her house. I'm not sure if that's because of her 48 00:02:27,600 --> 00:02:30,320 Speaker 1: concern for gas guzzling. But it is interesting how these 49 00:02:30,400 --> 00:02:33,240 Speaker 1: kinds of almost sci-fi-type robots have become so 50 00:02:33,400 --> 00:02:35,840 Speaker 1: normalized that people think you're crazy for going to look 51 00:02:35,880 --> 00:02:38,520 Speaker 1: at one in action. I am curious, though. Obviously, in 52 00:02:38,520 --> 00:02:42,560 Speaker 1: a privately owned garden, you know, a Mowyoncé going about 53 00:02:42,560 --> 00:02:45,400 Speaker 1: her business is relatively safe. But I can imagine, in the 54 00:02:45,440 --> 00:02:48,960 Speaker 1: cut and thrust of New York City, you know, vandalism, theft. 55 00:02:49,320 --> 00:02:52,560 Speaker 4: Ah, the dirt, the grimy crime of New York City. Actually, 56 00:02:52,840 --> 00:02:54,360 Speaker 4: you'd think that Mowyoncé wouldn't be safe. 57 00:02:54,440 --> 00:02:55,440 Speaker 1: Yeah, I was worried about her. 58 00:02:55,520 --> 00:02:56,600 Speaker 2: Well, she is safe. 59 00:02:56,720 --> 00:02:59,040 Speaker 4: You want to know why? There's a ton of security 60 00:02:59,080 --> 00:03:03,640 Speaker 4: in Bryant Park. She's also not the first; she has peers. 61 00:03:03,760 --> 00:03:08,239 Speaker 4: There were several others tested, including LeLawn James. 62 00:03:09,720 --> 00:03:12,280 Speaker 1: Who comes up with this? Mum, if you're listening... Kara, 63 00:03:12,320 --> 00:03:14,400 Speaker 1: do you have any suggestions for what the garden lawnmower 64 00:03:14,480 --> 00:03:15,519 Speaker 1: could be rechristened? 65 00:03:15,639 --> 00:03:18,800 Speaker 4: Okay, so I actually have five, because one of our 66 00:03:18,840 --> 00:03:22,040 Speaker 4: genius producers asked an AI chatbot for a few more 67 00:03:22,120 --> 00:03:23,079 Speaker 4: lawn pun names. 68 00:03:23,280 --> 00:03:26,919 Speaker 2: Here we go. My favorite: Trimothée Chalamet. 69 00:03:29,160 --> 00:03:36,960 Speaker 4: Mowzart. Yeah, fine, eh. Post Mowlone. Yeah, yeah. Grass Lightyear. 70 00:03:37,200 --> 00:03:40,840 Speaker 4: And fine, another great one: Elawn Musk. 71 00:03:41,120 --> 00:03:44,040 Speaker 1: That's my favorite. Okay, we got it. 72 00:03:44,840 --> 00:03:47,840 Speaker 4: So that is my lighthearted contribution for today. If you 73 00:03:47,880 --> 00:03:51,080 Speaker 4: are in New York City, go check out Mowyoncé. She'll 74 00:03:51,120 --> 00:03:55,120 Speaker 4: be waiting for you. I have to switch gears a 75 00:03:55,160 --> 00:03:58,360 Speaker 4: little bit for my next story. How familiar are you 76 00:03:58,400 --> 00:03:59,040 Speaker 4: with Discord? 77 00:03:59,560 --> 00:04:03,200 Speaker 1: I've heard about Discord for many years. I've never actually 78 00:04:03,320 --> 00:04:05,640 Speaker 1: used it. I'm not a gamer.
I know 79 00:04:05,720 --> 00:04:09,120 Speaker 1: it's kind of like a live platform for simultaneous discussion, 80 00:04:09,160 --> 00:04:13,920 Speaker 1: and unfortunately, my life is so sad that the closest 81 00:04:13,960 --> 00:04:17,920 Speaker 1: analogy I can think of is the work productivity tool Slack. 82 00:04:18,400 --> 00:04:19,359 Speaker 2: Slack is a game for you. 83 00:04:19,480 --> 00:04:23,040 Speaker 4: Unfortunately. You're like, oh, you mean Slack, the Sega Genesis? Well, 84 00:04:23,080 --> 00:04:24,680 Speaker 4: the reason I ask, and I think you'll be 85 00:04:24,720 --> 00:04:28,720 Speaker 4: interested in this: Discord recently played a very big role 86 00:04:29,200 --> 00:04:30,440 Speaker 4: in a revolution. 87 00:04:31,200 --> 00:04:33,279 Speaker 1: I actually heard about this a little bit 88 00:04:33,320 --> 00:04:35,680 Speaker 1: in some headlines. This is in Nepal. That's right. 89 00:04:36,160 --> 00:04:38,559 Speaker 4: So in the course of a week, the country banned 90 00:04:38,600 --> 00:04:42,040 Speaker 4: social media, erupted in gen Z-led protests, ousted their 91 00:04:42,040 --> 00:04:45,880 Speaker 4: prime minister, solicited recommendations for an interim replacement on ChatGPT, 92 00:04:46,279 --> 00:04:49,599 Speaker 4: and then elected an interim leader on Discord. 93 00:04:49,279 --> 00:04:50,320 Speaker 1: Talk about tech stuff. 94 00:04:52,200 --> 00:04:53,680 Speaker 2: It really is a Tech Stuff story. 95 00:04:53,839 --> 00:04:55,960 Speaker 1: Yeah, I don't know that much about Nepal. I know 96 00:04:56,000 --> 00:04:59,640 Speaker 1: it's in the Himalayas. I know it's where Mount Everest is. Yes, 97 00:05:00,120 --> 00:05:05,120 Speaker 1: it borders China and India. But why did this happen there? Like, 98 00:05:05,160 --> 00:05:06,359 Speaker 1: what's the background? 99 00:05:06,560 --> 00:05:11,000 Speaker 4: So actually Nepal was traditionally a monarchy, right, but a 100 00:05:11,040 --> 00:05:14,719 Speaker 4: democratic government was founded less than twenty years ago. Nepal 101 00:05:15,240 --> 00:05:18,520 Speaker 4: is the second-poorest country in South Asia, the first 102 00:05:18,520 --> 00:05:21,440 Speaker 4: being Afghanistan. And a few weeks ago there was a 103 00:05:21,480 --> 00:05:26,360 Speaker 4: massive TikTok trend among youth in Nepal called hashtag Nepo Kids. 104 00:05:27,720 --> 00:05:31,080 Speaker 4: So social media users were posting videos showing the wealthy 105 00:05:31,120 --> 00:05:35,600 Speaker 4: lifestyles of the children of Nepal's leaders. So think images 106 00:05:35,640 --> 00:05:38,640 Speaker 4: of politicians' kids posing in front of Christmas trees made 107 00:05:38,640 --> 00:05:41,800 Speaker 4: out of Louis Vuitton and Cartier boxes. These images were 108 00:05:41,839 --> 00:05:45,240 Speaker 4: being juxtaposed with images of the day-to-day struggles 109 00:05:45,240 --> 00:05:48,000 Speaker 4: of regular people in Nepal. So there are images of 110 00:05:48,040 --> 00:05:51,880 Speaker 4: malnourished children, floods, decimated housing, and overall poverty. 111 00:05:52,360 --> 00:05:55,440 Speaker 1: Yeah, that's quite a stark juxtaposition. And I can't imagine 112 00:05:55,440 --> 00:05:59,600 Speaker 1: government ministers felt too good about their children being juxtaposed 113 00:05:59,640 --> 00:06:03,560 Speaker 1: with children suffering malnourishment and poverty. No.
114 00:06:03,480 --> 00:06:06,200 Speaker 4: And it actually led to the government banning twenty-six social 115 00:06:06,279 --> 00:06:10,680 Speaker 4: media platforms, including TikTok, YouTube, and Facebook, purportedly in the 116 00:06:10,760 --> 00:06:13,679 Speaker 4: name of battling fake news, hate speech, and online fraud, 117 00:06:13,760 --> 00:06:16,000 Speaker 4: which obviously people are not buying. 118 00:06:16,240 --> 00:06:19,560 Speaker 1: Yeah, I wouldn't buy that either. When did the ban 119 00:06:19,560 --> 00:06:20,240 Speaker 1: come into effect? 120 00:06:20,480 --> 00:06:21,200 Speaker 2: On my birthday. 121 00:06:21,240 --> 00:06:22,680 Speaker 1: September fourth. Happy birthday! 122 00:06:22,960 --> 00:06:26,520 Speaker 4: The following Monday, thousands of young people took to the streets. 123 00:06:26,560 --> 00:06:29,920 Speaker 4: So some government buildings were burned down, including the Prime 124 00:06:29,920 --> 00:06:33,400 Speaker 4: Minister's office, and according to Reuters, there was a violent 125 00:06:33,440 --> 00:06:36,680 Speaker 4: crackdown, and about seventy-two people were actually killed in 126 00:06:36,720 --> 00:06:38,280 Speaker 4: the protests and subsequent fires. 127 00:06:38,360 --> 00:06:42,040 Speaker 1: Yeah, this sounds, I mean, eerily familiar in terms 128 00:06:42,080 --> 00:06:44,719 Speaker 1: of the Arab Spring of fifteen years ago. What happened 129 00:06:44,760 --> 00:06:45,679 Speaker 1: next in Nepal? 130 00:06:45,920 --> 00:06:49,760 Speaker 4: So the social media ban was reversed almost immediately after 131 00:06:49,839 --> 00:06:53,560 Speaker 4: protests began, and on Tuesday, September ninth, the Prime Minister 132 00:06:53,680 --> 00:06:57,960 Speaker 4: resigned and fled the country. Then the military pretty much 133 00:06:58,160 --> 00:07:00,839 Speaker 4: took over and put a curfew in place. And so, 134 00:07:01,160 --> 00:07:02,680 Speaker 4: you know, what do people do when they're forced to 135 00:07:02,680 --> 00:07:04,840 Speaker 4: stay inside? They go online. That's right, they 136 00:07:04,880 --> 00:07:08,400 Speaker 4: go online. So over one hundred thousand people, including a 137 00:07:08,480 --> 00:07:12,160 Speaker 4: number of organizing groups, spent several days on Discord working 138 00:07:12,200 --> 00:07:14,840 Speaker 4: to come up with an interim leader who the military 139 00:07:14,880 --> 00:07:18,920 Speaker 4: would actually cooperate with and accept. There were discussions taking 140 00:07:18,960 --> 00:07:22,360 Speaker 4: place on Discord filled with, you know, everything you'd expect 141 00:07:22,360 --> 00:07:27,000 Speaker 4: on Discord, memes, insults, and there were some genuinely earnest, 142 00:07:27,240 --> 00:07:29,480 Speaker 4: you know, suggestions and polls. 143 00:07:29,600 --> 00:07:29,800 Speaker 1: Right. 144 00:07:29,960 --> 00:07:32,720 Speaker 4: Young activists argued over the future leadership of the country, 145 00:07:33,040 --> 00:07:36,320 Speaker 4: and some even turned to ChatGPT for help.
You know, 146 00:07:36,400 --> 00:07:39,720 Speaker 4: one group asked chat to identify potential candidates for the 147 00:07:39,720 --> 00:07:43,040 Speaker 4: interim leadership position, and so when they were provided a 148 00:07:43,080 --> 00:07:45,960 Speaker 4: list of qualified candidates, they went back to ChatGPT 149 00:07:46,120 --> 00:07:48,480 Speaker 4: and asked it to debate the strengths and weaknesses of 150 00:07:48,520 --> 00:07:49,120 Speaker 4: each candidate. 151 00:07:49,160 --> 00:07:51,680 Speaker 1: Well, and chat gave good advice, it seemed? 152 00:07:51,840 --> 00:07:56,120 Speaker 4: You know, ChatGPT actually provided many qualified candidates, including one woman 153 00:07:56,280 --> 00:07:58,960 Speaker 4: who had been selected by a dominant gen Z collective 154 00:07:59,000 --> 00:08:02,520 Speaker 4: to lead negotiations with the military. This was Sushila Karki. 155 00:08:02,680 --> 00:08:04,920 Speaker 4: She's seventy-three years old. She's not, you know, a 156 00:08:04,920 --> 00:08:08,640 Speaker 4: young Discord user herself. She's a former Chief Justice and 157 00:08:08,680 --> 00:08:11,720 Speaker 4: well-known anti-corruption crusader, and according to the Sri 158 00:08:11,840 --> 00:08:16,200 Speaker 4: Lankan Guardian, ChatGPT advised that, quote, she seems likely to 159 00:08:16,280 --> 00:08:20,160 Speaker 4: command trust across different groups and could help oversee reforms 160 00:08:20,200 --> 00:08:21,760 Speaker 4: and the path to fair elections. 161 00:08:21,840 --> 00:08:23,679 Speaker 1: And so far, so good. 162 00:08:24,160 --> 00:08:27,440 Speaker 4: You know, after some chaos, the Discord community decided to 163 00:08:27,440 --> 00:08:30,400 Speaker 4: have her be the new interim leader of Nepal. And 164 00:08:30,440 --> 00:08:33,160 Speaker 4: she's actually putting together a cabinet right now and will 165 00:08:33,160 --> 00:08:35,800 Speaker 4: hold elections for a new prime minister in about six months. 166 00:08:35,920 --> 00:08:39,479 Speaker 1: This is an amazing story. This is uncanny online-offline 167 00:08:39,800 --> 00:08:44,199 Speaker 1: metaverse: Discord, ChatGPT, real-life protests, new leaders. I mean, 168 00:08:44,920 --> 00:08:47,200 Speaker 1: what's so interesting to me about this story is just 169 00:08:47,240 --> 00:08:50,680 Speaker 1: how borderless it is. For example, the mascot of this 170 00:08:50,800 --> 00:08:55,120 Speaker 1: revolution is the pirate flag from the Japanese anime series 171 00:08:55,440 --> 00:08:57,400 Speaker 1: One Piece. Have you watched One Piece? 172 00:08:57,920 --> 00:08:59,440 Speaker 2: I know about it, I'm aware of it. 173 00:08:59,720 --> 00:09:03,400 Speaker 1: So CNN describes it as, quote, the swashbuckling story, and 174 00:09:03,400 --> 00:09:06,560 Speaker 1: I always love a swashbuckling story... the swashbuckling story 175 00:09:06,600 --> 00:09:10,320 Speaker 1: of the charming pirate captain Monkey D. Luffy and his 176 00:09:10,440 --> 00:09:14,600 Speaker 1: misfit Straw Hat crew. CNN continues: to One Piece fans, 177 00:09:14,840 --> 00:09:18,760 Speaker 1: the flag symbolizes Luffy's quest to chase his dreams, liberate 178 00:09:18,760 --> 00:09:23,720 Speaker 1: oppressed people, and fight the autocratic world government.
That flag, 179 00:09:23,800 --> 00:09:27,319 Speaker 1: from a Japanese anime series, has now been seen flying 180 00:09:27,760 --> 00:09:31,760 Speaker 1: not just in Nepal, but also in Indonesia and the Philippines, 181 00:09:31,960 --> 00:09:34,160 Speaker 1: countries that both recently had their own gen Z 182 00:09:34,360 --> 00:09:37,600 Speaker 1: uprisings over government corruption. I was in an Uber this 183 00:09:37,720 --> 00:09:41,800 Speaker 1: morning in gridlock traffic in Midtown, so we got chatting. My 184 00:09:41,960 --> 00:09:45,520 Speaker 1: driver was from Nepal, luckily, and so I asked him, 185 00:09:45,679 --> 00:09:48,000 Speaker 1: you know, what do you think about this story? He said, look, 186 00:09:48,280 --> 00:09:50,360 Speaker 1: this isn't really a story about Nepal. This is much 187 00:09:50,360 --> 00:09:53,400 Speaker 1: bigger than Nepal. This is about a group of people, 188 00:09:53,520 --> 00:09:57,120 Speaker 1: young people, who refuse to be bound by the political conventions, 189 00:09:57,120 --> 00:10:00,800 Speaker 1: the political institutions, or the political establishment of their country, 190 00:10:01,240 --> 00:10:05,000 Speaker 1: and instead insisted on having their voice and on a 191 00:10:05,080 --> 00:10:08,920 Speaker 1: new political reality. And his point was, this is really borderless. 192 00:10:09,040 --> 00:10:11,320 Speaker 1: And he even said it could happen here. I don't 193 00:10:11,320 --> 00:10:13,479 Speaker 1: know if it really could happen here, but the optimism 194 00:10:13,480 --> 00:10:16,000 Speaker 1: which he brought and the excitement reminded me of the 195 00:10:16,080 --> 00:10:19,080 Speaker 1: early days of the Arab Spring, which obviously was followed by 196 00:10:19,080 --> 00:10:21,000 Speaker 1: what people talk about as the Arab Winter. But it'll 197 00:10:21,040 --> 00:10:22,360 Speaker 1: be interesting to see what happens next. 198 00:10:22,880 --> 00:10:25,840 Speaker 4: Yeah, you know, for years, social media companies talked about 199 00:10:26,000 --> 00:10:29,560 Speaker 4: how their platforms could democratize the Internet and bring people together. 200 00:10:30,040 --> 00:10:32,839 Speaker 4: And in the US, as we know, they've mostly proven 201 00:10:32,880 --> 00:10:35,959 Speaker 4: to be divisive and polarizing, you know, cue the recent 202 00:10:35,960 --> 00:10:39,120 Speaker 4: discussion about the rhetoric and online communities that motivated Tyler 203 00:10:39,200 --> 00:10:41,800 Speaker 4: Robinson, who is now in custody for killing Charlie Kirk. 204 00:10:42,320 --> 00:10:45,880 Speaker 4: But in Nepal they were able to have a transparent 205 00:10:45,960 --> 00:10:48,960 Speaker 4: exercise in direct democracy, of course, with the help of 206 00:10:49,000 --> 00:10:51,480 Speaker 4: Discord and, unbelievably, generative AI. 207 00:10:52,000 --> 00:10:55,160 Speaker 1: And this brings me to our next story. So while 208 00:10:55,200 --> 00:10:58,120 Speaker 1: gen Z in Nepal is interested in using AI and 209 00:10:58,200 --> 00:11:02,000 Speaker 1: Discord to topple a government and bring a more direct experience of democracy, 210 00:11:02,760 --> 00:11:06,520 Speaker 1: gen Z in America is focused on getting rich. 211 00:11:07,840 --> 00:11:09,160 Speaker 2: Who isn't? 212 00:11:09,200 --> 00:11:12,480 Speaker 1: True. Who isn't? So basically, there was a great story in 213 00:11:12,480 --> 00:11:14,400 Speaker 1: The Wall Street Journal that I read.
I mean, the 214 00:11:14,400 --> 00:11:19,199 Speaker 1: headline was AI startup founders tout a winning formula: no booze, 215 00:11:19,520 --> 00:11:24,160 Speaker 1: no sleep, no fun. And what the story is really 216 00:11:24,200 --> 00:11:26,800 Speaker 1: about is how back in the early two thousands there 217 00:11:26,840 --> 00:11:29,920 Speaker 1: was this, like, hustle culture. It was celebrated 218 00:11:30,000 --> 00:11:32,400 Speaker 1: in the movie The Social Network. You know, it was 219 00:11:32,440 --> 00:11:35,480 Speaker 1: this time of college dropouts, and the consumer internet was 220 00:11:35,520 --> 00:11:38,360 Speaker 1: being born. People were living in dormitories and, you know, 221 00:11:38,440 --> 00:11:40,920 Speaker 1: doing business with each other and kind of sacrificing 222 00:11:41,000 --> 00:11:43,800 Speaker 1: the normal pleasures of youth to build these companies that 223 00:11:43,880 --> 00:11:46,240 Speaker 1: became Facebook and Google or whatever else, and became, you know, 224 00:11:46,320 --> 00:11:49,880 Speaker 1: multi-multi-billionaires. In the intervening time, Silicon Valley kind 225 00:11:49,920 --> 00:11:54,240 Speaker 1: of settled into a normal, you know, corporate America. People 226 00:11:54,320 --> 00:11:58,000 Speaker 1: had great benefits, they had normal working hours, they made great money, 227 00:11:58,320 --> 00:12:00,760 Speaker 1: et cetera, et cetera, and that seemed to be kind 228 00:12:00,800 --> 00:12:03,440 Speaker 1: of the new norm. And you saw this, like, generation 229 00:12:03,480 --> 00:12:06,200 Speaker 1: of people who were, you know, not all about hustle culture, 230 00:12:06,280 --> 00:12:08,480 Speaker 1: but about, you know, having a balance in their private lives. 231 00:12:08,240 --> 00:12:10,000 Speaker 2: Exactly, work-life balance. A balance. 232 00:12:10,559 --> 00:12:14,880 Speaker 1: Now there are new fortunes to be made, and there 233 00:12:14,920 --> 00:12:18,120 Speaker 1: are new threats to the stable assumptions of what it 234 00:12:18,240 --> 00:12:20,640 Speaker 1: was to be, you know, a knowledge worker in your 235 00:12:20,640 --> 00:12:24,760 Speaker 1: early twenties. And so, with a vengeance, this hustle culture 236 00:12:25,120 --> 00:12:26,080 Speaker 1: seems to have returned. 237 00:12:26,520 --> 00:12:28,920 Speaker 2: So what does that look like exactly? 238 00:12:28,920 --> 00:12:31,120 Speaker 4: You know, are founders sleeping under their desks, working all 239 00:12:31,160 --> 00:12:33,120 Speaker 4: the time, having no social life? I mean, that's what 240 00:12:33,160 --> 00:12:35,240 Speaker 4: I think of when I think about sort of early Facebook. 241 00:12:35,320 --> 00:12:37,880 Speaker 1: Yeah, exactly that, but with founders who are even younger 242 00:12:37,880 --> 00:12:40,720 Speaker 1: than they were. So The Journal profiled an eighteen year 243 00:12:40,800 --> 00:12:44,480 Speaker 1: old from Kazakhstan named Arlan. He dropped out of school 244 00:12:44,559 --> 00:12:47,160 Speaker 1: in his junior year of high school after creating an 245 00:12:47,200 --> 00:12:49,960 Speaker 1: AI tool that led to him getting a summer research job 246 00:12:50,040 --> 00:12:53,559 Speaker 1: at Stanford. After that, he was accepted to Y Combinator, 247 00:12:53,600 --> 00:12:56,800 Speaker 1: which is this very famous startup incubator that helped develop 248 00:12:56,840 --> 00:12:59,959 Speaker 1: companies like DoorDash and Airbnb.
He said that being 249 00:13:00,000 --> 00:13:03,200 Speaker 1: accepted to Y Combinator had been his dream since he was ten 250 00:13:03,240 --> 00:13:04,680 Speaker 2: years old, which was eight years ago. 251 00:13:06,040 --> 00:13:13,960 Speaker 1: Exactly. Jeez, that really puts it in perspective. From there, 252 00:13:14,240 --> 00:13:17,360 Speaker 1: Arlan received a million dollars in funding and moved from 253 00:13:17,440 --> 00:13:20,640 Speaker 1: Kazakhstan to San Francisco. Again, according to the Journal, he 254 00:13:20,800 --> 00:13:24,600 Speaker 1: always carries his laptop with him and works, quote, while 255 00:13:24,640 --> 00:13:28,920 Speaker 1: out walking, during dinners, at the laundromat, and on the... 256 00:13:28,840 --> 00:13:34,000 Speaker 4: Toilet. Because we're moving fast and breaking things, because we 257 00:13:34,040 --> 00:13:34,679 Speaker 4: have ADD. 258 00:13:34,640 --> 00:13:36,800 Speaker 1: Well, yeah, I mean, obviously the phone in the toilet 259 00:13:36,840 --> 00:13:39,200 Speaker 1: is a particularly bad habit, but I think Arlan is, 260 00:13:39,240 --> 00:13:43,120 Speaker 1: like, debugging software, I think, rather than texting. The Journal 261 00:13:43,120 --> 00:13:46,760 Speaker 1: reported that even after dinner with other founders, Arlan will 262 00:13:46,800 --> 00:13:51,680 Speaker 1: go to meetings with potential clients until one a.m. He said, quote, 263 00:13:51,880 --> 00:13:54,640 Speaker 1: I'm trying to be in the sprint mode always. I 264 00:13:54,679 --> 00:13:57,840 Speaker 1: never tried to divide working on startup and my social life. 265 00:13:57,960 --> 00:14:01,280 Speaker 1: And Arlan even convinced his father and brother to move to 266 00:14:01,280 --> 00:14:05,240 Speaker 1: the US, where they're working on their own AI startups. Again, 267 00:14:05,679 --> 00:14:07,200 Speaker 1: he's eighteen years old. 268 00:14:07,480 --> 00:14:09,720 Speaker 4: What does he mean when he says he never tried 269 00:14:09,760 --> 00:14:12,040 Speaker 4: to divide his startup and his social life? 270 00:14:12,200 --> 00:14:14,600 Speaker 1: Yeah, I mean, so, you know, it's not like he 271 00:14:14,720 --> 00:14:17,920 Speaker 1: never socializes with other people. Like, the startup founders obviously 272 00:14:18,000 --> 00:14:21,080 Speaker 1: understand that having a network is a key part of succeeding. 273 00:14:21,440 --> 00:14:24,240 Speaker 1: They don't waste their time, and they don't socialize for fun. 274 00:14:24,640 --> 00:14:26,600 Speaker 1: But apparently one of the things which is in vogue 275 00:14:26,800 --> 00:14:28,960 Speaker 1: as a way of socializing is something you know quite 276 00:14:29,000 --> 00:14:30,840 Speaker 1: a lot about, which is book clubs. 277 00:14:31,560 --> 00:14:32,280 Speaker 2: Interesting. 278 00:14:32,720 --> 00:14:34,960 Speaker 1: I didn't realize that you run a book club. I do. 279 00:14:35,080 --> 00:14:38,040 Speaker 1: What do you think motivates your members to be part 280 00:14:38,080 --> 00:14:39,480 Speaker 1: of the book club? It's called Bellatris. 281 00:14:39,760 --> 00:14:40,400 Speaker 2: Yes, thank you. 282 00:14:40,560 --> 00:14:44,600 Speaker 4: Check it out on Instagram and on the website, bellatris 283 00:14:44,680 --> 00:14:47,800 Speaker 4: dot com. I think when we started, what motivated 284 00:14:47,840 --> 00:14:51,440 Speaker 4: people was this idea that we were increasingly on our 285 00:14:51,480 --> 00:14:54,480 Speaker 4: phones all the time.
Like in the early twentieth century, 286 00:14:54,640 --> 00:14:58,400 Speaker 4: when paperbacks really started to hit the market, people were 287 00:14:58,400 --> 00:14:59,520 Speaker 4: worried that people would be... 288 00:14:59,440 --> 00:14:59,960 Speaker 2: Reading too much. 289 00:15:00,040 --> 00:15:03,920 Speaker 4: Too much. Now, I think a lot of people have said 290 00:15:03,960 --> 00:15:06,600 Speaker 4: to me, you know, I'm devoted to reading in the 291 00:15:06,640 --> 00:15:08,760 Speaker 4: way that I'm devoted to exercise, in the sense that 292 00:15:09,080 --> 00:15:12,000 Speaker 4: it's bringing my mind offline in a way that not 293 00:15:12,160 --> 00:15:12,840 Speaker 4: much else is. 294 00:15:13,080 --> 00:15:15,160 Speaker 1: And I think one of the interesting things in Silicon 295 00:15:15,240 --> 00:15:18,560 Speaker 1: Valley, there's this thing going on where there's interest in 296 00:15:18,920 --> 00:15:22,520 Speaker 1: Roman history and military history, and all this kind 297 00:15:22,520 --> 00:15:25,280 Speaker 1: of idea that to be a successful founder, yes, there's 298 00:15:25,280 --> 00:15:27,600 Speaker 1: a lot of hustle, but also kind of living in 299 00:15:27,640 --> 00:15:31,120 Speaker 1: the world of grand ideas and intellectual frameworks and stuff 300 00:15:31,200 --> 00:15:32,000 Speaker 1: is also important. 301 00:15:32,480 --> 00:15:34,400 Speaker 4: One of the things that's surprising about this article, and 302 00:15:34,640 --> 00:15:37,120 Speaker 4: what you're describing in it, is, like, I was under the 303 00:15:37,160 --> 00:15:40,800 Speaker 4: impression that gen Z was, like, quiet quitting and taking 304 00:15:40,840 --> 00:15:41,840 Speaker 4: time for themselves. 305 00:15:42,120 --> 00:15:44,040 Speaker 1: Yeah, I think that comes back to what I was 306 00:15:44,080 --> 00:15:46,800 Speaker 1: saying at the beginning, which is there's probably, like, there 307 00:15:46,800 --> 00:15:51,000 Speaker 1: may be a bifurcation between, like, elder gen Z and 308 00:15:51,080 --> 00:15:55,840 Speaker 1: younger gen Z. I mean, economically, life before the 309 00:15:55,840 --> 00:16:00,200 Speaker 1: AI revolution... like, the corporate job market was much more 310 00:16:00,200 --> 00:16:03,640 Speaker 1: secure than it is now, and also, like, new fortunes were 311 00:16:03,640 --> 00:16:05,960 Speaker 1: not being made. Like, the tech companies were so dominant, 312 00:16:05,960 --> 00:16:08,440 Speaker 1: there was all that focus on how you needed antitrust 313 00:16:08,760 --> 00:16:12,520 Speaker 1: regulation, otherwise no new companies would emerge. And then gen 314 00:16:12,600 --> 00:16:15,080 Speaker 1: AI came along and it just shattered the landscape. And 315 00:16:15,120 --> 00:16:19,480 Speaker 1: so now you have these extraordinary, you know, dreamers like 316 00:16:19,520 --> 00:16:21,840 Speaker 1: Arlan moving from Kazakhstan to try and make it 317 00:16:21,840 --> 00:16:24,400 Speaker 1: in the Valley. It's something that's always happened, I think. 318 00:16:24,440 --> 00:16:27,360 Speaker 1: But this, you know, return of hustle culture with a 319 00:16:27,440 --> 00:16:29,560 Speaker 1: vengeance is really interesting. There was a bit in the 320 00:16:29,600 --> 00:16:32,520 Speaker 1: story that really made me think, which is that a 321 00:16:32,560 --> 00:16:34,920 Speaker 1: bunch of the young founders refer to this concept of 322 00:16:35,040 --> 00:16:37,000 Speaker 1: nine-nine-six. Do you know what nine-nine-six is?
323 00:16:37,280 --> 00:16:39,760 Speaker 4: I do. But please describe it, because I want people to 324 00:16:39,720 --> 00:16:41,840 Speaker 1: know. Well, nine-nine-six is actually a concept that 325 00:16:41,880 --> 00:16:45,080 Speaker 1: emerged in China, and this was basically a kind of 326 00:16:45,320 --> 00:16:48,600 Speaker 1: work as hard as you can, nine a.m. to nine p.m., 327 00:16:48,720 --> 00:16:50,720 Speaker 1: six days a week. So you get one day 328 00:16:50,720 --> 00:16:53,080 Speaker 1: off, and otherwise you work twelve hours a day. And 329 00:16:53,160 --> 00:16:55,200 Speaker 1: this was kind of like a little bit of a 330 00:16:55,200 --> 00:16:57,160 Speaker 1: boogeyman, of, like, people look at China and think, 331 00:16:57,160 --> 00:16:58,760 Speaker 1: oh my god, this is a whole generation of young 332 00:16:58,800 --> 00:17:01,320 Speaker 1: people who are killing themselves by working too hard in 333 00:17:01,360 --> 00:17:04,040 Speaker 1: the tech sector, and, you know, actually suicides, and there's 334 00:17:04,040 --> 00:17:07,159 Speaker 1: a lot of horrible kind of incendiary reporting around this 335 00:17:07,200 --> 00:17:10,560 Speaker 1: nine-nine-six culture. But it's now evidently, like, a 336 00:17:10,640 --> 00:17:13,919 Speaker 1: meme, not so dissimilar to the pirate flag, that's been absorbed 337 00:17:14,000 --> 00:17:17,119 Speaker 1: by a certain sector of the youth population of Silicon Valley 338 00:17:17,359 --> 00:17:19,080 Speaker 1: as something to aspire to. And I just find that 339 00:17:19,080 --> 00:17:19,960 Speaker 1: really, really interesting. 340 00:17:20,160 --> 00:17:23,280 Speaker 4: It is very strange, because we came of age in 341 00:17:23,359 --> 00:17:27,480 Speaker 4: a time where burnout culture was such a buzzword. This 342 00:17:27,560 --> 00:17:31,000 Speaker 4: is literally touting burnout culture as the way to live 343 00:17:31,320 --> 00:17:33,240 Speaker 4: and as the way to make money, which is interesting. 344 00:17:33,240 --> 00:17:36,960 Speaker 4: I mean, if you think of prior generations of immigrant populations, 345 00:17:37,000 --> 00:17:39,840 Speaker 4: like, burnout was a way to build wealth in this country. 346 00:17:39,840 --> 00:17:42,359 Speaker 4: If you came to this country, or if your relatives 347 00:17:42,359 --> 00:17:43,959 Speaker 4: came to this country, it wasn't like... 348 00:17:44,160 --> 00:17:46,280 Speaker 4: if you owned a dry cleaner, for example, you... 349 00:17:46,480 --> 00:17:48,560 Speaker 1: It was nine-nine-six. There was no work... 350 00:17:48,359 --> 00:17:51,200 Speaker 4: life balance at the dry cleaner. And I think what's 351 00:17:51,240 --> 00:17:53,479 Speaker 4: interesting is it's the type of businesses that people are 352 00:17:53,480 --> 00:17:56,800 Speaker 4: building that has changed. But there has always been a 353 00:17:56,840 --> 00:17:59,280 Speaker 4: mentality in this country of, like, we come to America 354 00:18:00,080 --> 00:18:01,000 Speaker 4: to make wealth. 355 00:18:01,080 --> 00:18:03,399 Speaker 1: See, I think you're right. To your point, AI is 356 00:18:03,520 --> 00:18:05,960 Speaker 1: exactly that. Like, it's a chimera, like, it's a dream, 357 00:18:06,000 --> 00:18:07,800 Speaker 1: it's anything you want it to be.
And so it's 358 00:18:07,840 --> 00:18:09,919 Speaker 1: literally a moment where, like, for a certain group of 359 00:18:09,920 --> 00:18:12,679 Speaker 1: people it feels like anything's possible, although of course the 360 00:18:12,720 --> 00:18:15,080 Speaker 1: reality is most of them will be very burnt out 361 00:18:15,119 --> 00:18:17,480 Speaker 1: and not have very valuable businesses at the end of it. 362 00:18:17,680 --> 00:18:19,920 Speaker 4: You know, your pal Geoff Hinton, who was a Tech 363 00:18:19,960 --> 00:18:23,479 Speaker 4: Stuff guest and the so-called godfather of AI, recently 364 00:18:23,520 --> 00:18:26,320 Speaker 4: gave an interview to the Financial Times where he said, quote, 365 00:18:26,480 --> 00:18:29,080 Speaker 4: AI will make a few people much richer and most 366 00:18:29,119 --> 00:18:32,000 Speaker 4: people poorer. So I think we kind of are aware 367 00:18:32,000 --> 00:18:34,119 Speaker 4: of this on a macro level, but it's interesting to 368 00:18:34,119 --> 00:18:36,840 Speaker 4: see it play out in the Petri dish of Silicon Valley, 369 00:18:36,880 --> 00:18:40,359 Speaker 4: you know, how hard these kids are willing to fight 370 00:18:40,480 --> 00:18:42,600 Speaker 4: for a slice of the AI pie while, like, all 371 00:18:42,680 --> 00:18:44,600 Speaker 4: of these entry-level jobs are being slashed. 372 00:18:45,359 --> 00:18:48,920 Speaker 1: After the break, some more headlines. The FTC is taking 373 00:18:48,960 --> 00:18:53,360 Speaker 1: Amazon to trial, Albania has an AI anti-corruption minister, 374 00:18:54,000 --> 00:18:57,959 Speaker 1: and Russia has an AI-generated television show. Then, on 375 00:18:58,040 --> 00:19:01,720 Speaker 1: Chat and Me, chat helps one Tech Stuff listener navigate 376 00:19:01,840 --> 00:19:13,000 Speaker 1: a scary diagnosis. Stay with us. We're back, and in 377 00:19:13,000 --> 00:19:15,000 Speaker 1: honor of UN week, we've got a couple more 378 00:19:15,080 --> 00:19:18,840 Speaker 1: global headlines for you. But first, Kara, are you an 379 00:19:18,880 --> 00:19:20,119 Speaker 1: Amazon Prime member? 380 00:19:20,760 --> 00:19:21,240 Speaker 2: I am. 381 00:19:21,560 --> 00:19:23,480 Speaker 1: Do you remember when you signed up? 382 00:19:23,520 --> 00:19:25,280 Speaker 2: Oh God, years ago, years ago. 383 00:19:25,320 --> 00:19:27,639 Speaker 1: Probably the free delivery got you easily. 384 00:19:27,720 --> 00:19:29,879 Speaker 2: Yeah, me too, just hauling in that Prime. 385 00:19:30,240 --> 00:19:33,320 Speaker 1: Yeah, it's like, I want to be USDA Prime. 386 00:19:33,560 --> 00:19:36,600 Speaker 1: So that's the good thing about Prime. However, the 387 00:19:36,640 --> 00:19:40,080 Speaker 1: Federal Trade Commission is taking Amazon to trial this week 388 00:19:40,359 --> 00:19:44,320 Speaker 1: over claims they tricked millions of people into signing up for Prime. 389 00:19:44,359 --> 00:19:46,639 Speaker 1: I did it with my eyes open, but apparently millions 390 00:19:46,640 --> 00:19:49,040 Speaker 1: of others did not. And the lawsuit was actually 391 00:19:49,040 --> 00:19:52,840 Speaker 1: first filed, interestingly, during the Biden administration, back in twenty 392 00:19:52,880 --> 00:19:55,560 Speaker 1: twenty-three, but the trial just started on Monday. 393 00:19:56,200 --> 00:19:58,560 Speaker 2: Why does the FTC think that people were tricked? 394 00:19:58,880 --> 00:20:00,800 Speaker 1: Well, do you want to hear from the complaint? I do.
395 00:20:01,240 --> 00:20:03,280 Speaker 2: I'm not going to read it. 396 00:20:03,359 --> 00:20:09,240 Speaker 1: The complaint filed by the FTC says, quote, Amazon used manipulative, coercive, 397 00:20:09,560 --> 00:20:14,480 Speaker 1: or deceptive user interface designs known as dark patterns to 398 00:20:14,560 --> 00:20:20,240 Speaker 1: trick consumers into enrolling in automatically renewing Prime subscriptions. Canceling a 399 00:20:20,240 --> 00:20:24,240 Speaker 1: subscription was made intentionally more difficult than signing up for one, 400 00:20:24,680 --> 00:20:28,520 Speaker 1: and the complaint calls the cancelation process, quote, labyrinthine. Is 401 00:20:28,560 --> 00:20:30,960 Speaker 1: that how you say it? Labyrinthine? 402 00:20:31,040 --> 00:20:32,919 Speaker 2: I think it's labyrinthine. Labyrinthine. 403 00:20:33,359 --> 00:20:38,640 Speaker 1: Get this. Internally at Amazon, the cancelation process was called, quote, 404 00:20:39,040 --> 00:20:43,520 Speaker 1: an unspoken cancer, and it was referred to as the 405 00:20:43,560 --> 00:20:46,760 Speaker 1: Iliad flow, as in the Greek epic poem about 406 00:20:46,800 --> 00:20:49,800 Speaker 1: the protracted Trojan War. 407 00:20:50,119 --> 00:20:53,800 Speaker 4: I love these nerds. I mean, just internally, 408 00:20:53,840 --> 00:20:54,680 Speaker 4: what they call things. 409 00:20:54,680 --> 00:20:57,240 Speaker 2: But what does the Iliad flow entail? 410 00:20:58,320 --> 00:21:00,720 Speaker 1: Well, it kind of entails what you'd imagine, which 411 00:21:00,760 --> 00:21:04,439 Speaker 1: is it being extremely hard to get from A to B. 412 00:21:04,640 --> 00:21:06,840 Speaker 1: And they knew, and they were making 413 00:21:06,840 --> 00:21:09,560 Speaker 1: a joke about it. But specifically, the complaint says that 414 00:21:09,560 --> 00:21:11,919 Speaker 1: in order to go through the process of unsubscribing on 415 00:21:11,960 --> 00:21:16,480 Speaker 1: the site, customers had to navigate through four pages, six clicks, 416 00:21:16,600 --> 00:21:19,880 Speaker 1: and fifteen various options. And there were also a ton 417 00:21:19,920 --> 00:21:24,119 Speaker 1: of sort of chances along the way to unknowingly stop canceling: 418 00:21:24,560 --> 00:21:28,840 Speaker 1: warnings about losing benefits, promotional offers, all intended to distract 419 00:21:28,920 --> 00:21:32,879 Speaker 1: you or convince you not to cancel. Whereas, guess how 420 00:21:32,920 --> 00:21:34,879 Speaker 1: many clicks it takes to sign up for Prime? 421 00:21:35,200 --> 00:21:35,720 Speaker 4: Not a lot. 422 00:21:37,640 --> 00:21:41,160 Speaker 1: I'm actually surprised it wasn't just one, but absolutely. Since 423 00:21:41,200 --> 00:21:44,160 Speaker 1: the court filing, Amazon has actually already changed the process. 424 00:21:44,680 --> 00:21:48,560 Speaker 1: There's now a well-labeled and easy-to-understand cancelation page. 425 00:21:48,960 --> 00:21:50,920 Speaker 1: But the outcome of the trial is still worth keeping 426 00:21:50,920 --> 00:21:53,639 Speaker 1: an eye on, because it could impact the future of 427 00:21:53,680 --> 00:21:56,680 Speaker 1: canceling subscriptions for all kinds of companies. And it is 428 00:21:56,880 --> 00:22:01,119 Speaker 1: unbelievably hard and annoying to cancel subscriptions.
The worst subscription 429 00:22:01,240 --> 00:22:06,280 Speaker 1: cancelation I ever had was the Times Literary Supplement. We 430 00:22:06,320 --> 00:22:07,159 Speaker 1: had to call. We had... 431 00:22:07,240 --> 00:22:09,080 Speaker 2: To call! Which is run by Abacus, 432 00:22:10,520 --> 00:22:13,560 Speaker 4: if I'm not mistaken. Actually, I've watched my mom 433 00:22:13,600 --> 00:22:15,560 Speaker 4: almost be brought to tears by a cancelation. 434 00:22:16,080 --> 00:22:18,280 Speaker 2: It's evil. Some of the cancelation... 435 00:22:17,880 --> 00:22:19,560 Speaker 1: Tactics are evil. Dark patterns. 436 00:22:19,640 --> 00:22:24,560 Speaker 2: It feels malicious. 437 00:22:24,680 --> 00:22:26,960 Speaker 1: Hi, listeners. This is Oz. I wanted to let you know that 438 00:22:27,000 --> 00:22:30,159 Speaker 1: the day after we recorded the episode, there was actually 439 00:22:30,240 --> 00:22:33,760 Speaker 1: an outcome to the trial. Amazon settled with the FTC 440 00:22:34,000 --> 00:22:37,920 Speaker 1: for two point five billion dollars. By settling, they admitted 441 00:22:38,080 --> 00:22:40,560 Speaker 1: no wrongdoing. But we'll keep an eye out on what 442 00:22:40,600 --> 00:22:45,520 Speaker 1: this outcome means for subscription cancelation overall. But now back 443 00:22:45,560 --> 00:22:53,800 Speaker 1: to our episode and our next short headline. How much 444 00:22:53,840 --> 00:22:55,800 Speaker 1: do you know about Albanian politics? 445 00:22:56,160 --> 00:22:58,640 Speaker 2: Literally nothing, which is embarrassing, but nothing. 446 00:22:58,560 --> 00:23:00,479 Speaker 1: It's not that embarrassing. I've actually been there not too 447 00:23:00,520 --> 00:23:04,119 Speaker 1: long ago. It's really quite a fascinating country that's had 448 00:23:04,119 --> 00:23:08,600 Speaker 1: a meteoric rise post-communism. It's in this very strategic 449 00:23:08,640 --> 00:23:11,600 Speaker 1: location between East and West, and so, you know, the 450 00:23:11,960 --> 00:23:14,840 Speaker 1: US and Europeans are quite supportive. But there are also terrible 451 00:23:14,840 --> 00:23:17,639 Speaker 1: problems with organized crime. And actually a lot of the 452 00:23:18,040 --> 00:23:22,600 Speaker 1: drug trafficking from South America through the port of Rotterdam 453 00:23:22,720 --> 00:23:28,320 Speaker 1: is controlled by Albanian organized crime. And recently Albania became 454 00:23:28,359 --> 00:23:31,040 Speaker 1: the first country in the world to appoint an 455 00:23:31,240 --> 00:23:32,960 Speaker 1: AI-generated government minister. 456 00:23:33,200 --> 00:23:34,040 Speaker 2: How does that work? 457 00:23:35,119 --> 00:23:39,680 Speaker 1: Good question. I'll start with her name, Diella, which means 458 00:23:39,760 --> 00:23:43,440 Speaker 1: sun in Albanian. Diella has actually been around since January. 459 00:23:44,080 --> 00:23:47,640 Speaker 1: She started life as an AI assistant designed to help 460 00:23:47,680 --> 00:23:50,000 Speaker 1: people navigate online government services. 461 00:23:50,080 --> 00:23:52,040 Speaker 2: So she's actually worked her way up. 462 00:23:52,720 --> 00:23:57,159 Speaker 1: This is a real, uh, yeah... 463 00:23:57,240 --> 00:24:02,840 Speaker 1: a rags-to-riches journey for Diella. She was recently promoted to cabinet 464 00:24:02,840 --> 00:24:07,120 Speaker 1: minister in charge of public procurement. Essentially, she'll be analyzing 465 00:24:07,200 --> 00:24:10,440 Speaker 1: government contracts for corruption.
Would you like to meet her? 466 00:24:10,560 --> 00:24:13,160 Speaker 2: Oh, I'm dying to meet her. Also, this feels ripe 467 00:24:13,240 --> 00:24:13,840 Speaker 2: for corruption. 468 00:24:14,200 --> 00:24:16,400 Speaker 1: Well, we'll get to that in a minute, 469 00:24:16,440 --> 00:24:18,080 Speaker 1: but first, here's Diella. 470 00:24:18,160 --> 00:24:23,199 Speaker 3: [Diella speaks in Albanian.] 471 00:24:25,000 --> 00:24:27,080 Speaker 2: So this is an AI-generated person that I'm looking at. 472 00:24:27,320 --> 00:24:31,480 Speaker 1: This is an AI-generated person wearing traditional Albanian dress 473 00:24:31,960 --> 00:24:35,320 Speaker 1: against the backdrop of the Albanian flag and the EU 474 00:24:35,119 --> 00:24:37,320 Speaker 2: flag. It's state-sponsored TV. 475 00:24:38,280 --> 00:24:41,200 Speaker 1: She's talking in Albanian, but luckily there are subtitles. She says, 476 00:24:41,720 --> 00:24:45,320 Speaker 1: some have labeled me unconstitutional because I'm not a human being. 477 00:24:45,760 --> 00:24:49,359 Speaker 1: That hurt me, not for myself, but for the nine 478 00:24:49,440 --> 00:24:53,080 Speaker 1: hundred and seventy-two thousand interactions I had with citizens 479 00:24:53,080 --> 00:24:58,240 Speaker 1: whom I served as part of e-Albania. 480 00:24:58,280 --> 00:25:02,359 Speaker 1: Well, that's the beginning of her career, when she 481 00:25:02,440 --> 00:25:05,840 Speaker 1: was bootstrapping. Yeah, exactly. She emphasized that as an AI 482 00:25:06,160 --> 00:25:09,320 Speaker 1: she has no ambitions or personal interests, which makes her 483 00:25:09,400 --> 00:25:12,480 Speaker 1: perfectly suited to the job of making sure government contracts are 484 00:25:12,560 --> 00:25:16,520 Speaker 1: impartial and corruption-free. The Prime Minister of Albania, Edi Rama, 485 00:25:17,000 --> 00:25:20,560 Speaker 1: said Diella would help make Albania, quote, a country where 486 00:25:20,600 --> 00:25:27,360 Speaker 1: public tenders are one hundred percent free of corruption. One 487 00:25:27,600 --> 00:25:31,359 Speaker 1: hundred percent free of corruption? Methinks the Prime Minister 488 00:25:31,400 --> 00:25:36,760 Speaker 1: doth protest too much, and some citizens of Albania are 489 00:25:36,840 --> 00:25:41,400 Speaker 1: also quite dubious. One Facebook user quoted in The Guardian said, 490 00:25:42,040 --> 00:25:44,800 Speaker 1: in Albania, even Diella will be corrupted. 491 00:25:44,960 --> 00:25:48,120 Speaker 4: I mean, talk about AI hallucination. AI corruption, I think, 492 00:25:48,200 --> 00:25:50,399 Speaker 4: is the next frontier. You know, it's funny that you 493 00:25:50,440 --> 00:25:52,720 Speaker 4: bring this up, because my next headline is also about 494 00:25:52,720 --> 00:25:56,199 Speaker 4: state-sponsored AI. I read this article in 404 495 00:25:56,200 --> 00:25:59,880 Speaker 4: Media about a news satire show which recently premiered 496 00:26:00,040 --> 00:26:04,040 Speaker 4: on Russian state TV, and it's apparently all AI-generated. 497 00:26:04,320 --> 00:26:08,800 Speaker 1: So what does state-funded political AI slop entail? 498 00:26:09,320 --> 00:26:12,919 Speaker 4: So, according to ads for the show, they've got a 499 00:26:13,000 --> 00:26:17,639 Speaker 4: neural network deciding the content and topics for each episode, 500 00:26:17,800 --> 00:26:22,159 Speaker 4: and then that same neural network uses AI to generate video.
501 00:26:22,520 --> 00:26:25,600 Speaker 4: So it's been creating content, like, you know, amazing stuff 502 00:26:25,640 --> 00:26:29,320 Speaker 4: like President Trump singing songs, French President Emmanuel Macron wearing 503 00:26:29,359 --> 00:26:32,359 Speaker 4: hair curlers and a pink robe. An ad for the 504 00:26:32,359 --> 00:26:36,400 Speaker 4: show claims, quote, the editorial team's opinion may not coincide 505 00:26:36,400 --> 00:26:38,920 Speaker 4: with the AI's, though usually it does. 506 00:26:39,119 --> 00:26:42,840 Speaker 1: Do you think that's true? It has to be. 507 00:26:43,320 --> 00:26:45,720 Speaker 4: Polit Stacker, which is the name of the show, 508 00:26:45,840 --> 00:26:49,040 Speaker 4: is not just news but a tough breakdown of political 509 00:26:49,119 --> 00:26:52,720 Speaker 4: madness from a digital host who notices what others overlook. 510 00:26:53,200 --> 00:26:56,480 Speaker 4: Russia's Ministry of Defense actually owns the TV channel where 511 00:26:56,480 --> 00:26:56,959 Speaker 4: it airs. 512 00:26:57,359 --> 00:26:59,679 Speaker 1: I'm curious about this show, though. I mean, is it 513 00:26:59,760 --> 00:27:02,720 Speaker 1: obvious that it's made by AI? How does the 514 00:27:02,760 --> 00:27:04,119 Speaker 1: AI show up in the show? 515 00:27:04,440 --> 00:27:09,200 Speaker 4: Well, there's a data scientist named Kalev Leetaru from Georgetown. 516 00:27:09,240 --> 00:27:12,080 Speaker 4: He's the one who found the show originally, and he says, 517 00:27:12,080 --> 00:27:14,800 Speaker 4: at times, if you don't know it was AI, you 518 00:27:14,880 --> 00:27:17,280 Speaker 4: might never guess; it looks a lot like other 519 00:27:17,359 --> 00:27:20,720 Speaker 4: Russian propaganda. And he says further, if they are using 520 00:27:20,800 --> 00:27:23,000 Speaker 4: AI to the degree that they say they are, even 521 00:27:23,040 --> 00:27:25,800 Speaker 4: if it's just to pick topics, they've mastered that formula 522 00:27:25,880 --> 00:27:27,840 Speaker 4: in a way that others have not. You know, there 523 00:27:27,880 --> 00:27:31,120 Speaker 4: are parts that are unmistakably AI, like deep fake interviews 524 00:27:31,119 --> 00:27:33,840 Speaker 4: with world leaders. In one of those interviews, Trump says 525 00:27:34,280 --> 00:27:37,400 Speaker 4: he'll end the Russia-Ukraine war by building a casino 526 00:27:37,440 --> 00:27:40,879 Speaker 4: in Moscow with golden toilets. But the AI seems to 527 00:27:40,920 --> 00:27:44,040 Speaker 4: have Trump's pattern of speech down pretty well. To quote 528 00:27:44,119 --> 00:27:46,800 Speaker 4: him in one of these videos: all the Russian oligarchs, 529 00:27:46,960 --> 00:27:50,000 Speaker 4: they would all be inside, all their money would be inside. 530 00:27:50,160 --> 00:27:50,959 Speaker 2: Problem solved. 531 00:27:51,000 --> 00:27:53,400 Speaker 4: They would just play poker and forget about the whole war. 532 00:27:53,960 --> 00:27:56,399 Speaker 4: A very bad deal for them, very distracting. 533 00:27:58,080 --> 00:28:02,280 Speaker 1: It's funny, you know, one of the big fears when generative AI 534 00:28:02,440 --> 00:28:05,960 Speaker 1: really took off was that we would all be tricked and 535 00:28:06,080 --> 00:28:09,600 Speaker 1: transfixed by deep fakes of world leaders and, you know, 536 00:28:09,640 --> 00:28:14,720 Speaker 1: political events would be dictated by fake media.
Instead, what 537 00:28:14,800 --> 00:28:17,280 Speaker 1: seems to have happened is that deep fakes have kind 538 00:28:17,320 --> 00:28:21,760 Speaker 1: of evolved the medium of the meme. Like, instead of 539 00:28:21,800 --> 00:28:24,360 Speaker 1: a meme, now you kind of create a deep fake 540 00:28:24,400 --> 00:28:26,320 Speaker 1: scenario of the thing you're trying to make fun of. 541 00:28:26,359 --> 00:28:29,080 Speaker 1: And it's funny that, you know, Trump is talking about 542 00:28:29,240 --> 00:28:34,280 Speaker 1: building a casino in Moscow in this Russian satire AI show, 543 00:28:34,720 --> 00:28:39,200 Speaker 1: because on Trump's own AI satire social media channels, he's 544 00:28:39,320 --> 00:28:44,960 Speaker 1: making content about turning Gaza into a riviera. So there's 545 00:28:44,960 --> 00:28:49,440 Speaker 1: a funny, like, you know, doubleness here, where it's odd 546 00:28:49,480 --> 00:28:54,000 Speaker 1: that the primary expression of deep fake content right now 547 00:28:54,160 --> 00:29:08,120 Speaker 1: is political satire. And now it's time for our final 548 00:29:08,200 --> 00:29:12,040 Speaker 1: segment of the day, Chat and Me. Today's submission is 549 00:29:12,080 --> 00:29:15,160 Speaker 1: from Jeremy, who is an early adopter of ChatGPT, 550 00:29:15,800 --> 00:29:19,240 Speaker 1: a self-described chat evangelist, and of course a Tech 551 00:29:19,280 --> 00:29:19,959 Speaker 1: Stuff listener. 552 00:29:20,520 --> 00:29:23,000 Speaker 4: I like anyone who's a self-described Tech Stuff listener. 553 00:29:23,080 --> 00:29:26,200 Speaker 1: I like anyone who's a Tech Stuff listener, self-described or not. 554 00:29:26,240 --> 00:29:26,600 Speaker 1: Thank you. 555 00:29:26,920 --> 00:29:30,200 Speaker 4: So, while Jeremy uses chat for all sorts of tasks, 556 00:29:30,200 --> 00:29:33,440 Speaker 4: there's one in particular that he thinks is of note. 557 00:29:33,840 --> 00:29:36,880 Speaker 3: One of the biggest ways it helps me is, like, 558 00:29:37,040 --> 00:29:40,600 Speaker 3: health tracking, both me and my dog, actually. What I 559 00:29:40,640 --> 00:29:45,240 Speaker 3: do is I log my Apple Watch stats after hikes, workouts, 560 00:29:45,360 --> 00:29:48,360 Speaker 3: and then I upload that information into ChatGPT and 561 00:29:48,400 --> 00:29:50,720 Speaker 3: it just breaks those numbers down in a way that 562 00:29:50,840 --> 00:29:53,080 Speaker 3: actually makes sense to me. So I can kind of 563 00:29:53,120 --> 00:29:54,800 Speaker 3: see my progress over time. 564 00:29:55,480 --> 00:29:57,920 Speaker 4: There is a plot twist coming. Jeremy doesn't just track 565 00:29:58,000 --> 00:30:01,200 Speaker 4: his workouts. He also tracks his food with the help 566 00:30:01,200 --> 00:30:04,680 Speaker 4: of ChatGPT, and he actually has a really powerful incentive. 567 00:30:04,880 --> 00:30:08,400 Speaker 3: I was recently diagnosed with pre-diabetes, so I'm trying 568 00:30:08,400 --> 00:30:10,280 Speaker 3: to kind of be a little more careful about what I eat. 569 00:30:10,680 --> 00:30:13,920 Speaker 3: Instead of just guessing at what I eat and its nutritional 570 00:30:14,000 --> 00:30:18,800 Speaker 3: value, I basically just log my daily foods. 571 00:30:18,880 --> 00:30:21,320 Speaker 3: Whatever I eat, I take a picture of what I eat, 572 00:30:21,560 --> 00:30:24,480 Speaker 3: and I just upload it to ChatGPT and it kind 573 00:30:24,480 --> 00:30:28,360 Speaker 3: of just breaks it all down.
You know, the calorie count, sugar, 574 00:30:28,400 --> 00:30:29,400 Speaker 3: protein that are there. 575 00:30:29,640 --> 00:30:31,960 Speaker 1: You know, I can imagine if you're dealing with a 576 00:30:32,000 --> 00:30:35,560 Speaker 1: diagnosis like pre-diabetes, not only do you have to 577 00:30:35,600 --> 00:30:39,520 Speaker 1: come to terms with this new health reality emotionally, you 578 00:30:39,600 --> 00:30:42,840 Speaker 1: also very practically have to build a new routine that 579 00:30:42,880 --> 00:30:46,760 Speaker 1: takes into account all kinds of new health priorities. You know, obviously, 580 00:30:46,840 --> 00:30:49,680 Speaker 1: the doctors can give you strict guidelines about what to 581 00:30:49,760 --> 00:30:52,480 Speaker 1: do in order to make sure your blood sugar levels 582 00:30:52,480 --> 00:30:55,800 Speaker 1: don't spike, in terms of proteins and fats and carbohydrates. 583 00:30:55,880 --> 00:30:59,560 Speaker 1: But actually what Jeremy's doing is translating that into real life, 584 00:31:00,120 --> 00:31:03,440 Speaker 1: seeing what ChatGPT says about it and how that may 585 00:31:03,760 --> 00:31:06,600 Speaker 1: track with the guidelines he's trying to follow. We've talked 586 00:31:06,640 --> 00:31:11,280 Speaker 1: a lot about medicine and AI, and the classic question is 587 00:31:11,320 --> 00:31:16,240 Speaker 1: always, like, can the AI out-diagnose a doctor, which in 588 00:31:16,280 --> 00:31:18,800 Speaker 1: many cases it seems like it can. But the majority 589 00:31:18,800 --> 00:31:23,440 Speaker 1: of medicine is not really diagnosis. It's actually preventative health 590 00:31:23,680 --> 00:31:29,040 Speaker 1: and patient compliance with doctors' recommendations and stuff. And so 591 00:31:29,080 --> 00:31:31,360 Speaker 1: this is actually the first I've really heard of not 592 00:31:31,520 --> 00:31:34,240 Speaker 1: using AI so much just to diagnose an illness, but 593 00:31:34,320 --> 00:31:36,720 Speaker 1: actually to turn it into a tool to help manage 594 00:31:36,720 --> 00:31:38,120 Speaker 1: your condition. And I think it's pretty cool. 595 00:31:38,200 --> 00:31:41,719 Speaker 4: Yeah, it's a little bit like your medical concierge that 596 00:31:41,840 --> 00:31:43,840 Speaker 4: leaves the doctor's office with you. 597 00:31:43,880 --> 00:31:43,960 Speaker 1: Exactly that. 598 00:31:44,680 --> 00:31:47,480 Speaker 1: Don't forget, we want to hear from you, our listeners, 599 00:31:47,840 --> 00:31:51,760 Speaker 1: so please send your chat stories to our inbox, tech 600 00:31:51,800 --> 00:32:00,600 Speaker 1: stuff podcast at gmail dot com. 601 00:32:00,720 --> 00:32:02,320 Speaker 2: That's it for this week for Tech Stuff. 602 00:32:02,320 --> 00:32:05,520 Speaker 1: I'm Kara Price and I'm Oz Voloshin. This episode was 603 00:32:05,520 --> 00:32:09,520 Speaker 1: produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It 604 00:32:09,560 --> 00:32:12,640 Speaker 1: was executive produced by me, Kara Price, and Kate Osborne 605 00:32:12,640 --> 00:32:17,640 Speaker 1: for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer 606 00:32:17,680 --> 00:32:21,360 Speaker 1: is Beheed Fraser, and Jack Insley mixed this episode. Kyle 607 00:32:21,440 --> 00:32:22,680 Speaker 1: Murdoch wrote our theme song.
608 00:32:23,160 --> 00:32:25,800 Speaker 4: Join us next Wednesday for Tech Stuff: The Story, for a 609 00:32:25,840 --> 00:32:28,760 Speaker 4: look at a beloved American industry that is benefiting from and 610 00:32:28,800 --> 00:32:31,680 Speaker 4: being destroyed by AI: Hollywood. 611 00:32:32,240 --> 00:32:35,080 Speaker 1: Please rate and review the show wherever you listen to podcasts, 612 00:32:35,160 --> 00:32:37,680 Speaker 1: and reach out to us at tech stuff podcast at 613 00:32:37,680 --> 00:32:39,719 Speaker 1: gmail dot com. We love hearing from you.