Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. I'm Oz Woloshyn, and today Karah Preiss and I will bring you the headlines this week, including how Interpol is keeping up with new types of crime. Then, on Tech Support, we'll talk to the Washington Post's Drew Harwell about one woman's three-year, twenty-four-seven live stream experiment.
Speaker 2: Some of the people I talked to who were Emily's fans, they would go to sleep listening to Emily's voice.
Speaker 1: All of that on The Week in Tech. It's Friday, May ninth.
Speaker 1: So Karah, we have become quite fascinated on this program by personas online who aren't exactly what they seem.
Speaker 3: That is absolutely right.
Speaker 4: Last week, you'll remember, we did the deep dive on AI John Cena, the Meta bot getting into all kinds of illegal sexual situations, which was very disturbing.
Speaker 1: Yeah, that's right. I've got a story this week, though, about real people pretending to be something they're not online, and a novel way of catching them out.
Speaker 3: Is that called a dating app?
Speaker 1: Let me tell you, this is basically geopolitical catfishing. According to cybersecurity experts, thousands of North Korean infiltrators are getting hired by Fortune five hundred companies.
Speaker 4: So the North Koreans who aren't fighting with Russia on the battlefields of Ukraine are working for large American corporations.
Speaker 1: That's exactly right. And then once they get hired, they collect their wages, but they also steal intellectual property and insert malware.
Speaker 3: So in a way, this is another battlefield.
Speaker 1: That's very well put, Karah. There's a publication called The Register which reported that these North Korean infiltrators have gone undetected by masking their IP addresses: they create farms of laptops which are physically situated in the US, and then they basically pay Americans to allow them to remotely dial into these laptop farms. They're also, of course, using AI to write job applications, as everyone is, and making fake LinkedIn profiles.
Speaker 1: Once they actually get hired, they tend to do very well, because they have a whole army of other North Koreans helping them do their jobs in the background.
Speaker 3: So they have actual coworkers?
Speaker 1: They have coworkers. There's a front man who gets the job, and then dozens of people who help them do an excellent job. And part of the story is actually that even when companies become aware, they don't want to fire these people, because they're so much better than all the other employees.
Speaker 3: So they get to keep their jobs.
Speaker 1: I think they probably do, in the end, have to get rid of them. One cybersecurity expert, however, has found a foolproof way to catch out a suspected North Korean infiltrator in a job interview.
Speaker 3: That's something that I need to know in my everyday life.
Speaker 1: It all starts with a question. Do you know what the question is?
Speaker 3: I have no idea.
Speaker 1: How fat is Kim Jong Un? Ask me the question.
Speaker 3: How fat is Kim Jong Un?
Speaker 1: Well, you won't have to wait long. A North Korean infiltrator would immediately close out of the video conference, because even contemplating that question is treasonous.
Speaker 3: Wow, that's so interesting.
Speaker 1: So that's the fake people of the week story.
Speaker 4: I'll have one next week for you, maybe about me. So, you love stories about fake people online. I love peanut M&M's. Very good match.
Speaker 3: Very good. You know who eats peanut M&M's?
Speaker 4: Probably Kim Jong Un. He may do, he may, if you can get an M&M in.
Speaker 1: Big red wine enthusiast.
Speaker 4: French red wine, very impressive, of course, not Chateau Diana. So, speaking of Chateau Diana and peanut M&M's, there's a new app that is gaining popularity, that a lot of my friends actually use, and it's this app that tells you how guilty your guilty snacking pleasure is.
Speaker 1: Like a calorie counter app or what?
Speaker 3: Sort of. It's, like, calorie counter adjacent.
Speaker 1: It's called Yuka, like the vegetable. That's correct, but it also has "yuck" in it.
Speaker 3: I just made up a new word. That's correct. It's correct.
Speaker 4: That's correct. For those of you, like Oz, who are unfamiliar with it, the app actually lets you scan the barcodes of different foods and personal products, and then gives them a score based on how healthy they are. Everything that I've ever scanned is like, you will die.
Speaker 1: So you basically take a photo of the barcode and upload it to the app, and then it kind of spits out what the product has in it.
Speaker 3: That's right, that's right.
Speaker 1: How did the M&M's do?
Speaker 3: Zero out of one hundred.
Speaker 1: Is that actually true?
Speaker 3: Yes.
Speaker 1: Yes? With the nuts in them?
Speaker 3: That's what I said.
Speaker 4: They contain at least six additives, four of which it rated as high risk.
Speaker 3: Whatever high risk means.
Speaker 4: It's already... I'm too far gone, because I eat peanut M&M's every single night.
Speaker 3: I call them my night chocolate.
Speaker 1: You know, I read that the app sometimes offers healthier alternatives for products that you've scanned or looked up, and our producer Tori actually tried scanning some Twinkies. Sadly, there are no alternatives.
Speaker 4: This is like when people say, oh, do you want to have sex-free sex? You know what I'm saying? It's just that to me. Well, it's like how I feel about all these alternatives. So I actually have a friend who was pushing Yuka on me so hard, and I'm like, leave me out of this drama. I don't want to see what's in the products that I'm eating. But it actually looks like she's not alone. Unsurprisingly, our dear RFK Junior and his wife both use Yuka.
Speaker 1: This is the app you use if you want to have the feeling of the United States Secretary of Health and Human Services constantly whispering over your shoulder about food additives. And indeed, those M&M's have Red 40, which...
Speaker 4: ...is so sad, because red M&M's are my favorite M&M's. You know, I think it speaks to a sort of health-conscious, Make America Healthy moment people are having.
Speaker 3: Like, if you've been to...
Speaker 4: ...the grocery store recently, they're putting protein on everything, like protein waffles.
Speaker 3: Like, why do I need protein in my popcorn? It's just popcorn.
Speaker 1: You know...
Speaker 4: There are sodas with extra fiber and probiotics in them. You've got Steak 'n Shake transitioning away from using seed oils in French fries. Like, if I go to Steak 'n Shake, I want to eat a French fry. I'm not trying to eat, like, an avocado oil French fry.
Speaker 1: The Wall Street Journal reported that Yuka has sixty-eight million users worldwide, and an average of twenty-five thousand new US users have joined daily since the beginning of this year. Twenty-five thousand people every day. At the beginning of May, Yuka ranked as the number one health and fitness app in Apple's App Store. Not only that, major food brands like Campbell's and Chobani have responded to customers complaining about the ingredients they find in their products while using Yuka.
Speaker 4: I don't want to think that we are all becoming RFK Junior. But I think, at least once you download the app, you can think that you're being health conscious, even if it's one time.
Speaker 1: I mean, I think the thing is, like, if you're reaching for a bag of M&M's, no judgment, you kind of know it's not the best thing anyway.
Speaker 4: Anything that I reach for, it's like, I don't need an app to tell me if I'm doing something right.
Speaker 1: Now, I think if you're choosing between, like, different frozen dinners, for example, and one is, like, categorically better than the other, so it's an input to a switching decision, versus, like, should I eat the M&M's, I can imagine it being more useful.
Speaker 4: Yes, and I do think people are interested in finding healthy alternatives, which is an interesting thing.
Speaker 3: I guess that comes out of Yuka. Yes, that's absolutely true.
Speaker 1: Well, some people are looking into what's going into their food, others into what's going into their classrooms.
Speaker 3: Very nice.
Speaker 1: Thank you. You've got a headline for us about this.
Speaker 4: I do have a news story for you about what's going on in the classroom, a place I haven't been in at least fifteen years. Recently, over two hundred and fifty CEOs, from Microsoft's Satya Nadella to Josh Kushner's Karlie Kloss to the CEO of the College Board, signed an open letter calling for computer science and AI to be, quote, a core part of US kindergarten-through-twelve curricula. The letter states that taking just one high school computer science course can boost students' future wages by eight percent, regardless of career path or college attendance.
Speaker 1: It's one of these moments where we were at peak "everyone should learn to code," and then it was like "no one should learn to code," and now it's like "people should learn to code again."
Speaker 4: I know. I learned to use a computer very young, and my income has not been boosted by eight percent, I'll tell you that.
Speaker 1: The letter doesn't really specify how this curriculum should be developed and rolled out, but it does point to countries like Singapore, China, and South Korea as examples of countries that have done this successfully. The letter reads, quote, in the age of AI, we must prepare our children for the future, to be AI creators, not just consumers.
Speaker 1: A basic foundation in computer science and AI is crucial for helping every student thrive in a technology-driven world. Without it, they risk falling behind.
Speaker 4: Yeah, there's a huge investment in AI education from countries that want to get a long-term edge in the AI race, like the UAE. The Emirati school system will add AI as a subject in the upcoming school year, and it will include concepts like ethical awareness and real-world applications. Schools in Beijing will start offering AI courses in September as well.
Speaker 1: Here in the US, President Trump recently signed an executive order calling to emphasize AI competency in schools. The executive order also called for the establishment of a Presidential Artificial Intelligence Challenge, a nationwide competition for students and educators to demonstrate their AI skills.
Speaker 4: This reminds me of the Bass Pro fishing Python Hunter Bowl in Florida, which is something I really loved. But no, honestly, it seems like kids are demonstrating their AI skills, just not in the ways that teachers would perhaps like them to. New York Magazine actually ran an article titled, quote, "Everyone Is Cheating Their Way Through College," and there was actually a statistic in there that was really striking, which is that in a survey of one thousand college students, nearly ninety percent (the other ten percent are lying) had used ChatGPT for homework help. And this was back in twenty twenty-three, before ChatGPT became, you know, every eighty-year-old's best friend.
Speaker 1: Yeah, that last ten percent, I think, has probably been accounted for since twenty twenty-three. One of the students said, quote, with ChatGPT, I can write an essay in two hours that normally takes twelve. It can't be easy being a K-through-twelve teacher or college professor these days. Some of them have tried devising their own ways to detect ChatGPT usage in their students' essays.
Speaker 1: Others say you can tell when an essay is written by a chatbot because it's written clunkily or uses random words. But of course, these models are just getting better and better.
Speaker 4: I mean, we got away with it in the script, didn't we? No, I'm kidding. But the way that I would have exploited ChatGPT at sixteen, seventeen, eighteen. You know, all I had back in my day was FreeTranslation dot com, which allowed me to write long French essays.
Speaker 1: And did it work well enough? I guess it depends on...
Speaker 3: Ask my B-plus, baby. Ask my B-plus.
Speaker 1: So our next headline is about competition between the US and China, but not in the classroom, and not really in AI; rather, in the realm of vehicles. The Wall Street Journal ran this headline with the story: "What a Fifteen-Thousand-Dollar Electric SUV Says About the US-China Car Rivalry." The car in question is the Toyota bZ3X, which is a compact electric SUV about the same size as the Toyota RAV4, which of course is ubiquitous here in the US. But the car has a jaw-droppingly low price tag.
Speaker 4: If an electric car costs fifteen thousand dollars, it would be, like, a plug-and-chug driver.
Speaker 3: It's amazing.
Speaker 1: Yeah. Well, you can thank China's supply chains for the price of the car. Toyota are obviously a Japanese company, but the cars are localized in different parts of the world, and this SUV is made in China using Chinese batteries and Chinese driver-assistance technology. But when Toyotas are sold in the US, the supply chain is way more expensive. There's a similar model available here for about forty thousand dollars. So someone might ask, why don't you buy the Chinese version and import it? Well, that would be the most beautiful word in the English language: tariffs. Except in this case, Biden-era tariffs, which put a one hundred percent tariff on Chinese EVs.
Speaker 4: And I don't see Trump reversing that one.
Speaker 1: That's not one of the ones he's going to be running back, I guess, for sure.
Speaker 4: But it's a weird thing, how much we live in this parallel universe with China. Like, fifteen thousand versus forty thousand is, I guess, a one hundred percent tariff.
Speaker 3: There you go. That's that.
Speaker 1: What's more, in China, people buy local car brands that we've never heard of here in the US: Zeekr, Aion. There is one that's slightly more well known, called BYD. Not "by me." And the technology around these Chinese EVs is very impressive. CATL, a Chinese battery manufacturer, recently showed off a new EV battery that can take on three hundred miles of charge in just five minutes.
Speaker 3: That's insane.
Speaker 1: These companies in China, these battery companies and car companies, are really, really pulling ahead in the race.
Speaker 3: That's incredible.
Speaker 4: It just seems like cars are a really good reflection of the sort of siloing off of global economies, at least between the US and China, and with the addition of more tariffs on Chinese imports, this probably isn't going to change anytime soon.
Speaker 3: Bye-bye, globalization.
Speaker 1: There is one area, though, where countries still work together, and that's Interpol.
Speaker 3: You know, you're right about that.
Speaker 4: And I wanted to tell you a little bit about this Financial Times story that I read about how Interpol has been adapting to the technologies modern criminals are using. Interpol, you know, is of course the International Criminal Police Organization.
Speaker 1: Of course I didn't know that. I thought you would.
Speaker 3: You're supposed to know all this stuff.
Speaker 1: I thought it was French, Interpol.
Speaker 4: So let me say it again for you: International Criminal Police Organization. And they have been engaged in a technological arms race with the world's most wanted criminals. I always think of Inspector Gadget when I think of this stuff.
Speaker 1: The article is a great read, and it describes Interpol's innovation lab, which is in Singapore, with an opening scene, quote: a fleet of underwater drones, gleaming and ready for action, is lined up along the wall. Nearby, a small armory of brightly colored three-D-printed guns is displayed on a side table. A robot dog named Ino lies prone on the floor, waiting to be activated.
Speaker 3: There's something filthy...
Speaker 4: When I first read that, I was like, you know, I do not like the way they're talking about you in this article. "He lies prone on the floor." That's how you know the FT is a British newspaper. But yeah, the FT actually interviewed the head of Interpol's digital forensics team, and he said that the advancement of technology in the last couple of years is the biggest he's seen, which is saying a lot, because he's actually been working there since the late nineties to keep up with the evolution of cybercrime. Interpol actually opened the Singapore lab in twenty fifteen, and these days one of their main focuses is identifying AI-enabled scams, which, as you know, and as we know as a listenership, are getting more and...
Speaker 3: ...more sophisticated with the use of deepfakes.
Speaker 4: Just a decade ago we were dealing with the Nigerian prince scam, and now we are trying to tackle deepfake romance scams, sextortion, and multimillion-dollar phishing attacks. So when the FT reporter visited the Interpol lab, it was monitoring nearly three point five million attempted cyberattacks, and he was told that that was fairly typical.
Speaker 1: I especially liked the section about the robot canine units. Some of the models Interpol has are the size of a German shepherd, and they can run up to seven and a half miles per hour, and they can jump pretty high. And they can also be sort of two-way microphone systems, carrying audio messages, which apparently can be quite handy in hostage situations.
Speaker 4: Yeah. So, say what you want about technology being used for morally dubious ends, but if I were the hostage in that situation, at least my confusion would distract from my panic. Like, you'd be waiting for your captors to untie you while they argue with a robot dog.
Speaker 3: Also, robot dogs can't retire, or get sick.
Speaker 4: I actually just heard this story about how bomb squad unit dogs have to retire with their owners, essentially.
Speaker 1: So the New York police dogs, basically, they're one-person dogs.
Speaker 4: And if their handlers retire, they're like, whoa, wow, come to my retirement party.
Speaker 1: Whereas these robot dogs...
Speaker 4: Exactly. The robot's completely owner-agnostic.
Speaker 3: Yeah.
Speaker 4: But one of the big takeaways from the piece, this FT piece, is that even with the Innovation Lab, Interpol's job is never over.
Speaker 1: We talked about dogs. Now it's time for a game of cat and mouse.
Speaker 4: Criminals will catch up, and vice versa. You know, take ghost guns, for example. They are popular amongst criminals because, unlike traditional firearms, they do not have serial numbers. And so Interpol is now trying to figure out how to link a ghost gun to its specific printer by analyzing the composition of the materials, in order to figure out its origin.
Speaker 1: I joked about cat and mouse games, but you can just imagine how the next innovation will be disguising the variable compositions of these ghost guns. We've got a couple more headlines to run through, starting with another crime story. The Guardian reports that a trial over the road-rage killing of a man called Chris Pelkey is underway, starring Chris Pelkey: an AI-generated Pelkey appeared in a video calling for forgiveness for the man accused of shooting him, in what may be the first AI-delivered victim impact statement ever delivered in a courtroom.
Speaker 1: I'm going to play it for you now.
Speaker 3: In another life, we probably could have been friends. I believe in forgiveness, and in God who forgives. I always have, and I still do.
Speaker 3: "I love..." What does he keep saying?
Speaker 1: "I love that AI," the judge shouted out afterwards. "I love that AI."
Speaker 3: Oh my god. Yeah, incredible.
Speaker 4: You know...
Speaker 1: The script was written by Pelkey's sister and brother-in-law. They fed the AI model, you know, images and video of Pelkey, but they actually wrote the script, and I thought it was quite moving that they would want to go so far above and beyond to give a victim impact statement asking for forgiveness for the shooter.
Speaker 4: It's incredible. Also, this is essentially a deepfake of her brother. And I've seen a lot of deepfakes; this one is very good, other than that it looked sort of computer generated. It is computer generated. So, in other deepfake news, do you remember that picture of the late Pope Francis wearing a Moncler puffer jacket?
Speaker 1: Yeah, the ski chic. It was like Supreme Pope.
Speaker 4: It was the Supreme Pope. It was AI generated, but beloved nonetheless. And now we sort of have a sequel to this. The White House posted an AI-generated picture of President Trump dressed as the Pope: hat, robe, cross, everything.
Speaker 1: True. Respect, that's right.
Speaker 4: This post on X comes weeks after the passing of Pope Francis, and days after Trump said to the media, quote, I'd like to be Pope. There's been plenty of backlash, from state leaders to the New York archbishop, but when the BBC asked a spokesperson for the Vatican to comment, they declined.
Speaker 1: Finally, move over, Hershey, Pennsylvania. According to The New York Times, SpaceX is building a company town, officially. Residents of an area surrounding SpaceX's launch site at the southern tip of Texas have voted to create a city called Starbase.
Speaker 1: Starbase will be home to some three thousand five hundred SpaceX employees, and the proposed city boundaries include land owned by the company and planned areas to build more housing. SpaceX has filed paperwork with the state of Texas to build a school, a power plant, and of course, a sushi restaurant near Musk's house. Yeah, that's where his main residence is now, and where he voted. Apparently, the new designation will also allow SpaceX to close a nearby beach for rocket launches without the permission of the wider community.
Speaker 4: And today we take you out on a joke from Karah Preiss, which is: yes, I work at the Starbucks in Starbase. That's good.
Speaker 2: That's very good.
Speaker 1: We're going to take a quick break now, and then we're joined by the Washington Post's Drew Harwell to learn about the three-year live streaming marathon of EmilyCC.
Speaker 4: Stay with us. I can't help but wonder, would I watch someone drink a Starbucks from Starbase on a live stream, or...
Speaker 1: ...eat a Starburst from Stargate? This brings us to our next segment, which is a story I can't stop thinking about. For a lot of teens and young adults, sitting down and watching your favorite streamer play a video game or live-react to an event is as natural as watching the Kardashians. And the Bravo of the live streaming world is Twitch, a subsidiary of Amazon. Some of Twitch's most popular streamers have tens of thousands of paid subscribers and millions of views on their streams. They include political commentators like Hasan Piker, gamers like Ninja, and marathon streamers like Kai Cenat, who will stream for hours at a time, interacting with their subscribers and even performing requested stunts, for a fee, of course.
Speaker 4: But there's one streamer in particular, a twenty-eight-year-old Texas woman who goes by the username EmilyCC, who has taken this concept of the live stream marathon to a whole different level.
Speaker 4: For the past three years, Emily has streamed her life twenty-four hours a day, seven days a week. She streams while driving, sleeping, shopping, and only disappears from the camera to use the restroom.
Speaker 1: As you can imagine, it is a huge sacrifice to broadcast your life nonstop. I think I read the last time she went on a date was seven years ago.
Speaker 4: This, to me, is, like, peak parasocial. And what surprises me most is that she has over three hundred and twenty thousand followers.
Speaker 1: Yeah. It's part social experiment, it's part next-generation reality TV, it's part monetizing your own life as the hustle, and it encapsulates everything you said in terms of this increasingly personal, parasocial way that people interact with other people they don't know online. And recently, The Washington Post published a profile of EmilyCC, and we're thrilled to have the author, the technology reporter Drew Harwell, here with us today. Drew, welcome to Tech Stuff.
Speaker 2: Thanks for having me.
Speaker 1: So I actually have Twitch dot tv slash EmilyCC open in front of me right now. Emily is unfortunately asleep, while her dog is awake in the crate in front of her bed, and she has the subhead on her Twitch: "I'm in an article, Wash Post, exclamation mark, social record, exclamation mark." So evidently she wasn't too upset with your reporting. But how did this come about?
Speaker 2: I cover creators for The Post, and one day I was just sort of, you know, procrastinating on X, and I saw this clip of Kai Cenat. He's like the biggest star there is on Twitch.
Speaker 2: He was doing this month-long stream called Mafiathon from his mansion, and between a couple of his stunts, they were just kind of sitting around the computer, and they were like, hey, let's look up the longest-streaming Twitch person we can. And they pulled up Emily, and she was just shocked, right? Because she was at home, like, playing some goofy video game. And they both had this connection, and she was crying, and Kai was really blown away. And she talked about, you know, how she had been streaming for three years, twenty-four seven, never stopped, how she was so tired, and yet she felt like she couldn't quit the stream because she was so committed to it. And it just struck me as such a fascinating human story, because the things she has to do to abide by this, like, crazy challenge are wild. But it's also just a great example of how inhuman the demands on streamers can be, right? I mean, these are people who really want to stand out on the Internet, and to do so, they have to push themselves to the limits. And they do so not knowing whether it'll pay off, not knowing whether they'll even be streaming to anybody, or just, you know, five people, and it's just sort of wasted time. So it just struck me as, you know, a fascinating example of so many things we have to deal with on the modern Internet.
Speaker 1: How did you persuade her to sit for the story? And then, were you in the Twitch stream while you were doing your reporting? I guess you were.
Speaker 2: So I'll start with her. I just reached out to her. You know, I like doing profiles on creators, influencers, and the creator economy. I like doing narrative journalism, and so I reach out to them and say, hey, just let me tell your story. I want to understand you. I want our readers to understand you. We have a pretty old reader base, so a lot of this stuff is very new to them. But I think, you know, in my mind, I always see them as labor stories.
Speaker 2: These are people who see the Internet as a career, and, you know, the workplace for them is their home and the Internet, and they don't really have a lot of labor protections. They work for faceless companies who they hope to make money from, but who basically don't care if they live or die, right? Because there's always going to be another twenty-three-year-old streamer. So yeah, it just took reaching out to Emily, and, you know, she was very game. And it was funny, because everything I talked with her about was on stream, and it was a really new experience for me. Crazy, you know. It was crazy. It was great. When I do these stories, I have a kind of pre-interview phase where I talk to them over the phone. I walk them through this whole weird experience of having a reporter shadow them. And so we did some phone calls, we did some video calls over Discord, which is kind of, like, the Twitch communication platform of choice. And then, yeah, of course, I spent time with her in Austin. All of it was live streamed, and so I could see in the chat, as a thousand people were watching, they were commenting on every question I was asking. And, you know, I'm asking sensitive questions about, like, her lack of sex life, her bad experiences with her parents, and how depressed she is. And this whole time, the commenters are either making fun of the question or saying, like, oh, well, I knew you would ask about that. Or, you know, if I ask a nice question, they're like, "oh, W," like a win, for Mister Washington Post. Or they'd call me "unc," as in, like, uncle, like the old guy. So it was just so funny. And, you know, for her, it was like the most usual, normal thing in the world to have people just commenting on that.
Speaker 2: But actually, when I was in Austin with her, she stepped away to the bathroom for a minute, and I was just sitting alone in front of her computer while everybody was still watching me. Everybody was posting these comments, and I felt my face flush. It's just such an unusual experience to have so many people watching you. And I really felt in that moment like I understood more of what she put herself through. Our lizard brains are not built to process this amount of attention at all times, and yet that was just normal for her. So it gave me a good sense of what she has to go through.
Speaker 4: That's an incredible image, just imagining the journalist, as the subject moves away, sort of doing exactly what the subject is used to doing, and just, yes, being looked at in that way.
Speaker 2: It was wild. I mean, you could basically still pull up the clips now of me with her, like, petting her dog. You know, it's wild to see it on the other side of the screen, and to see that place where she lives, not just in the box on my monitor, but actually in real life.
Speaker 1: It's like you got to watch The Truman Show, and then play a character in it for a moment, and then leave again. Talk about the record. This is as of Friday, May ninth. How long will Emily have been doing this for?
Speaker 2: She has been online, streaming twenty-four seven, for one thousand, two hundred and seventy-nine days. And that's twenty-four seven. You know, she does go to sleep, she takes showers, you know, there's stuff when she's not on camera, but all of that time the camera has been on, recording, and you can actually see it in the data. It's wild. Like, nobody really attempts that. It is a crazy record, probably for good reason.
Speaker 2: But there are a lot of marathon streamers who do long streams: a month, a week, two months. And they put themselves through these wild challenges, right, where they'll lock themselves into a closet. Or, like Kai Cenat, he basically created his own circus, with, like, over-the-top stunts and celebrity cameos. And, you know, it's a really interesting kind of entertainment, because we're used to these disparate blocks of, like, I watch a TV show for thirty minutes. But these streamers, like, they foster that parasocial relationship with people, where it's like, my fans are going to wake up, they're going to turn on my Twitch, they're going to see what I'm doing, they're going to watch me all day. They're just going to have me in a tab on their computer, thinking about me. And some of the people I talked to who were Emily's fans, they would go to sleep listening to Emily's voice, and she was the background noise of their life. And the stuff Emily puts herself through is, I think, fascinating. But also, these people who devote their lives to her are really interesting too, because they really do see Emily as a friend. And the more time I spent with Emily, the more I got it. If I'm going to be spending a lot of time alone by myself, it makes sense to just have this person kind of there. It's sort of like there's a person in the room, like I have a roommate. And, you know, if I can relate to somebody like Emily, if I can relate to the people I see on TV, why wouldn't I want them around all the time? So you kind of get where it comes from. We can kind of, like, scrutinize: are these real relationships? Are these real friendships? Like, are they real?
Speaker 1: You know?
Speaker 2: Is this just, like, an illusion? But I think people do kind of get something from it, and I think it's kind of interesting to understand the gift and the curse of how this stuff comes together.
604 00:29:02,880 --> 00:29:04,840 Speaker 4: Can you just talk a little bit about how she 605 00:29:04,920 --> 00:29:08,200 Speaker 4: got started doing this? Because it started differently than it 606 00:29:08,240 --> 00:29:08,640 Speaker 4: is now. 607 00:29:09,200 --> 00:29:11,280 Speaker 2: So she was an only child. She grew up, you know, 608 00:29:11,320 --> 00:29:14,880 Speaker 2: on screens, watching TV shows and video games. She was nineteen, 609 00:29:14,960 --> 00:29:18,280 Speaker 2: she was working at a CVS as a cashier, not 610 00:29:18,360 --> 00:29:20,760 Speaker 2: really knowing what she wanted to do with her life. 611 00:29:20,840 --> 00:29:22,760 Speaker 2: A lot of her friends had moved away. She kind 612 00:29:22,800 --> 00:29:24,520 Speaker 2: of stayed home. She didn't have a lot of money. 613 00:29:25,280 --> 00:29:28,480 Speaker 2: She had a boyfriend who was obsessed with playing video games, 614 00:29:29,000 --> 00:29:31,080 Speaker 2: and she would go over to his house and kind 615 00:29:31,080 --> 00:29:34,239 Speaker 2: of bring her laptop and basically entertain herself while he 616 00:29:34,320 --> 00:29:38,400 Speaker 2: was busy gaming. And I think, basically to put her off, 617 00:29:38,880 --> 00:29:41,360 Speaker 2: he said, you know, why don't you, like, stream yourself 618 00:29:41,360 --> 00:29:43,680 Speaker 2: on Twitch? Like, you've got no friends, right? Like, you 619 00:29:43,800 --> 00:29:45,800 Speaker 2: just do your own thing on the internet. And she 620 00:29:45,880 --> 00:29:47,840 Speaker 2: saw that as, like, okay, maybe this is a way 621 00:29:47,880 --> 00:29:51,040 Speaker 2: to build friendships and make friends. And so she started 622 00:29:51,040 --> 00:29:53,640 Speaker 2: like a lot of streamers do, where she was playing, 623 00:29:53,720 --> 00:29:58,400 Speaker 2: you know, World of Warcraft and multiplayer games and recording 624 00:29:58,440 --> 00:30:00,680 Speaker 2: her face in the box on the corner of the screen. 625 00:30:00,880 --> 00:30:02,920 Speaker 2: And you can go back and actually see the statistics 626 00:30:02,960 --> 00:30:05,400 Speaker 2: of her first streams, like nobody was watching, right? A 627 00:30:05,440 --> 00:30:08,680 Speaker 2: couple people here and there. And she just kept doing 628 00:30:08,720 --> 00:30:11,000 Speaker 2: it, and she felt, I think, a sense of purpose 629 00:30:11,040 --> 00:30:14,000 Speaker 2: from it, because she wasn't getting purpose from her school, 630 00:30:14,480 --> 00:30:17,040 Speaker 2: wasn't really feeling it in her job, and this was 631 00:30:17,080 --> 00:30:19,080 Speaker 2: something that she could do on her own, devote her 632 00:30:19,120 --> 00:30:21,280 Speaker 2: life to. And she just kept doing it and doing 633 00:30:21,360 --> 00:30:23,040 Speaker 2: it and doing it. And this happens with a lot 634 00:30:23,080 --> 00:30:25,880 Speaker 2: of creators, where you know they are really driven to 635 00:30:26,600 --> 00:30:28,960 Speaker 2: be the best they can be, and their computer is 636 00:30:29,000 --> 00:30:31,160 Speaker 2: always there, their webcam is always there. So she just 637 00:30:31,240 --> 00:30:33,840 Speaker 2: really kind of fell into it. And now, jump to 638 00:30:34,520 --> 00:30:37,520 Speaker 2: three years later, it has become her life, and you know, 639 00:30:37,600 --> 00:30:39,800 Speaker 2: she doesn't even really know a way to stop, because 640 00:30:39,800 --> 00:30:42,440 Speaker 2: it's been so, so critical to how she lives.
641 00:30:43,520 --> 00:30:45,360 Speaker 4: And the numbers are important, but I want to also 642 00:30:45,400 --> 00:30:48,560 Speaker 4: talk about the money. Like, she has paying subscribers, but 643 00:30:48,600 --> 00:30:50,760 Speaker 4: can she actually live on this wage? 644 00:30:51,240 --> 00:30:52,880 Speaker 2: Let me break down kind of the revenue of how 645 00:30:52,880 --> 00:30:55,400 Speaker 2: this works for creators, because it is really interesting. I 646 00:30:55,400 --> 00:30:59,200 Speaker 2: think she can make a living wage. She's not a millionaire. 647 00:30:59,200 --> 00:31:01,719 Speaker 2: There is a class of people who are millionaires on Twitch; 648 00:31:02,600 --> 00:31:05,080 Speaker 2: like a lot of creator platforms, it's a one percent problem. 649 00:31:05,080 --> 00:31:06,880 Speaker 3: There's the one percent of Twitch. 650 00:31:06,800 --> 00:31:08,959 Speaker 2: Totally, kind of like the one percent of Hollywood, right, 651 00:31:08,960 --> 00:31:11,720 Speaker 2: and the one percent of major league sports, where there's 652 00:31:12,120 --> 00:31:15,320 Speaker 2: a lot of people who just basically scrape by. She's 653 00:31:15,440 --> 00:31:17,760 Speaker 2: kind of, I would say, maybe the upper middle class. 654 00:31:17,800 --> 00:31:21,640 Speaker 2: It sounds like, based off of her following, she makes 655 00:31:21,680 --> 00:31:25,360 Speaker 2: probably around five thousand dollars a month, maybe more, maybe 656 00:31:25,360 --> 00:31:28,400 Speaker 2: ten thousand dollars a month, so it's nothing to sneer at. 657 00:31:28,800 --> 00:31:31,880 Speaker 2: And on Twitch, if you get a number of followers, 658 00:31:32,080 --> 00:31:35,080 Speaker 2: you can get a share of the subscriptions. People pay 659 00:31:35,720 --> 00:31:39,200 Speaker 2: Twitch six dollars a month to subscribe to her channel. 660 00:31:39,240 --> 00:31:41,520 Speaker 2: You can watch her for free, but if you subscribe, 661 00:31:41,560 --> 00:31:45,600 Speaker 2: you get special emojis, you can, you know, send her messages. 662 00:31:45,640 --> 00:31:48,680 Speaker 2: You get these little perks of subscriptions. So there's that. 663 00:31:48,880 --> 00:31:51,680 Speaker 2: Then on Twitch, people can donate to you outright. They 664 00:31:51,720 --> 00:31:54,760 Speaker 2: can give you tips. A lot of times people will 665 00:31:55,240 --> 00:31:57,880 Speaker 2: drop into your stream, maybe make fun of you or 666 00:31:58,000 --> 00:32:00,880 Speaker 2: do a crazy sound effect. There's been situations where, like, 667 00:32:01,160 --> 00:32:03,040 Speaker 2: Emily will go out to the grocery store or an 668 00:32:03,040 --> 00:32:05,200 Speaker 2: elevator, and someone will prank her by, like, paying a 669 00:32:05,240 --> 00:32:07,840 Speaker 2: dollar to play a fart noise over her cell 670 00:32:07,840 --> 00:32:11,479 Speaker 2: phone, just to kind of embarrass her in person. So, you know, 671 00:32:11,600 --> 00:32:14,640 Speaker 2: she lives alone. She lives in an apartment. She doesn't 672 00:32:14,680 --> 00:32:16,840 Speaker 2: really do anything, but she makes more than she would 673 00:32:16,840 --> 00:32:19,200 Speaker 2: have made at CVS. And this is something she talked 674 00:32:19,200 --> 00:32:22,520 Speaker 2: about, where she feels like this is capitalism. She's just 675 00:32:22,600 --> 00:32:25,680 Speaker 2: making a living.
However, you know, she felt 676 00:32:25,840 --> 00:32:28,880 Speaker 2: like, I could finish at community college and 677 00:32:29,000 --> 00:32:31,440 Speaker 2: stay working at CVS, or I could do this other 678 00:32:31,520 --> 00:32:34,200 Speaker 2: risky thing. And why would I not do it? Because 679 00:32:34,200 --> 00:32:36,680 Speaker 2: I'm making more than I ever would have, you know, 680 00:32:36,840 --> 00:32:38,120 Speaker 2: moving groceries around. 681 00:32:38,280 --> 00:32:40,720 Speaker 1: What's it costing her to do this? I don't mean 682 00:32:40,720 --> 00:32:42,800 Speaker 1: financially so much as personally. 683 00:32:43,520 --> 00:32:46,320 Speaker 2: Yeah, it's costing her a lot, I think, and she 684 00:32:46,360 --> 00:32:49,760 Speaker 2: talks about this pretty openly. You know, for three years 685 00:32:49,760 --> 00:32:52,880 Speaker 2: she has devoted her life to being online all the time. 686 00:32:53,000 --> 00:32:54,640 Speaker 2: So there's kind of the tangible stuff. 687 00:32:54,720 --> 00:32:54,880 Speaker 4: Right. 688 00:32:55,200 --> 00:32:59,760 Speaker 2: She wasn't flying anywhere, because then she would be offline 689 00:32:59,760 --> 00:33:02,160 Speaker 2: for a little bit, so she couldn't stream. She was 690 00:33:02,200 --> 00:33:05,400 Speaker 2: saying no to wedding invitations, right? She's talked about, she 691 00:33:05,440 --> 00:33:07,600 Speaker 2: went to, like, a club with a friend one time 692 00:33:07,640 --> 00:33:09,680 Speaker 2: and the service was really bad in the club, so 693 00:33:09,720 --> 00:33:12,400 Speaker 2: she had to leave. And she's talked about losing friends, 694 00:33:12,440 --> 00:33:15,760 Speaker 2: because not everybody wants to be on a stream while 695 00:33:16,160 --> 00:33:18,640 Speaker 2: hanging out with her. She doesn't have a good relationship 696 00:33:18,680 --> 00:33:21,440 Speaker 2: with her parents. Her parents don't really understand what she's doing. 697 00:33:21,520 --> 00:33:23,400 Speaker 2: They don't want to be on stream. So there's a 698 00:33:23,440 --> 00:33:26,120 Speaker 2: lot of kind of human losses. But I think also, 699 00:33:26,280 --> 00:33:30,120 Speaker 2: you know, she started to wonder just about the opportunity cost. 700 00:33:30,480 --> 00:33:35,440 Speaker 2: When you're devoting twenty-four seven to your stream, you're 701 00:33:35,440 --> 00:33:37,120 Speaker 2: not doing a bunch of other things that might make 702 00:33:37,160 --> 00:33:40,120 Speaker 2: you happier, might lead to something else. And so, you know, 703 00:33:40,200 --> 00:33:43,480 Speaker 2: she's still making those judgments every day, but there's a 704 00:33:43,520 --> 00:33:45,840 Speaker 2: lot that goes into it that she's had to sacrifice 705 00:33:46,000 --> 00:33:48,840 Speaker 2: just to be part of this crowd. 706 00:33:49,720 --> 00:33:52,040 Speaker 1: Where do you put this in, let's say, the history 707 00:33:52,080 --> 00:33:56,000 Speaker 1: of, like, reality TV experiments? There was a famous documentary 708 00:33:56,080 --> 00:33:59,280 Speaker 1: series recently about the Japanese man who lived in that 709 00:33:59,440 --> 00:34:02,120 Speaker 1: room being broadcast for, like, two years and, you know, 710 00:34:02,600 --> 00:34:05,000 Speaker 1: was really at the edge of his sanity by the 711 00:34:05,080 --> 00:34:07,800 Speaker 1: end of that experience.
The Contestant, right. And there's a 712 00:34:07,840 --> 00:34:10,399 Speaker 1: woman you mentioned in your piece called Jennifer Ringley, who 713 00:34:10,800 --> 00:34:12,920 Speaker 1: had a continuous broadcast of her life back in the 714 00:34:12,960 --> 00:34:16,240 Speaker 1: early two thousands. I guess, what's new and what's old, 715 00:34:16,280 --> 00:34:18,160 Speaker 1: and what does all of this say about us? 716 00:34:18,640 --> 00:34:22,080 Speaker 2: So I find them really interesting examples, because they show 717 00:34:22,160 --> 00:34:24,839 Speaker 2: that some of this is not new, and I think 718 00:34:24,840 --> 00:34:29,680 Speaker 2: it speaks to this human impulse in us that desires 719 00:34:29,719 --> 00:34:33,560 Speaker 2: to watch people be people and to just look at 720 00:34:33,560 --> 00:34:36,239 Speaker 2: them through the glass. It's why we watch reality TV. 721 00:34:36,560 --> 00:34:38,600 Speaker 2: It's why I watch reality TV, right? I want to 722 00:34:38,640 --> 00:34:43,440 Speaker 2: see people interact in these situations and process that. I 723 00:34:43,480 --> 00:34:46,800 Speaker 2: think what's changed from that is that it has become 724 00:34:46,840 --> 00:34:50,239 Speaker 2: normalized to the point where we are all creators in 725 00:34:50,239 --> 00:34:52,560 Speaker 2: a way, right? We use social media all the time, 726 00:34:52,680 --> 00:34:56,480 Speaker 2: we perform for our family and friends on the internet. 727 00:34:56,800 --> 00:35:00,239 Speaker 2: It's just become something that seems not so crazy anymore. 728 00:35:00,280 --> 00:35:02,280 Speaker 2: And I think it's kind of expanded the Overton window 729 00:35:02,320 --> 00:35:04,920 Speaker 2: on what we feel is acceptable, when, you know, one 730 00:35:04,960 --> 00:35:06,600 Speaker 2: hundred years ago it would have been crazy that we 731 00:35:06,640 --> 00:35:09,960 Speaker 2: would ever watch somebody in all of these private moments, 732 00:35:10,320 --> 00:35:11,560 Speaker 2: or even make money off of it. 733 00:35:11,719 --> 00:35:16,200 Speaker 1: There's also this kind of algorithmic bias towards extreme content, right? 734 00:35:16,239 --> 00:35:19,239 Speaker 1: There's this interesting element of this story where the 735 00:35:19,280 --> 00:35:24,320 Speaker 1: algorithmic bias towards extreme content makes people behave, or incentivizes 736 00:35:24,360 --> 00:35:27,720 Speaker 1: people to behave, in more extreme ways in real life. 737 00:35:28,000 --> 00:35:30,680 Speaker 1: There is a British OnlyFans creator who you mentioned 738 00:35:30,680 --> 00:35:33,920 Speaker 1: in your story called Lily Phillips, who ran a stunt 739 00:35:33,920 --> 00:35:35,880 Speaker 1: where she slept, I think, with one hundred men in 740 00:35:35,920 --> 00:35:38,240 Speaker 1: twenty-four hours, and it kind of broke the internet 741 00:35:38,280 --> 00:35:41,280 Speaker 1: a few months ago. And then another sort of YouTube 742 00:35:41,280 --> 00:35:44,160 Speaker 1: creator made a kind of interview piece with her where 743 00:35:44,480 --> 00:35:47,000 Speaker 1: she broke down, which kind of broke the internet a 744 00:35:47,000 --> 00:35:49,160 Speaker 1: second time. Can you tell us a bit about how 745 00:35:49,160 --> 00:35:50,640 Speaker 1: it intersects with this story?
746 00:35:51,080 --> 00:35:52,680 Speaker 2: Yeah, I mean, we have to remember this is a 747 00:35:52,960 --> 00:35:56,040 Speaker 2: business, right? And with OnlyFans, with Twitch, 748 00:35:56,080 --> 00:35:59,160 Speaker 2: with a lot of social media, these people are competing 749 00:35:59,480 --> 00:36:02,239 Speaker 2: for the currency of the Internet, which is attention. Right? 750 00:36:02,560 --> 00:36:05,840 Speaker 2: The only way to get attention versus all of the 751 00:36:06,000 --> 00:36:09,399 Speaker 2: thousands of other people you're competing with is to raise 752 00:36:09,440 --> 00:36:12,279 Speaker 2: the bar, to do something crazier, to do something that 753 00:36:12,320 --> 00:36:15,160 Speaker 2: people can't look away from. I mean, you see it 754 00:36:15,160 --> 00:36:18,800 Speaker 2: on X, formerly Twitter, right, where people will say crazy, 755 00:36:18,920 --> 00:36:23,080 Speaker 2: outlandish conspiracy theories, and they want to outdo the next 756 00:36:23,360 --> 00:36:27,319 Speaker 2: right-wing influencer by saying something crazier, because they know 757 00:36:27,440 --> 00:36:29,520 Speaker 2: that it doesn't matter if what they're saying is right, 758 00:36:29,680 --> 00:36:32,560 Speaker 2: it matters that it pisses people off and engages people, 759 00:36:32,719 --> 00:36:35,319 Speaker 2: gets people to watch them. What I find kind of 760 00:36:35,400 --> 00:36:39,280 Speaker 2: unusual about Twitch is that, unlike with TikTok, where people 761 00:36:39,320 --> 00:36:42,920 Speaker 2: really want, like, the most just gut-punching five-second 762 00:36:43,000 --> 00:36:46,120 Speaker 2: video that they can make, Twitch really relishes being 763 00:36:46,200 --> 00:36:48,920 Speaker 2: this long-form thing that people can watch for six hours. 764 00:36:49,080 --> 00:36:52,320 Speaker 2: So there's a lot of, like, banal content on Twitch, 765 00:36:52,640 --> 00:36:55,120 Speaker 2: where you wonder, like, why would somebody watch somebody playing 766 00:36:55,200 --> 00:37:00,600 Speaker 2: video games or playing chess, or studying, or coding a 767 00:37:00,600 --> 00:37:03,000 Speaker 2: computer program, all of which is on Twitch. But then 768 00:37:03,040 --> 00:37:04,799 Speaker 2: you see people like Emily, where it's kind of the 769 00:37:04,840 --> 00:37:07,600 Speaker 2: mix of just a banal, kind of normal life, but 770 00:37:07,719 --> 00:37:10,080 Speaker 2: also you want to watch to see: is she going 771 00:37:10,120 --> 00:37:12,880 Speaker 2: to finally quit? Is she going to reach her breaking point? 772 00:37:13,000 --> 00:37:16,879 Speaker 2: She's gone three years. Like, is she going to, you know, 773 00:37:17,080 --> 00:37:20,120 Speaker 2: just freak out and run away? So there's that kind 774 00:37:20,160 --> 00:37:23,360 Speaker 2: of mix of wanting to see somebody in their normal element, 775 00:37:23,440 --> 00:37:26,759 Speaker 2: but also kind of expecting something bad is going to 776 00:37:26,800 --> 00:37:29,120 Speaker 2: happen from this crazy challenge she's putting herself through. 777 00:37:29,520 --> 00:37:31,800 Speaker 4: I mean, she's talked about having a kind of end point, 778 00:37:31,800 --> 00:37:33,600 Speaker 4: like she's not just doing this to do this. She 779 00:37:33,640 --> 00:37:35,600 Speaker 4: wants to do it so that she can have a house 780 00:37:36,080 --> 00:37:38,239 Speaker 4: and be married by the time she's thirty years old.
781 00:37:38,280 --> 00:37:41,600 Speaker 4: So do you, now having interviewed her, feel like there's 782 00:37:41,600 --> 00:37:45,560 Speaker 4: an endpoint for her? And is it looking like she's 783 00:37:45,560 --> 00:37:47,759 Speaker 4: on the trajectory to have the things that she wants? 784 00:37:48,080 --> 00:37:52,080 Speaker 2: I think with Emily, she shifts her end goal out 785 00:37:52,320 --> 00:37:54,520 Speaker 2: all the time, so it's hard to know when she 786 00:37:54,560 --> 00:37:56,839 Speaker 2: will ever reach an endpoint. And she even says now 787 00:37:56,920 --> 00:38:00,279 Speaker 2: she doesn't see any reason to quit, yet she kind of 788 00:38:00,320 --> 00:38:03,320 Speaker 2: hears these clocks in her head of, I want to 789 00:38:03,320 --> 00:38:05,760 Speaker 2: buy a house by thirty, I want to get married, 790 00:38:05,800 --> 00:38:09,400 Speaker 2: you know, all these kind of basic societal impulses 791 00:38:09,440 --> 00:38:11,879 Speaker 2: that a lot of us hear. But you know, it's 792 00:38:11,880 --> 00:38:13,799 Speaker 2: hard for her to think of, like, what a life 793 00:38:13,800 --> 00:38:16,359 Speaker 2: would be like not on camera. I think it would 794 00:38:16,360 --> 00:38:20,120 Speaker 2: be a big culture shock when, or if, it ever happens. 795 00:38:20,520 --> 00:38:22,359 Speaker 2: When you're doing something day in and day out for 796 00:38:22,400 --> 00:38:24,279 Speaker 2: three years, to be able to break away from that 797 00:38:24,320 --> 00:38:26,840 Speaker 2: will be really challenging. And you know, I tried to 798 00:38:26,880 --> 00:38:29,120 Speaker 2: press her on this point, because she said she wants 799 00:38:29,160 --> 00:38:31,799 Speaker 2: to be married in a couple of years, but she 800 00:38:31,880 --> 00:38:34,160 Speaker 2: still thinks she might be streaming in five years. And 801 00:38:34,200 --> 00:38:37,280 Speaker 2: I said, well, are you going to be, like, pregnant 802 00:38:37,280 --> 00:38:40,360 Speaker 2: and streaming? Are you going to be, like, streaming your wedding? Like, 803 00:38:40,400 --> 00:38:43,160 Speaker 2: how does this work? And so I think she's still 804 00:38:43,160 --> 00:38:45,839 Speaker 2: trying to figure out what that'll look like, and if 805 00:38:45,880 --> 00:38:47,440 Speaker 2: it will work, or if she'll have to kind of 806 00:38:47,800 --> 00:38:50,480 Speaker 2: bend on either the life goals she has or kind 807 00:38:50,480 --> 00:38:51,680 Speaker 2: of these Twitch goals. 808 00:39:01,239 --> 00:39:02,799 Speaker 3: Drew, thank you so much for your time. 809 00:39:03,120 --> 00:39:03,680 Speaker 1: Thank you, Drew. 810 00:39:03,800 --> 00:39:04,799 Speaker 2: Yeah, thanks for having me. 811 00:39:25,040 --> 00:39:27,759 Speaker 3: That's it for this week for Tech Stuff. I'm Kara Price. 812 00:39:27,520 --> 00:39:30,520 Speaker 1: And I'm Oz Woloshyn. This episode was produced by Eliza 813 00:39:30,560 --> 00:39:33,840 Speaker 1: Dennis and Victoria Dominguez. It was executive produced by me, 814 00:39:34,200 --> 00:39:38,160 Speaker 1: Kara Price, and Kate Osborne for Kaleidoscope, and Katrina Norvell 815 00:39:38,239 --> 00:39:44,560 Speaker 1: for iHeart Podcasts. The engineer is Beheath Fraser, and Jack 816 00:39:44,640 --> 00:39:48,080 Speaker 1: Insley mixed this episode. Kyle Murdoch wrote our theme song.
817 00:39:48,200 --> 00:39:50,879 Speaker 4: Join us next Wednesday for Tech Stuff: The Story, when 818 00:39:50,920 --> 00:39:53,280 Speaker 4: we will share an in-depth conversation with game designer 819 00:39:53,360 --> 00:39:57,279 Speaker 4: Chaim Gingold about every millennial's favorite game, SimCity. 820 00:39:57,560 --> 00:40:00,360 Speaker 1: Please rate, review, and reach out to us at tech 821 00:40:00,360 --> 00:40:02,839 Speaker 1: stuff podcast at gmail dot com. We want to hear 822 00:40:02,880 --> 00:40:03,239 Speaker 1: from you.