Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm Oz Woloshyn.
Speaker 2: And I'm Cara Price.
Speaker 1: A handful of stories stood out to us this week, and they have one thing in common: China. First, why Chinese regulators are investigating a deal between Meta and an AI company based in Singapore. And why everyone on TikTok is drinking hot water, eating dim sum, and coming out as Chinese.
Speaker 2: Then a few other things that caught our eye this week, including a type of factory work where humans are folding laundry for hours while humanoid robots watch. Then, TikTok wants more of your attention, this time in a new app hosting microdramas like Love at First Bite. And finally, Silicon Valley's new favorite hire is a cracked engineer. We'll explain what that means.
Speaker 1: All of that on The Week in Tech. It's Friday, January twenty-third. Hello, Cara.
Speaker 2: Hello, Ozzie.
Speaker 1: How much time do you spend thinking about animal intelligence?
Speaker 2: I mean, I spend enough time thinking about animals.
Speaker 1: Yeah?
Speaker 2: Yeah, not animal intelligence.
Speaker 1: You're getting a dog?
Speaker 2: Well, I had... I was fostering a dog. It got adopted. Not by you, no, unfortunately. Sad. Yeah, I'm sad.
Speaker 1: Well, you can always get a...
Speaker 2: I can always get it. I can always get a robo dog when they come. I'll talk to Boston Dynamics about it.
Speaker 1: I got more interested than I had previously been in animal intelligence because I was actually in the Bwindi Impenetrable Forest, which is on the border of Congo, Rwanda, and Uganda, and I got to see the mountain gorillas, and it was absolutely...
Speaker 2: Did you feel a kinship?
Speaker 1: You can't not. I saw gorillas and chimpanzees, and I mean, it's absolutely stunning. Really. So I've been more interested in animal intelligence in the last few weeks, which is probably why a story about a cow caught my eye this week. Have you... are you familiar with Veronica the cow?
Speaker 2: No, I've never heard of... okay.
Speaker 1: So basically, what people are fascinated with in terms of the great apes is their ability to use tools. Chimpanzees are more closely related to us than gorillas are, and they also use tools. In particular, they use multi-purpose tools, so they can use a stick to open a termite's nest, to get honey, and to beat each other.
Speaker 2: So they're using a fork, basically, for everything.
Speaker 1: Yes, exactly, one that's extendable. Yeah. So this week it was revealed that a cow used a tool. Really. Veronica. And in fact, what made it particularly interesting was that, just like a chimpanzee, she actually used a dual-use tool. So Veronica the cow was documented picking up a broomstick with her tongue, and then she used the bristle end of the broom to scratch her back while holding it in her mouth, and she used the flat side with no bristles, the more gentle side, to scratch her...
Speaker 2: Udder. Really, because the udders are more sensitive.
Speaker 1: Exactly right.
Speaker 2: That's incredible.
Speaker 1: You know, we think we're so kind of far along in terms of our observations of the universe, but hiding in plain sight was the fact that cows can use tools. And the BBC said that the perceived lack of intelligence of cows may say more about a lack of observation than about the animals themselves.
Speaker 2: Interesting, huh.
Speaker 1: I wanted to bring you the story because we talk so much about artificial intelligence. I was at a panel at a conference last week called DLD in Munich, and this guy called Stuart Russell, who is like a very famous AI researcher and pioneer, was being interviewed, and he kept insisting: look, before we talk about artificial intelligence, I need to give you my definition of intelligence, which is the ability to achieve goals in the world. And he was saying it's very scary, because AI has its own goals and is achieving them.
But you know, a cow can do it too.
Speaker 2: A cow can do it too. It's so interesting to me. Is this the first time, the...
Speaker 1: The first time a cow has ever been seen using a tool. And it raised really serious questions, I think, about how smart cows are, and, I mean, obviously, about what we do to them.
Speaker 2: Yeah.
Speaker 1: Absolutely. So, I promised this episode is going to be about China, and we were wondering how to pivot out of this story. I was going to say, our super producer Eliza found a transition.
Speaker 2: Tell me.
Speaker 1: So, Wired released a special issue of the magazine about the twenty-three ways you're already living in the Chinese century. Number three was "You'll be drinking Franken milk."
Speaker 2: So she found it, the Franken milk connection.
Speaker 1: So there's a super cow in China which has been cloned, and it produces forty thousand pounds of milk per year, which is almost double the output of American cows. I mean, I was pretty shocked.
Speaker 2: This is a type of cow?
Speaker 1: It's a special type of cow that's been cloned and has double the output. So we're kind of drinking potentially genetically modified, cloned cow milk from China.
Speaker 2: According to Wired, what are some of the other tells for sort of how we're living in the Chinese century?
Speaker 1: Well, one of the big ones, of course, is the rise of the Chinese battery industry and the electric vehicle industry. I hadn't realized until I read the piece in Wired that China sells ten times more electric vehicles globally every year than US companies. So it's just an order of magnitude bigger, the car industry there. The other one which I liked was "Your next coworker is a two-legged Chinese robot," which we'll be discussing later this episode.
Speaker 2: Very frightening.
Speaker 1: Two other of my favorites were "Your precious woo-woo crystals, the product of a small-town Chinese venture."
You're a crystal queen.
Speaker 2: Huge crystal queen.
Speaker 1: So the detail from this piece is that when Mao Zedong died, he had a crystal coffin, a see-through crystal coffin.
Speaker 2: Good for him. That's cool.
Speaker 1: The area of China where that crystal comes from is now the area that produces all of the kind of crystals for the wellness and healing industry.
Speaker 2: We should go. That would be such a... that would be great fun. Well, they'll fly us to China.
Speaker 1: And then the final one, which I liked, was "The toy you want most in the world is still a Labubu."
Speaker 2: My friend and business partner texted me, "I loved my flight attendant this morning. She was so cute. She had a Labubu in her bag."
Speaker 1: Yeah, it's interesting. Wired pointed out that Labubu is basically the first China-generated IP craze to sweep the world.
Speaker 2: It's true. I think that that's true.
Speaker 1: Is there a Labubu movie?
Speaker 2: It's happening. Yeah, it's in development, as they say. There's one that I can add to the list, actually, which is that everyone on TikTok is in their Chinese era.
Speaker 1: Wow. I love an era.
Speaker 2: Everything's in an era.
Speaker 1: Everything's an era. But before we get there, I have a China business story that really got me thinking about this kind of dizzying shadow conflict between the US and China in the world of tech. I want to take you back to January twenty twenty five: the DeepSeek freakout. Okay, you remember that, the...
Speaker 2: Oh yeah, of course, of course. Yes, this was last year, when we started the show, exactly.
Speaker 1: So there was this Chinese AI model called DeepSeek, and all...
Speaker 2: ...of a sudden, everyone's like, OpenAI is...
Speaker 1: It's done. The US markets, yeah, crashed, etcetera, etcetera, before they recovered spectacularly. Shortly after, though, another tech product was released in China called Manus AI.
Manus AI was a little bit more of a storm in a teacup. People were like, oh my god, it's happening again, DeepSeek freakout V2. It didn't spook the US markets in the same way, but it was a very impressive AI agent, capable of independently building websites and doing basic coding, kind of ahead of the agentic era of AI which emerged later in the year in the US. So today's story is about...
Speaker 2: Manus. Okay, tell me about Manus.
Speaker 1: Meta actually agreed to acquire it last month, reportedly in the two to three billion dollar range. When the deal was first announced, there were rumors that US regulators might block the deal, but as it happens, Chinese regulators may have stolen a march.
Speaker 2: So is this surprising? Like, the US government has meddled with Chinese companies before.
Speaker 1: That's right, but now the boot is on the other foot, and China are trying to block the sale of this company to the US.
Speaker 2: To the US. Okay.
Speaker 1: And you're right, obviously. I mean, you know, whether it's export bans on the most advanced Nvidia chips, or this kind of forced quasi-sale of TikTok's US operations, or tariffs, the US has been very much on the front foot in terms of finding ways to, you know, constrain or punish the Chinese tech industry. But what's kind of interesting now is that China is investigating whether this acquisition violates its laws on technology exports.
Speaker 2: Is Manus a Chinese company?
Speaker 1: Well, that's why I wanted to share this story. Okay, this is where it gets really fascinating. Manus, according to Manus, is not a Chinese company.
Speaker 2: According to Manus.
Speaker 1: It's based in Singapore. According to China, it is a Chinese company.
Speaker 2: Interesting.
Speaker 1: So hence they can block a sale. This is sort of like the Greenland of the moment, very much so. Possession is...
Speaker 2: Who's going to win?
Speaker 1: ...nine tenths of the law.
Well, I'm pretty sure China will win, honestly, but it remains...
Speaker 2: Because China has more power?
Speaker 1: Well, I mean, there's this phenomenon known as Singapore washing, where, like, Singapore is an independent country and an independent jurisdiction, and, like, a useful place for Western companies to do business with Chinese companies. But ultimately, Singapore is closer to China than Venezuela is to Washington, and so the likelihood is that the Singaporean regulators will probably not resist the Chinese regulators. But I thought it was pretty interesting, because Manus last year actually relocated to Singapore, and China basically said, no, like, you were founded in China by Chinese citizens. Yeah, you can go to another jurisdiction, that's totally fine, but when the rubber meets the road, we're going to continue to treat you as Chinese. And the reason I was fascinated by this story is that somebody at the conference said to me, this could be kind of a watershed moment, because previously Chinese entrepreneurs believed they could found a company in China, relocate it to Singapore, sell it to Facebook, and ride off into the sunset. And China are basically saying, no, no, no, you can't. So what this person said to me was, maybe one outcome, one possibility here, is that Chinese entrepreneurs will leave China before they even found their businesses, because if they ever want to be able to leave, they've got to start the business basically as far away from China as possible.
Speaker 2: So this is not so much interesting to you because of what Manus is. This is interesting to you because it's kind of a harbinger for how Chinese companies might be founded in the future.
Speaker 1: Yeah, I mean, it's an interesting sort of story about a potential, you know, brain drain moment. And, you know, here's China saying Singapore may be an independent sovereign country and an independent jurisdiction, but if one of our folks or companies goes there, we're not going to respect that
it's a different jurisdiction. So it's an interesting combination of tech and geopolitics. And also, I was, you know, attracted to this story because of this idea of, like, China fights back. And another story that really caught my eye last week was that Chinese customs officials recently blocked an inbound shipment of Nvidia H200s.
Speaker 2: Into China?
Speaker 1: They've blocked a shipment of Nvidia chips into China. Why? Well, we don't know. I was quite shocked as well. And obviously the background to this is Jensen Huang and Nvidia were at great pains last year to persuade the Trump administration to allow them to export these chips to China, and they won the battle and were allowed to export them. And then all of a sudden, China are apparently turning around and saying, no, no, no, you can't bring this into the country. We don't know why, but what we do know is this could be hugely significant, because suppliers who make the parts required to go into Nvidia chips have already started throttling production, because they're so worried that demand may go down so much without China.
Speaker 2: But I sort of don't understand: if you're China, why are you blocking these chips?
Speaker 1: Well, we don't know if it's China, or if it's, like, customs officials at a certain point. So, like, the FT, who broke this story, asked for comment. The Chinese government didn't respond. It's unclear whether this was a one-off, you know, customs thing, or if in fact this is a new Chinese policy. But there's a few kind of explanations that have been floated. One is that this will encourage domestic chip companies in China to develop their own chips faster. Another is that it's some kind of bargaining tactic, because of course it could be a huge financial hit to Nvidia, which represents something like more than five percent of the total market cap of the US stock market, essentially, if they can't export their chips to China.
We also know there's a lot of smuggling of the highest quality chips into China, which is also a lucrative business. So it's a sort of, you know, international cat and mouse game. And it's possible that, yeah, China are in any case having their chips and getting to eat them too.
Speaker 2: I can't... No. Also, for you, chips are very different.
Speaker 1: That's true. Chips are fries, and for you they're crisps. Crisps.
Speaker 2: So today we've been talking a lot about China. And it's very funny, because as online as I am, my friends are actually more online. So a lot of things that come to me, come to me because people are like, you should post about this on your dumpster fire Instagram. And a trend that I got asked a lot about this week, and that a friend of mine was, like, very hot on, was the following. But I'm gonna let my girl take it away.
Speaker 3: If you are someone that likes hot pot, dim sum, Sichuan cuisine, biang biang noodles, do you know that there's actually a really interesting reason why? The reason is because you are Chinese. You didn't know it, but you are Chinese, and so for that reason, every single time you have a craving for, like, dumplings or something, it's because you are Chinese. Your body just longs for that wonton, you know.
Speaker 1: I think I am Chinese.
Speaker 2: You know what the best line is? And I don't think she knew this: like, you know, like, wanton lust? Yes, longing for wonton. Great little play.
Speaker 1: Well, you obviously are Chinese. I must be hungrier than you, because I wasn't... I had missed the play on words. I was just thinking about how much I would like to be eating one of those things right now.
Speaker 2: So this video, just for a little context, is by a Chinese American creator named Sherry. Her handle is spelled S-H-E-R-R-Y-X-I-I-R-U-I-I. If you want to follow her, I'm not going to say it again. You had to be listening.
And she has millions of views, and now Americans on TikTok are posting videos saying things like, you met me at a very Chinese time in my life.
Speaker 1: But this is very funny and very, very weird.
Speaker 2: It's very strange. It does toe the cultural appropriation line. You know, there is something mildly unsettling about seeing comments on TikTok like, first day of being a Chinese baddie.
Speaker 1: I'm all for it. I'm looking forward to living in a very Chinese time in my life. Yeah, but what exactly does this mean? A Chinese baddie obviously likes Chinese food.
Speaker 2: They drink hot water, they're eating bao, you know, they're wearing slippers indoors.
Speaker 1: Labubu. They have Labubu.
Speaker 2: Yeah, I mean, it's basically anything that is prioritizing Chinese culture as their predominant culture. I think one of the things, again, that's important is that Chinese influencers are bestowing this onto American TikTok users to say, you are doing these things, you are Chinese.
Speaker 4: Hello, my Chinese besties. If you're watching this, you're Chinese. It has occurred to me that a lot of you guys have not come to terms with your newfound Chinese identity. And let me just ask you this. Aren't you scrolling on this Chinese app, probably on a Chinese-made phone, wearing clothes that are made in China, collecting dolls that are from China, and, most importantly, living in a very Chinese time of your life? So yes, you're Chinese. So you better go get that kettle and start drinking hot water, get your rice cooker ready, get your hot pot machine ready, and we will have some baijiu next time we have lunch and dinner.
Speaker 2: We actually talked about this in my interview with the New Yorker writer Kyle Chayka.
Speaker 5: The most brain-rotted thing to me, the most, like, where are we, what the fuck is happening, is, like, the whole merger of Chinese Internet and US Internet.
Speaker 2: And he told me his favorite piece of brain rot was actually the meme that preceded this trend.
Speaker 5: It's Donald Trump given an AI Mandarin voice, and he's singing these dramatic, like, Maoist Chinese songs. And that and many other memes have built up to this joke of, like, you have met me at a very Chinese time in my life.
Speaker 1: It is interesting. Obviously, Asian culture kind of trending in American pop culture is not a new phenomenon, but historically it's been focused around kind of Japanese and Korean trends, like anime and K-dramas and K-pop and Korean makeup and all that kind of stuff. And now, at the same time as the story I brought about this kind of deep geopolitical tension and competition and shadow conflict, kind of the other side of the coin is this very sort of odd and fascinating cultural merger on TikTok.
Speaker 2: It's true, and, you know, there's a few theories as to why this might be happening. One is that last year, when TikTok was going to be banned in the US, many users flocked to a Chinese app called RedNote.
Speaker 1: I remember that, in protest.
Speaker 2: Yeah, we actually reported on it. Which led to this interesting bridging of cultures, and many users actually started learning Mandarin, which is crazy. Others have said that it could be because Gen Z is acutely feeling the loss of the American dream, and that the economy is unstable and there are fears of global climate change, so they're, like, looking outside of their home country for hope and romanticizing life in China, which is a country that is, you know, a communist country, you know, maybe not one that we would necessarily feel so free in if we were actually there, but that seems to have a very sort of robust infrastructure and a thriving clean energy industry. There's actually a Bloomberg article which cited one creator who put it, I think, very, very well, and in the article this creator said, they're engaging with a hyper-real China, a symbolic version that absorbs everything
Americans fear that they're losing: community structure, competence, limits, cultural continuity, and care for elders.
Speaker 1: Really, really interesting. I mean, it's interesting, we look back now on the collapse of the Soviet Union as though it was kind of inevitable and the Soviet Union was always the loser. But I think there were certainly, you know, in the sixties, a lot of Americans and Brits who were quite admiring of Soviet culture and the way Soviet society was organized and stuff. So when you live in this kind of bipolar world, which we do again now, the thrill and the appeal of the other is very real, as well as the fear. But there was a guy we had on the podcast last year called Dan Wang, who wrote a book called Breakneck, and he had lived in China and moved back to the US, and he basically said, look, you know, if you live in China and you look out of your window once a year, in the space of three years you might see a whole new city emerge. If you live in New York or San Francisco, the best you can hope for in that time period is a new coffee shop to open. So I do think there's something: as much as people fear the authoritarian aspects of Chinese culture, those things that you mentioned in the quote, like community structure, competence, cultural continuity, I mean, these are things which are understandably attractive. Yeah, would you say that you're in a very Chinese time of your life now?
Speaker 2: Not at all. I'm not trying. But I think what's really interesting is, the minute I heard about this trend, I was like, should I be boiling apples?
Speaker 1: Have you tried any of these things?
Speaker 2: I haven't, but...
Speaker 1: I'll ask you next week, please. After the break: more TikTok, TikTok microdramas. Humans take on boring jobs training robots so that eventually we won't have to work. And there's a new dream employee in Silicon Valley. Stay with us. And we're back. Cara,
do you remember a couple of weeks ago I made my prediction for the year?
Speaker 2: Yeah, you said it was the year of the robot.
Speaker 1: The year of the robot. So I sketched, in that episode a couple of weeks ago, a hypothetical nightmare scenario in which troves of real humans do activities so that robots can learn from them. It turns out that it wasn't a hypothetical, it's real. And Rest of World, which is a tech publication, talked to a worker at one of these robot training centers, called Kim, which is a pseudonym, and he talked about what he does. He puts on a virtual reality headset and an exoskeleton on his arms, and then he pretends to open the door of a microwave hundreds and hundreds of times per day.
Speaker 2: Oh my god, to train a robot to do that? That's so monotonous.
Speaker 1: Other days he mimics folding clothes, and sometimes he even stacks wooden blocks.
Speaker 2: What, so humans have to do household tasks to train robots to then replace them doing household tasks?
Speaker 1: That's right, and the humanoid robot next to him watches and basically develops its fine motor skills by doing so.
Speaker 2: So how many coworkers does Kim have? Like, is this like a major facility?
Speaker 1: It's a big facility. And local governments in China are working with the central government to open two dozen robot training centers. In fact, the goal is to have more than forty of these in operation, which are servicing more than one hundred and fifty humanoid robot companies currently active in China.
Speaker 2: And so what is the end goal? Like, why do these robots need to learn how to open a microwave?
Speaker 1: I mean, I think it's two things, right. One is, humanoid robotics is a new space race. So if China wins... I mean, how many hype videos have you seen of, like, humanoid robotic...
Speaker 2: Olympics, doing crazy things?
Speaker 1: If China gets there first on mass-produced, effective humanoid robotics, it's a very clear narrative that they have essentially won the future, or won this chapter of the future. So I think it's partly that, but also it's more practical, right? These robots ultimately are designed to be used in assembly line jobs, or to care for elders, or essentially in industries where there are not enough workers as populations age and birth rates decline.
Speaker 2: Interesting. I just had this visual of being in a hotel in the future and just passing by robots pushing the maintenance carts.
Speaker 1: Lost in Translation.
Speaker 2: So bizarre, but it's so right on our doorstep, don't you think?
Speaker 1: I think so. I mean, I think it does remain to be seen, like, how practical these robots will actually be, because, you know, they fall over, and they have power shortages, and they're extremely expensive. But China essentially birthed the electric vehicle industry by investing in it way ahead of it actually being functional, and created demand, central demand, even though the market didn't have demand. And now, you know, BYD is bigger than Tesla, and China produces ten times more EVs than the US. So, you know, it is possible for governments with enough engineering talent, money, and determination to bring technologies into the world. Not for nothing, I think Elon is very focused on the Optimus robot. Like, sometimes this sounds like nonsense, but also, like, don't bet against Elon and the Chinese government in terms of what the next phase of the future will look like. And on China, well, I mean, they're making the same bets.
Speaker 2: Yes, yes, I just think China is going to pull it off.
Speaker 1: Yeah, it looks... it looks like they're definitely in the lead.
Speaker 2: So we're talking a lot about the AI arms race, the robotics race. Another... in my industry, I think the thing that we talk about a lot is the attention economy, right? And now, I actually just read this article today about passive TV viewing, and how TV now has to become something that's interesting even when you're not watching it, because people are now watching TV in the background.
Speaker 1: Interesting. But hasn't that always been the case, that daytime TV is kind of like a thing that's on while you're doing your...
Speaker 2: Prestige TV is predicated on people really being hyper-focused, and now, because of streaming, and because of autoplay, and because of reruns being on streaming platforms, people are just passively watching it.
Speaker 1: So there's no distinction between, like, this is TV that requires full attention, this is background. All TV is background...
Speaker 2: TV, increasingly, yes, because of second screening, which is like, even if you're watching Succession, are you watching Succession? So there's a new US-China race for attention, yes, which comes in the form of microdramas.
Speaker 1: And this is making very short, like one-to-two-minute, very addictive, serialized content.
Speaker 2: Yes.
Speaker 1: And I've read so much about this trend.
Speaker 2: Yeah, but you've only seen it in a newspaper. Yeah.
Speaker 1: Is it, like... is it a real thing?
Speaker 2: Huge. Huge.
Speaker 1: Do you watch them?
Speaker 2: I looked into it for the... I mean, you talk about other things that I've looked into specifically for the show. Yes, this... like, am I the market for it?
Speaker 1: No, it's for younger, younger folks.
Speaker 2: It's not even for younger folks. It's just, I'm not personally the market for it. I think microdramas are essentially, like, they're soap operas. They're soap operas, and I'm not a soap opera viewer. Like, I don't care, I don't want, I don't need programs.
Speaker 1: And what's the China angle?
Speaker 2: Okay. So a few years ago, China actually developed short-form serialized content called verticals, or microdramas. The ones that I've been very focused on are created by an American company called ReelShort, and I actually went into detail on these with a guest at the end of last year, if you want to listen to our episode. The way that they had actually stumbled on some of these was through TikTok. Right now, TikTok has created a whole app devoted to these verticals, and it's called Pine Drama, and it's free to download, and for now the videos run ad-free. But I'm one hundred percent... well, I'm not one hundred percent sure, but I'm pretty sure that this will change in the future.
Speaker 1: So this is like a subsidiary app of TikTok?
Speaker 2: Yes, it's just... it's a separate app.
Speaker 1: Yes, just for the dramas. Yes. Are they being produced by, like, regular TikTok creators, or are they all being produced by, like, studios?
Speaker 2: Of a fashion. I believe that these are studio-created. These are not creator-driven. That would not be considered a vertical. These videos are only about a minute long. They draw on themes of, like, things that we see in romance novels. There are, you know, things like age gap...
Speaker 1: Romances, May-December lust...
Speaker 2: May-December, step-sibling. You know what I roll my eyes at: step-sibling romance. Clueless was a step-sibling romance.
Speaker 1: Sure, it was one of my favorite movies.
Speaker 2: It's one of the best movies ever. Work romances, billionaires, and werewolves. The success of these apps, I think, comes from their addictive nature and the fact that they are pay-to-play, so if you want to see more, you have to kind of buy more tokens to watch more stories.
Speaker 1: This is... I mean, you know what the history of the novel was, exactly. This is serial, right. This is published week by week.
Keep it... keep enough suspense for long enough to keep people buying the next installment until you're out of gas. I mean, nothing's new.
Speaker 2: Correct.
Speaker 1: So the final story I have for you today isn't based in China, but it has some China connections. Okay. So we've talked about nine-nine-six a lot.
Speaker 2: We have. It's so much better than six-seven.
Speaker 1: Working nine a.m. to nine p.m., six days a week. And we've talked about how that lifestyle has infiltrated Silicon Valley. Yeah, it's now been upgraded. According to The Information, tech startups are now looking relentlessly for, quote, cracked engineers.
Speaker 2: Like cracked, like gone nuts, gone crazy, on crack?
Speaker 1: I mean, basically, it's somebody who is just so obsessed with their work that someone doing nine-nine-six looks like a slob, or lazy.
Speaker 2: Someone who's really using peptides.
Speaker 1: Yeah, exactly. One startup CEO described cracked engineers as, quote, so competent in what they do, they don't need to care about anything else. They don't care about politics or wearing smart clothes or washing regularly.
Speaker 2: At first, I thought you were saying smart clothes, like technologically smart clothes. I was like, well, that's interesting, that an engineer would wear smart clothes. Okay. So they're basically very unwell people, or extremely well people.
Speaker 1: They're so plugged into the mainframe they don't need to think about any of the creature comforts. One person actually quoted in the piece was trying to learn to swim as a hobby, but they talked about how they deleted that because it was distracting them too much from work. Indeed, they deleted it. So these cracked engineers are typically in their twenties, but they may have been working for up to a decade by that point. They're fast, they're independent, they don't want to manage others, and they have no sense of work-life balance.
Speaker 2: And why is this... I mean, how is this different than, say, your Zuckerberg when he's twenty-one?
Speaker 1: I think that Zuckerberg, when... I mean, if you remember the movie The Social Network, he was jamming his keyboard in his bedroom while the Winklevii were, like, rowing and...
Speaker 2: Living their lives.
Speaker 1: Yeah, yeah. So he was an outlier, where this is now mainstream culture in Silicon Valley, I'd say. And I thought that The Information put it really well, which is, basically, there's a new AI gold rush happening right now. Like, a generation of people think they can become billionaires by the time they're thirty, and so in this kind of gold rush opportunity, people will do whatever it takes to get an edge. So you're seeing this kind of FOMO-driven engineering culture emerge in Silicon Valley. And from the company point of view, you know, hiring these cracked engineers is seen as a way to speed up productivity and kind of create value without building a long-term sustainable work environment. There was actually a tech recruiter quoted in the piece saying, people like to put band-aids on their business, and I think a cracked engineer is a band-aid for a lot of founders who have no idea what they're building.
Speaker 2: What they're building. Yeah. You're really saying this is a hustle culture thing.
Speaker 1: But, you know, this dynamic of FOMO and the new gold rush and stuff, it's obviously what is driving, like, this discourse, this US-China discourse, that, like, one side will win this moment and that will be it until the end of time, like the victor in this bipolar conflict. I think that's the kind of macro story we talked about, you know, at the top of this episode. But in the microcosm, it plays out with, like, people aspiring to be cracked.
Speaker 2: I mean, it also seems like a youth culture thing. It doesn't seem like something that you can do well into your forties. Can you be cracked in your forties?
Speaker 1: And also, apparently, it's quite looked down upon to describe yourself as cracked, like it's a bit...
Speaker 2: It's only something you can be described as.
Speaker 1: You want... you want people to talk about you as cracked, but you can't give it to yourself. Beautiful.
Speaker 2: That's it for this week for Tech Stuff. I'm Cara Price.
Speaker 1: And I'm Oz Woloshyn. This episode was produced by Eliza Dennis and Melissa Slaughter.
Speaker 2: It was executive produced by me, Cara Price, Julian Nutter, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts.
Speaker 1: The engineer is Bihid Fraser, and Jack Insley mixed this episode. Kyle Murdoch wrote our theme song.
Speaker 2: Please rate, review, and reach out to us at tech stuff podcast at gmail dot com. We want to hear from you.