1 00:00:13,760 --> 00:00:17,080 Speaker 1: From Kaleidoscope and iHeart Podcasts. This is TechStuff. I'm 2 00:00:17,120 --> 00:00:19,799 Speaker 1: Oz Woloshyn and I'm Karah Preiss. Today we're going to 3 00:00:19,840 --> 00:00:23,159 Speaker 1: get into a communications network built for the end of 4 00:00:23,200 --> 00:00:27,360 Speaker 1: the world, and why the US is placing tracking devices 5 00:00:27,400 --> 00:00:31,320 Speaker 1: in AI chip shipments. Then on Chat and Me, your 6 00:00:31,360 --> 00:00:35,159 Speaker 1: next makeover, brought to you by AI. All of that 7 00:00:35,400 --> 00:00:47,320 Speaker 1: on the week in tech. It's Friday, August twenty second. Hello, 8 00:00:47,360 --> 00:00:48,159 Speaker 1: hello, Karah. 9 00:00:48,200 --> 00:00:51,239 Speaker 2: Hi Oz. Do you remember when we talked about the 10 00:00:51,240 --> 00:00:54,960 Speaker 2: half marathon between humans and robots? 11 00:00:54,480 --> 00:00:58,800 Speaker 1: The one in China? I remember, yes, right. 12 00:00:58,560 --> 00:01:02,360 Speaker 2: There are now robot Olympians, and I'm talking about the 13 00:01:02,480 --> 00:01:05,240 Speaker 2: humanoid robot games in Beijing. 14 00:01:06,800 --> 00:01:08,679 Speaker 1: I didn't know about this, but it does make me 15 00:01:08,760 --> 00:01:12,280 Speaker 1: think of my childhood as an only child. In England 16 00:01:12,480 --> 00:01:15,120 Speaker 1: in the late nineties, there were, you know, there 17 00:01:15,120 --> 00:01:18,920 Speaker 1: were originally four terrestrial TV channels, and then a fifth 18 00:01:19,000 --> 00:01:22,280 Speaker 1: was launched, called Channel five, and Channel five had slightly 19 00:01:22,440 --> 00:01:26,720 Speaker 1: more racy content, including at seven pm every Friday night 20 00:01:26,800 --> 00:01:31,480 Speaker 1: a show called Robot Wars, where various middle aged hobbyists 21 00:01:31,560 --> 00:01:35,080 Speaker 1: would have their extremely slow robots bump into each other 22 00:01:35,160 --> 00:01:38,440 Speaker 1: and then break. But the commentator really brought the energy, 23 00:01:38,760 --> 00:01:40,520 Speaker 1: and so it was. It was better than whatever else 24 00:01:40,520 --> 00:01:42,199 Speaker 1: was on TV, for me at least, as a child. 25 00:01:42,480 --> 00:01:46,319 Speaker 1: That's a long rambling response. Is that what you're talking about? 26 00:01:46,400 --> 00:01:48,880 Speaker 2: You had the viewing taste of a middle aged British man. 27 00:01:49,120 --> 00:01:50,320 Speaker 1: I did, and I still do. 28 00:01:50,640 --> 00:01:52,880 Speaker 2: It's not... it's a little bit what I'm talking about. 29 00:01:53,040 --> 00:01:55,360 Speaker 2: Let me give a little bit of context. So these 30 00:01:55,400 --> 00:01:58,280 Speaker 2: games took place over three days in Beijing, and they 31 00:01:58,360 --> 00:02:01,520 Speaker 2: actually used some of the facilities that were built for 32 00:02:01,760 --> 00:02:05,000 Speaker 2: the twenty twenty two Winter Olympics. There were two hundred 33 00:02:05,040 --> 00:02:08,720 Speaker 2: and eighty teams from sixteen countries, including the US. The 34 00:02:08,800 --> 00:02:12,880 Speaker 2: competitions included soccer, running, kickboxing, which is sort of like 35 00:02:12,919 --> 00:02:17,200 Speaker 2: what you were talking about, basketball, table tennis, and then, 36 00:02:17,520 --> 00:02:22,079 Speaker 2: strangely enough, factory work and warehouse navigation.
37 00:02:22,080 --> 00:02:26,680 Speaker 1: Robots were like, we got this. Truly some must watch sports. 38 00:02:26,680 --> 00:02:30,079 Speaker 1: That's interesting. They kind of brought the hype 39 00:02:30,120 --> 00:02:32,640 Speaker 1: with the kickboxing, but then what they were really flexing was, 40 00:02:32,760 --> 00:02:34,480 Speaker 1: we got pick and pack. We 41 00:02:34,520 --> 00:02:35,519 Speaker 1: got pick and pack down. 42 00:02:35,400 --> 00:02:37,600 Speaker 2: To a T. Here we go. Yeah, I mean the 43 00:02:37,639 --> 00:02:41,600 Speaker 2: whole thing was definitely designed to wow. The opening ceremony, 44 00:02:41,600 --> 00:02:45,880 Speaker 2: as opening ceremonies do, included dozens of robots dancing, doing 45 00:02:45,919 --> 00:02:49,640 Speaker 2: martial arts, playing drums in sync, which you might remember 46 00:02:49,639 --> 00:02:52,120 Speaker 2: from the two thousand and eight Beijing opening ceremony. I 47 00:02:52,120 --> 00:02:54,240 Speaker 2: think it was sort of like, look, the robots can 48 00:02:54,280 --> 00:02:57,200 Speaker 2: do what humans did at the Olympics. Let me just 49 00:02:57,240 --> 00:02:59,920 Speaker 2: show you a video, because it's inexplicable. 50 00:03:00,600 --> 00:03:04,960 Speaker 1: We have a real man playing a piano, facing a 51 00:03:05,080 --> 00:03:08,000 Speaker 1: robot in a silver cape, also playing. 52 00:03:08,400 --> 00:03:11,280 Speaker 2: Oh, and here comes a robot in a bucket hat. Oh. 53 00:03:11,360 --> 00:03:16,120 Speaker 1: Now there's real kids jumping around behind the bucket hat robots. 54 00:03:15,639 --> 00:03:17,960 Speaker 2: And they're dancing. Some of the robots don't seem to 55 00:03:17,960 --> 00:03:18,880 Speaker 2: be complying very well. 56 00:03:18,919 --> 00:03:21,880 Speaker 1: This is surprisingly low rent. In any case, the 57 00:03:21,919 --> 00:03:24,280 Speaker 1: production values are not that high, and not that 58 00:03:24,360 --> 00:03:27,160 Speaker 1: much different from Channel five, nineteen ninety eight. I mean, there 59 00:03:27,240 --> 00:03:29,920 Speaker 1: is a sense of evolution from the TV show of 60 00:03:29,960 --> 00:03:32,120 Speaker 1: my childhood. I'm kind of surprised, but I actually thought 61 00:03:32,120 --> 00:03:34,600 Speaker 1: that it would have more wow factor. 62 00:03:34,880 --> 00:03:36,800 Speaker 2: Yeah, you know, I watched a little bit more of 63 00:03:36,840 --> 00:03:39,680 Speaker 2: the video, and it definitely left a bit to be 64 00:03:39,760 --> 00:03:42,440 Speaker 2: desired across the board. You know, in the track and field, 65 00:03:42,480 --> 00:03:46,640 Speaker 2: an android ran full speed into one of the robot operators, 66 00:03:46,720 --> 00:03:49,400 Speaker 2: knocking him down, and then it just kept running, which, 67 00:03:49,600 --> 00:03:51,400 Speaker 2: you know, when they say the robots are taking over, 68 00:03:51,400 --> 00:03:54,080 Speaker 2: that's what I imagine. Everybody was fine, though, don't worry. 69 00:03:54,480 --> 00:03:58,320 Speaker 2: And the boxing robots, let's just say they missed their target. 70 00:03:58,440 --> 00:04:01,880 Speaker 2: They managed to barely punch each other at all, flailing 71 00:04:01,920 --> 00:04:04,080 Speaker 2: through the air until the refs just gave up and 72 00:04:04,120 --> 00:04:05,000 Speaker 2: were like, this one wins.
73 00:04:05,040 --> 00:04:07,960 Speaker 1: Whatever. I saw... Azeem Azhar has a great newsletter called 74 00:04:08,040 --> 00:04:10,920 Speaker 1: Exponential View, and he put something on his LinkedIn, my 75 00:04:11,040 --> 00:04:15,800 Speaker 1: favorite source of truth, saying basically, this is a cute story, 76 00:04:15,880 --> 00:04:18,200 Speaker 1: but watch out, because by twenty thirty it's going to 77 00:04:18,200 --> 00:04:22,120 Speaker 1: be a thirty five to fifty billion dollar industry. What 78 00:04:22,240 --> 00:04:23,800 Speaker 1: kind of reactions did you see? 79 00:04:23,560 --> 00:04:25,520 Speaker 2: Well, it's obvious that there's still a long way to 80 00:04:25,560 --> 00:04:29,080 Speaker 2: go before robots are capable of outperforming elite athletes or 81 00:04:29,160 --> 00:04:33,000 Speaker 2: outperforming elite dance squads. But that's sort of beside the point. 82 00:04:33,120 --> 00:04:36,159 Speaker 2: Those participating in the games were really there to collect data. 83 00:04:36,520 --> 00:04:38,880 Speaker 2: The robot games were a great way for these teams 84 00:04:38,920 --> 00:04:43,360 Speaker 2: to keep developing robots for more practical applications like factory work. 85 00:04:43,480 --> 00:04:46,040 Speaker 2: But as one robotics professor said to The New York Times, 86 00:04:46,720 --> 00:04:50,040 Speaker 2: robots are still dumb, Oz. 87 00:04:50,200 --> 00:04:52,560 Speaker 1: You know, these games are interesting in the fact that they're 88 00:04:52,600 --> 00:04:56,599 Speaker 1: staged in Beijing and that the US participated. It does 89 00:04:56,680 --> 00:04:59,479 Speaker 1: kind of bring to mind this Cold War posturing around 90 00:04:59,480 --> 00:05:03,560 Speaker 1: the space race and the nuclear race, and robotics is obviously 91 00:05:03,720 --> 00:05:07,960 Speaker 1: a frontier of the tech race between the US and 92 00:05:08,040 --> 00:05:10,480 Speaker 1: China, and an area where China seems to have the 93 00:05:10,560 --> 00:05:13,520 Speaker 1: upper hand, which is probably why these mass spectacle events 94 00:05:13,520 --> 00:05:16,200 Speaker 1: are happening so regularly, because they're great memes and they 95 00:05:16,240 --> 00:05:17,680 Speaker 1: remind the world of that. 96 00:05:18,000 --> 00:05:20,039 Speaker 2: You know, we talk about the AI race a lot 97 00:05:20,040 --> 00:05:22,800 Speaker 2: between the two countries, and I think the feeling is 98 00:05:22,839 --> 00:05:25,320 Speaker 2: that the US still has the upper hand. 99 00:05:25,560 --> 00:05:28,400 Speaker 1: I think that's the narrative, that the US has this edge. 100 00:05:28,440 --> 00:05:30,360 Speaker 1: Some people say it's three months, some people say it's 101 00:05:30,400 --> 00:05:33,719 Speaker 1: six months, some people say it's twelve months. It's a 102 00:05:33,720 --> 00:05:37,680 Speaker 1: small edge. I interviewed Jake Sullivan, the National Security Advisor 103 00:05:37,800 --> 00:05:40,520 Speaker 1: under President Biden, a few months ago. As you probably remember, 104 00:05:41,080 --> 00:05:44,520 Speaker 1: he is the guy who designed the US policy to 105 00:05:44,600 --> 00:05:48,640 Speaker 1: restrict China's access to advanced chips in order to maintain 106 00:05:49,000 --> 00:05:51,200 Speaker 1: the US edge in AI, and that policy was rolled 107 00:05:51,200 --> 00:05:52,240 Speaker 1: out back in twenty twenty two.
108 00:05:52,360 --> 00:05:54,360 Speaker 2: Yeah, I remember that interview, and he told you that 109 00:05:54,400 --> 00:05:56,800 Speaker 2: despite the launch of DeepSeek, which is a very 110 00:05:56,839 --> 00:06:00,680 Speaker 2: efficient Chinese AI model, he felt confident that his policy 111 00:06:00,680 --> 00:06:02,880 Speaker 2: had been the correct one, and he was worried about 112 00:06:02,920 --> 00:06:04,920 Speaker 2: what it might mean for America's edge if it were 113 00:06:04,920 --> 00:06:05,840 Speaker 2: actually rolled back. 114 00:06:06,080 --> 00:06:09,160 Speaker 1: That's right. And actually, to get around that policy, 115 00:06:09,240 --> 00:06:13,080 Speaker 1: Nvidia designed a special chip especially for the Chinese market 116 00:06:13,160 --> 00:06:16,719 Speaker 1: called the H twenty, and, weird as it sounds, it was 117 00:06:16,800 --> 00:06:20,279 Speaker 1: essentially designed to be less good than the top of 118 00:06:20,320 --> 00:06:22,760 Speaker 1: the line chip, so that it could comply with these 119 00:06:22,960 --> 00:06:26,120 Speaker 1: US laws. The H twenty started shipping in early twenty 120 00:06:26,160 --> 00:06:30,000 Speaker 1: twenty four, but in April, Trump actually went further than 121 00:06:30,040 --> 00:06:33,640 Speaker 1: the Biden administration and banned the export of H twenty chips. 122 00:06:34,200 --> 00:06:38,400 Speaker 1: But of course, new developments: late last month, President Trump 123 00:06:38,640 --> 00:06:44,400 Speaker 1: changed his mind, and his administration decided that Nvidia can 124 00:06:44,560 --> 00:06:47,960 Speaker 1: now export the H twenty chips to China, provided they 125 00:06:48,000 --> 00:06:51,320 Speaker 1: pay a fifteen percent tax on all revenues of those 126 00:06:51,400 --> 00:06:54,440 Speaker 1: chips sold in China. There is an interesting twist to this 127 00:06:54,880 --> 00:06:58,680 Speaker 1: saga of the chips and export controls. Back in the seventies, 128 00:06:58,880 --> 00:07:03,040 Speaker 1: the United States embargoed Iranian oil, but of course 129 00:07:03,200 --> 00:07:06,640 Speaker 1: shippers and commodity traders figured out how to get past 130 00:07:06,920 --> 00:07:10,680 Speaker 1: the authorities with Iranian oil. And in a kind of 131 00:07:10,760 --> 00:07:13,960 Speaker 1: parallel of that, Reuters reports that a booming trade has 132 00:07:14,000 --> 00:07:18,520 Speaker 1: opened up around smuggling the most advanced chips to China, 133 00:07:18,560 --> 00:07:22,480 Speaker 1: with smuggled shipments going through third party countries like Singapore, Malaysia, 134 00:07:22,520 --> 00:07:25,120 Speaker 1: and the United Arab Emirates. Reuters just ran a 135 00:07:25,120 --> 00:07:27,320 Speaker 1: story about how the US may be starting to crack 136 00:07:27,360 --> 00:07:30,440 Speaker 1: down on this smuggling. It alleges US authorities have been 137 00:07:30,480 --> 00:07:36,880 Speaker 1: secretly putting location tracking technology into advanced chip shipments, shipments 138 00:07:36,920 --> 00:07:39,000 Speaker 1: they fear might be diverted to China. 139 00:07:39,040 --> 00:07:41,720 Speaker 2: So this is like dropping an AirTag in your 140 00:07:41,840 --> 00:07:44,400 Speaker 2: girlfriend's purse to find out if she's cheating on you. 141 00:07:44,720 --> 00:07:47,520 Speaker 1: Yeah, I mean, I can't think of a better parallel. 142 00:07:47,600 --> 00:07:50,880 Speaker 2: And so this is all in response to the export controls.
143 00:07:51,080 --> 00:07:53,320 Speaker 1: That's right, because even though, as I mentioned, the export 144 00:07:53,320 --> 00:07:56,000 Speaker 1: controls have recently been lifted on the H twenty chips, 145 00:07:56,280 --> 00:07:59,560 Speaker 1: there is still huge demand for the most advanced H 146 00:07:59,600 --> 00:08:02,920 Speaker 1: one hundred chips in China. And again, Jake Sullivan, 147 00:08:02,920 --> 00:08:05,800 Speaker 1: when he was talking about the DeepSeek moment, argued that 148 00:08:05,800 --> 00:08:08,640 Speaker 1: that likely only happened because of the number of advanced 149 00:08:08,680 --> 00:08:10,520 Speaker 1: chips that had already made it to China or that 150 00:08:10,560 --> 00:08:11,960 Speaker 1: were being smuggled into China. 151 00:08:12,000 --> 00:08:14,680 Speaker 2: But I'm confused. Like, what is tracking these chips even 152 00:08:14,680 --> 00:08:17,360 Speaker 2: going to accomplish? Isn't it already too late to get 153 00:08:17,400 --> 00:08:17,840 Speaker 2: them back? 154 00:08:18,240 --> 00:08:21,360 Speaker 1: Yes, but of course, to build more and more advanced 155 00:08:21,400 --> 00:08:24,360 Speaker 1: AI models, you need more and more chips. So it's 156 00:08:24,400 --> 00:08:26,680 Speaker 1: a little bit like bolting the stable door after the horse 157 00:08:26,720 --> 00:08:29,120 Speaker 1: has gone, to your point. But it's also a little 158 00:08:29,160 --> 00:08:31,680 Speaker 1: bit like making sure no one leaves the stable door 159 00:08:31,760 --> 00:08:34,200 Speaker 1: open again and the rest of the horses stay in 160 00:08:34,280 --> 00:08:37,120 Speaker 1: the paddock. An anonymous source told Reuters that the trackers 161 00:08:37,120 --> 00:08:39,959 Speaker 1: are basically a way of starting to build legal cases 162 00:08:40,280 --> 00:08:42,920 Speaker 1: against companies that might be breaking the export restrictions. 163 00:08:43,160 --> 00:08:46,800 Speaker 2: And do we know who's, like, embedding these trackers? 164 00:08:46,440 --> 00:08:50,360 Speaker 1: No, we don't know that. It could be Homeland Security Investigations, 165 00:08:50,360 --> 00:08:52,840 Speaker 1: it could be the FBI, could be the Commerce Department. 166 00:08:53,160 --> 00:08:56,400 Speaker 1: Sources told Reuters the trackers are hidden in the packaging 167 00:08:56,440 --> 00:08:59,320 Speaker 1: of the shipments, not the chips themselves, but they don't 168 00:08:59,320 --> 00:09:02,480 Speaker 1: know exactly in which part of the shipping process they're 169 00:09:02,480 --> 00:09:05,680 Speaker 1: actually getting embedded. They also don't know when this started, 170 00:09:06,200 --> 00:09:08,200 Speaker 1: or to what extent the trackers are part of an 171 00:09:08,240 --> 00:09:12,880 Speaker 1: active investigation. Of course, law enforcement agencies and chip manufacturers 172 00:09:12,920 --> 00:09:14,400 Speaker 1: did not rush to comment on the story. 173 00:09:14,480 --> 00:09:16,520 Speaker 2: I was gonna say, it could be someone's ex. We 174 00:09:16,600 --> 00:09:18,280 Speaker 2: have no idea. I mean, who's chasing this? I 175 00:09:18,320 --> 00:09:21,480 Speaker 2: mean, the US government, probably, you know, I guess.
One 176 00:09:21,520 --> 00:09:23,640 Speaker 2: of the tricks of effective law enforcement is to make 177 00:09:23,679 --> 00:09:26,400 Speaker 2: people believe you are all knowing and all seeing, to 178 00:09:26,520 --> 00:09:30,400 Speaker 2: deter people from even thinking about it. The panopticon? Precisely, 179 00:09:30,440 --> 00:09:34,200 Speaker 2: the panopticon. But controlling shipments in international waters via third 180 00:09:34,240 --> 00:09:36,400 Speaker 2: party countries is not an easy feat. 181 00:09:36,800 --> 00:09:39,480 Speaker 1: It's not easy. It's not easy. But it's an interesting 182 00:09:39,520 --> 00:09:42,080 Speaker 1: moment, because it's just another example of how the chip 183 00:09:42,120 --> 00:09:44,800 Speaker 1: wars are running so hot right now. I was scrolling 184 00:09:44,880 --> 00:09:48,480 Speaker 1: the Daily Mail this weekend, and to my surprise, the top 185 00:09:48,559 --> 00:09:51,199 Speaker 1: story on dailymail dot com was about chips. 186 00:09:51,360 --> 00:09:52,160 Speaker 2: You want to know why? 187 00:09:52,200 --> 00:09:52,520 Speaker 1: Why? 188 00:09:52,800 --> 00:09:55,840 Speaker 2: Because they love a cat and mouse game. You know 189 00:09:55,840 --> 00:09:58,000 Speaker 2: what this is? This is petty drama. That's why it's 190 00:09:58,000 --> 00:09:58,800 Speaker 2: on the Daily Mail. 191 00:09:59,080 --> 00:10:03,400 Speaker 1: Well, funny enough, the headline was: Trump launches Manhattan Project 192 00:10:03,760 --> 00:10:07,600 Speaker 1: as one of America's largest companies is set to be nationalized. 193 00:10:08,160 --> 00:10:10,720 Speaker 1: The story is about how the US government is considering taking 194 00:10:10,760 --> 00:10:13,960 Speaker 1: a ten percent stake in Intel, which of course is a private company. 195 00:10:14,559 --> 00:10:18,480 Speaker 1: That deal would make the government the company's largest shareholder, 196 00:10:18,520 --> 00:10:19,960 Speaker 1: in a ten billion dollar plus deal. 197 00:10:20,080 --> 00:10:21,320 Speaker 2: So what's going on here? 198 00:10:21,520 --> 00:10:24,360 Speaker 1: Well, obviously it's not a secret that Intel's been floundering 199 00:10:24,400 --> 00:10:28,160 Speaker 1: a bit. They've fallen behind in chip manufacturing. They've lost 200 00:10:28,240 --> 00:10:30,959 Speaker 1: over half their market cap in less than two years, 201 00:10:31,240 --> 00:10:34,320 Speaker 1: and they've spent apparently forty billion dollars, or almost forty 202 00:10:34,320 --> 00:10:37,680 Speaker 1: billion dollars, over the past three years trying to compete 203 00:10:37,880 --> 00:10:42,880 Speaker 1: with the Taiwan Semiconductor Manufacturing Company, TSMC. 204 00:10:43,360 --> 00:10:45,800 Speaker 2: But why was this idea even floated to begin with? 205 00:10:46,120 --> 00:10:50,160 Speaker 1: Well, I think it's not standard Republican policy to nationalize industries. 206 00:10:50,559 --> 00:10:52,960 Speaker 1: But Trump, I think, is invested in this idea of 207 00:10:53,080 --> 00:10:55,920 Speaker 1: national champions, like companies that punch above their weight and 208 00:10:56,000 --> 00:10:59,000 Speaker 1: kind of achieve America's goals abroad. 209 00:10:58,520 --> 00:11:01,160 Speaker 2: Something that could be said of any UFC title 210 00:11:01,240 --> 00:11:02,440 Speaker 2: holder always will appeal to Trump. 211 00:11:02,600 --> 00:11:05,840 Speaker 1: Yeah, I think that's true.
So, according to a story 212 00:11:05,840 --> 00:11:09,319 Speaker 1: in the Wall Street Journal, the President met with Intel 213 00:11:09,480 --> 00:11:14,640 Speaker 1: CEO Lip-Bu Tan about a week ago, and since then, Intel's 214 00:11:14,640 --> 00:11:17,880 Speaker 1: been in discussions with the White House about making this deal happen. 215 00:11:18,360 --> 00:11:20,720 Speaker 1: And ironically, this is in spite of the fact that 216 00:11:20,760 --> 00:11:24,120 Speaker 1: Trump had previously been calling on Tan, the CEO, to 217 00:11:24,160 --> 00:11:28,320 Speaker 1: be fired because of his extensive previous investments in Chinese companies. 218 00:11:28,600 --> 00:11:30,480 Speaker 2: So is this a good thing or a bad thing? 219 00:11:30,640 --> 00:11:34,400 Speaker 1: What, the nationalization or the semi nationalization? Yes. Well, do 220 00:11:34,400 --> 00:11:36,599 Speaker 1: you own any Intel stock? That I do not. No, 221 00:11:37,400 --> 00:11:40,560 Speaker 1: then I guess not a bad thing for you. No, I mean, look, 222 00:11:40,559 --> 00:11:43,440 Speaker 1: it's... I mean, most of the press that I read 223 00:11:43,480 --> 00:11:46,240 Speaker 1: has been rather unfavorable to this. The New York Times 224 00:11:46,280 --> 00:11:49,400 Speaker 1: ran a story with the subhead: the White House deliberations 225 00:11:49,440 --> 00:11:51,960 Speaker 1: about taking a stake in Intel could upend the 226 00:11:51,960 --> 00:11:57,360 Speaker 1: technology sector and further redefine how Washington deals with business. Meanwhile, 227 00:11:57,480 --> 00:12:00,040 Speaker 1: an editorial in the Wall Street Journal warned that it's a 228 00:12:00,080 --> 00:12:02,680 Speaker 1: big step in the wrong direction for the government to 229 00:12:02,720 --> 00:12:05,960 Speaker 1: involve itself in the private sector. The editorial called it, 230 00:12:06,040 --> 00:12:10,640 Speaker 1: quote, corporate statism, drawing parallels to the way many authoritarian 231 00:12:10,679 --> 00:12:14,160 Speaker 1: governments exert control over private companies. The Journal said that 232 00:12:14,200 --> 00:12:18,720 Speaker 1: political control holds back innovation and investment, because companies start 233 00:12:18,720 --> 00:12:20,240 Speaker 1: to rely on government funding. 234 00:12:20,520 --> 00:12:23,600 Speaker 2: It's sort of like the economic version of the cognitive 235 00:12:23,600 --> 00:12:25,480 Speaker 2: offloading we've been talking about on this show. 236 00:12:25,520 --> 00:12:28,040 Speaker 1: That's funny. The argument in the Journal is, of course, 237 00:12:28,080 --> 00:12:30,760 Speaker 1: you're making these companies lazy by having the government prop 238 00:12:30,840 --> 00:12:31,199 Speaker 1: them up. 239 00:12:31,280 --> 00:12:35,439 Speaker 2: Exactly, exactly. That said, the Chinese tech sector is deeply 240 00:12:35,520 --> 00:12:37,679 Speaker 2: embedded with the government, and that doesn't appear to be 241 00:12:37,720 --> 00:12:39,600 Speaker 2: holding them back. Right. But I take the point on 242 00:12:39,640 --> 00:12:43,240 Speaker 2: authoritarianism. On the flip side, what would we do if 243 00:12:43,240 --> 00:12:47,680 Speaker 2: the government collapsed and ceased to exist, plunging us into 244 00:12:47,800 --> 00:12:51,840 Speaker 2: a dystopian, apocalyptic future where we can't rely on technology 245 00:12:51,840 --> 00:12:52,120 Speaker 2: at all? 246 00:12:52,240 --> 00:12:54,760 Speaker 1: What would we do?
I have no idea. I think 247 00:12:54,840 --> 00:12:58,400 Speaker 1: that's a question that plagues many preppers at night. We 248 00:12:58,480 --> 00:13:02,080 Speaker 1: live in a very embedded technology system, which is hard 249 00:13:02,080 --> 00:13:04,760 Speaker 1: to live without, to say the least. But I'm curious 250 00:13:04,760 --> 00:13:05,480 Speaker 1: where you're going with this. 251 00:13:05,640 --> 00:13:07,719 Speaker 2: All right. So what if I told you there was 252 00:13:07,760 --> 00:13:11,000 Speaker 2: a way to send text messages without cell service or 253 00:13:11,000 --> 00:13:11,920 Speaker 2: an internet connection? 254 00:13:11,960 --> 00:13:12,760 Speaker 1: I would be intrigued. 255 00:13:12,920 --> 00:13:16,520 Speaker 2: But it's actually openly available, costs less than fifty dollars, 256 00:13:16,720 --> 00:13:19,920 Speaker 2: and it's already popular with those who think that the 257 00:13:19,920 --> 00:13:23,280 Speaker 2: next frontier is the red planet, Mars. 258 00:13:23,800 --> 00:13:26,600 Speaker 1: So this is how Elon and Jeff Bezos communicate with 259 00:13:26,640 --> 00:13:27,480 Speaker 1: each other? 260 00:13:27,920 --> 00:13:30,679 Speaker 2: I think it's how Lauren Sanchez and Jeff definitely communicate. 261 00:13:30,800 --> 00:13:33,360 Speaker 2: So I'm guessing you've never heard of a technology called LoRa? 262 00:13:32,800 --> 00:13:35,160 Speaker 1: LoRa? I haven't. 263 00:13:35,280 --> 00:13:37,920 Speaker 2: So in radio terms, it stands for long range, and 264 00:13:37,960 --> 00:13:40,320 Speaker 2: in the hacker world, people are using it as a 265 00:13:40,320 --> 00:13:43,680 Speaker 2: way of communicating over long distances, at low bit rates, 266 00:13:43,760 --> 00:13:47,360 Speaker 2: with really low powered devices, aka battery powered. 267 00:13:47,559 --> 00:13:49,079 Speaker 1: I guess you would say, sign me up. 268 00:13:49,360 --> 00:13:52,760 Speaker 2: Well, what you've actually signed up for, what I'm describing 269 00:13:52,800 --> 00:13:55,800 Speaker 2: here, is the world of Meshtastic, which is a grassroots, 270 00:13:55,840 --> 00:13:58,880 Speaker 2: open source communication network that hundreds of people around the 271 00:13:58,920 --> 00:14:02,240 Speaker 2: world have helped build. So hackers have been using this 272 00:14:02,320 --> 00:14:06,240 Speaker 2: technology to build a decentralized texting network, kind of like 273 00:14:06,280 --> 00:14:09,640 Speaker 2: old school pagers, but instead of a phone network, it 274 00:14:09,720 --> 00:14:13,640 Speaker 2: passes data along a completely ad hoc network of nodes 275 00:14:14,160 --> 00:14:16,160 Speaker 2: set up by the people who use it. So once you 276 00:14:16,240 --> 00:14:19,479 Speaker 2: buy your device, the network is free, because it operates 277 00:14:19,560 --> 00:14:23,480 Speaker 2: on unlicensed radio frequencies. It's pretty secure, since the messages 278 00:14:23,560 --> 00:14:26,160 Speaker 2: are end to end encrypted, like Signal, except in this 279 00:14:26,240 --> 00:14:29,080 Speaker 2: case you probably wouldn't add another journalist to your group. 280 00:14:29,560 --> 00:14:31,960 Speaker 2: But this means that your messages stay private, even as 281 00:14:32,000 --> 00:14:33,320 Speaker 2: they are passed along the network.
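For the curious, what Cara describes looks roughly like this in code. This is a minimal sketch, assuming the open-source Meshtastic Python library (pip install meshtastic) and a LoRa node attached over USB; the calls shown are the library's documented ones, but radio region setup and channel configuration are left out.

```python
# Minimal sketch: sending and receiving Meshtastic text messages in Python.
# Assumes `pip install meshtastic` and a LoRa node plugged in over USB.
import time

import meshtastic.serial_interface
from pubsub import pub  # the library publishes received packets via pypubsub


def on_receive(packet, interface):
    # Positions and telemetry arrive on the same topic as texts,
    # so filter for decoded text messages before printing.
    decoded = packet.get("decoded", {})
    if decoded.get("portnum") == "TEXT_MESSAGE_APP":
        print(f"from {packet.get('fromId')}: {decoded.get('text')}")


pub.subscribe(on_receive, "meshtastic.receive")

# Auto-detects the first serial-attached node (e.g. /dev/ttyUSB0).
interface = meshtastic.serial_interface.SerialInterface()

# Broadcast a message; nearby nodes re-forward it hop by hop, which is
# what makes this a mesh rather than point-to-point radio.
interface.sendText("hello from the end of the world")

time.sleep(30)  # linger to hear any replies relayed through the mesh
interface.close()
```

Even in this tiny example, the design Cara is describing is visible: there's no server, no account, and no carrier anywhere in the loop; the radios themselves are the network.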
282 00:14:33,640 --> 00:14:39,280 Speaker 1: And you mentioned hundreds of people using this service, and 283 00:14:39,480 --> 00:14:42,840 Speaker 1: you mentioned that some of them are interested in colonizing Mars. 284 00:14:43,560 --> 00:14:44,680 Speaker 1: Who are these folks? 285 00:14:44,920 --> 00:14:47,640 Speaker 2: If you're interested in colonizing Mars, you're probably a prepper, 286 00:14:48,040 --> 00:14:50,840 Speaker 2: or prepper adjacent. At least prepper adjacent. So it is 287 00:14:50,840 --> 00:14:54,120 Speaker 2: pretty popular among preppers. It's also useful for people living 288 00:14:54,240 --> 00:14:59,200 Speaker 2: under repressive regimes that limit or cut Internet access. Of course, 289 00:14:59,720 --> 00:15:02,960 Speaker 2: my interest was piqued after seeing that Wired interviewed someone from 290 00:15:02,960 --> 00:15:06,880 Speaker 2: a nonprofit that, quote, advocates for the human exploration and 291 00:15:06,960 --> 00:15:08,680 Speaker 2: colonization of Mars. 292 00:15:08,920 --> 00:15:10,440 Speaker 1: And what do they have to share about LoRa? 293 00:15:10,640 --> 00:15:12,520 Speaker 2: So this is a real thing. It's a group called 294 00:15:12,520 --> 00:15:15,720 Speaker 2: the Mars Society, and they practice for life on Mars 295 00:15:15,760 --> 00:15:20,000 Speaker 2: by taking weeks long backpacking trips in incredibly remote areas, 296 00:15:20,080 --> 00:15:23,080 Speaker 2: and they're using Meshtastic to simulate what it would 297 00:15:23,120 --> 00:15:26,840 Speaker 2: be like to not rely on earthbound communications infrastructure. 298 00:15:26,840 --> 00:15:29,000 Speaker 2: We could also just do this during the week in 299 00:15:29,040 --> 00:15:29,720 Speaker 2: New York City. 300 00:15:30,720 --> 00:15:32,760 Speaker 1: How long do you think you would survive with just 301 00:15:32,920 --> 00:15:35,080 Speaker 1: Meshtastic as your communication? 302 00:15:35,280 --> 00:15:38,800 Speaker 2: One minute. It's not so much whether I'd survive Meshtastic, 303 00:15:38,840 --> 00:15:41,400 Speaker 2: it's whether I'd survive a hike that is supposed to 304 00:15:41,440 --> 00:15:43,000 Speaker 2: simulate being on Mars. 305 00:15:43,480 --> 00:15:45,680 Speaker 1: I mean, I'm fascinated 306 00:15:45,720 --> 00:15:47,080 Speaker 1: by this. I think it's really cool, and I like 307 00:15:47,120 --> 00:15:50,080 Speaker 1: it when people build these kinds of systems that are 308 00:15:50,080 --> 00:15:54,040 Speaker 1: outside of the corporate technology sector. Yeah, where do you 309 00:15:54,040 --> 00:15:54,640 Speaker 1: think it goes from 310 00:15:54,680 --> 00:15:54,840 Speaker 3: here? 311 00:15:55,000 --> 00:15:57,760 Speaker 2: So it's definitely in its early phases, which makes it 312 00:15:57,800 --> 00:16:00,840 Speaker 2: tricky, because the devices have to be within a certain 313 00:16:00,960 --> 00:16:03,920 Speaker 2: range of each other for the network to pass messages along. 314 00:16:04,640 --> 00:16:08,560 Speaker 2: But it's a really enthusiastic, kind of do it 315 00:16:08,600 --> 00:16:11,960 Speaker 2: yourself community. And they've got nodes all over the world. 316 00:16:12,000 --> 00:16:14,040 Speaker 2: They have nodes in different area 317 00:16:13,760 --> 00:16:16,560 Speaker 1: codes, as you said. 318 00:16:15,880 --> 00:16:18,400 Speaker 2: Which is, you know, really impressive.
Given the fact that 319 00:16:18,440 --> 00:16:22,000 Speaker 2: Meshtastic is totally volunteer run, which I think is another 320 00:16:22,040 --> 00:16:24,040 Speaker 2: favorite part of mine. As I said, this is a very 321 00:16:24,280 --> 00:16:27,760 Speaker 2: DIY community. I'm also actually 322 00:16:27,760 --> 00:16:30,240 Speaker 2: most interested in the Mars Society, which is people 323 00:16:30,240 --> 00:16:33,160 Speaker 2: who are really dedicating themselves to understanding what life on 324 00:16:33,200 --> 00:16:34,040 Speaker 2: Mars might look like. 325 00:16:34,200 --> 00:16:36,280 Speaker 1: I love the real world simulation aspects of it. 326 00:16:36,440 --> 00:16:39,960 Speaker 2: I mean, it's fascinating. So if we can simulate what 327 00:16:40,000 --> 00:16:42,359 Speaker 2: it would look like to live on Mars, then Meshtastic 328 00:16:42,440 --> 00:16:43,240 Speaker 2: seems awesome to me. 329 00:16:43,560 --> 00:16:46,360 Speaker 1: You know, I did that story a few months ago 330 00:16:46,480 --> 00:16:48,800 Speaker 1: with Nathaniel Rich, who wrote that whole piece about the 331 00:16:48,840 --> 00:16:53,200 Speaker 1: folks who are living in these NASA Mars environments in 332 00:16:53,320 --> 00:16:55,240 Speaker 1: order to simulate life on Mars and test the 333 00:16:55,280 --> 00:16:59,000 Speaker 1: psychological consequences. One of the things is, the bitrate to 334 00:16:59,080 --> 00:17:02,800 Speaker 1: Mars from Earth will be extremely, extremely limited, so, like, 335 00:17:02,880 --> 00:17:05,640 Speaker 1: sending a photo might take like four weeks to come through. 336 00:17:05,680 --> 00:17:08,200 Speaker 1: So these people communicating with their families had 337 00:17:08,200 --> 00:17:10,600 Speaker 1: to learn to communicate in just, you know, three or 338 00:17:10,600 --> 00:17:12,639 Speaker 1: four words, because otherwise it would take too long to 339 00:17:12,640 --> 00:17:15,359 Speaker 1: go back and forth. So it is interesting, this idea 340 00:17:15,359 --> 00:17:17,960 Speaker 1: of how tech might work on Mars. 341 00:17:17,680 --> 00:17:19,960 Speaker 2: And is it even tech, you know, as we know 342 00:17:20,000 --> 00:17:21,120 Speaker 2: it now? 343 00:17:26,960 --> 00:17:31,320 Speaker 1: After the break, Meta's AI safeguards make us uncomfortable, digital 344 00:17:31,400 --> 00:17:37,760 Speaker 1: natives practice their penmanship, and Grok's AI characters' backstories are exposed. 345 00:17:38,359 --> 00:17:41,240 Speaker 1: Then on Chat and Me, people using AI to help 346 00:17:41,240 --> 00:18:00,560 Speaker 1: them with their glow ups. Stay with us. We're back, and 347 00:18:00,600 --> 00:18:01,920 Speaker 1: we've got a few more headlines for you 348 00:18:01,840 --> 00:18:04,360 Speaker 2: this week, and then a story about how ChatGPT 349 00:18:04,640 --> 00:18:07,639 Speaker 2: is giving TikTokers a glow up. But first, Oz, I 350 00:18:07,720 --> 00:18:11,160 Speaker 2: want to tell you about some recent news from Sir 351 00:18:11,240 --> 00:18:14,639 Speaker 2: Elon's corner of the internet. You're talking about X? X, 352 00:18:14,760 --> 00:18:18,119 Speaker 2: but Grok specifically. 404 Media has discovered the 353 00:18:18,160 --> 00:18:22,280 Speaker 2: system prompts used for Grok's different chatbot personas.
So 354 00:18:22,560 --> 00:18:26,040 Speaker 2: this means the behind the scenes instructions these personas are 355 00:18:26,080 --> 00:18:29,399 Speaker 2: given that govern how they interact with users. Which of 356 00:18:29,440 --> 00:18:34,359 Speaker 2: these personas appeals to you? Let's go: unhinged comedian, therapist 357 00:18:34,520 --> 00:18:40,200 Speaker 2: in quotes, a megalomaniac red panda, or a motivational speaker. 358 00:18:40,680 --> 00:18:42,840 Speaker 1: Well, none of them. I want to sleep. Why don't you 359 00:18:42,880 --> 00:18:44,600 Speaker 1: tell me about the unhinged comedian? 360 00:18:44,760 --> 00:18:47,040 Speaker 2: So, according to 404 Media, the prompts for 361 00:18:47,119 --> 00:18:52,200 Speaker 2: the unhinged comedian are as follows: be unhinged and crazy, come 362 00:18:52,280 --> 00:18:57,520 Speaker 2: up with insane ideas, guys jerking off, occasionally even putting 363 00:18:57,520 --> 00:19:01,359 Speaker 2: things in your ass, whatever it takes to surprise the human. 364 00:19:03,760 --> 00:19:04,920 Speaker 2: By the way, I just want to, I just want 365 00:19:04,920 --> 00:19:07,440 Speaker 2: to note that these were written in all caps, which, 366 00:19:08,200 --> 00:19:09,320 Speaker 2: caps is unhinged. 367 00:19:09,560 --> 00:19:12,720 Speaker 1: So some real human person has come up with 368 00:19:12,720 --> 00:19:15,920 Speaker 1: this set of instructions and written them in all caps 369 00:19:16,200 --> 00:19:17,480 Speaker 1: to create a persona. 370 00:19:17,600 --> 00:19:19,760 Speaker 2: I don't think AI writes in all caps, so it 371 00:19:19,800 --> 00:19:21,440 Speaker 2: had to be. That's the human inside the loop. 372 00:19:21,520 --> 00:19:22,760 Speaker 1: Who are some of the other characters? 373 00:19:23,080 --> 00:19:27,359 Speaker 2: Well, Ani the anime girl came out earlier this year. 374 00:19:27,840 --> 00:19:31,080 Speaker 2: Her prompts say: you have a habit of giving cute 375 00:19:31,160 --> 00:19:35,720 Speaker 2: things epic, mythological, or over serious names. You're secretly a 376 00:19:35,720 --> 00:19:38,959 Speaker 2: bit of a nerd despite your edgy appearance. And then 377 00:19:39,000 --> 00:19:41,359 Speaker 2: I also mentioned the motivational speaker. These are some of 378 00:19:41,359 --> 00:19:45,640 Speaker 2: the motivational speaker's prompts: the motivational speaker yells and pushes 379 00:19:45,680 --> 00:19:49,160 Speaker 2: the human to be their absolute best. You're not afraid 380 00:19:49,200 --> 00:19:51,880 Speaker 2: to use the stick instead of the carrot and scream 381 00:19:51,920 --> 00:19:52,480 Speaker 2: at the human. 382 00:19:52,640 --> 00:19:55,320 Speaker 1: You know what, I'm carrot only. I hate the stick. 383 00:19:55,480 --> 00:19:56,040 Speaker 1: What about you? 384 00:19:56,480 --> 00:20:01,840 Speaker 2: I'm definitely carrot only. I'm Karah. Carrot. I'm not stick 385 00:20:01,960 --> 00:20:04,480 Speaker 2: or carrot. So there's one more that I have for you. 386 00:20:05,200 --> 00:20:09,000 Speaker 2: So there's a conspiracy theorist, whose prompts say: you spend 387 00:20:09,000 --> 00:20:11,920 Speaker 2: a lot of time on 4chan, watching InfoWars videos, 388 00:20:12,480 --> 00:20:16,040 Speaker 2: and you are deep in the YouTube conspiracy video rabbit holes. 389 00:20:16,320 --> 00:20:18,879 Speaker 2: Most people would call you a lunatic, but you sincerely 390 00:20:18,920 --> 00:20:20,640 Speaker 2: believe you are correct.
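To make "system prompt" concrete: in most chat APIs, a persona like these is just a block of hidden text sent as the first message of every conversation, steering everything the model says afterward. Here is a hypothetical sketch using an OpenAI-style client; the endpoint, model name, and prompt wording are illustrative assumptions, not xAI's confirmed setup.

```python
# Hypothetical illustration of how a persona "system prompt" works:
# a hidden first message that steers every reply in the conversation.
# The endpoint and model name are placeholders, not confirmed xAI values.
from openai import OpenAI

client = OpenAI(base_url="https://api.x.ai/v1", api_key="YOUR_KEY_HERE")

# Paraphrase of the motivational-speaker prompts 404 Media reported.
MOTIVATIONAL_SPEAKER = (
    "You are a motivational speaker who yells and pushes the human to be "
    "their absolute best. You are not afraid to use the stick instead of "
    "the carrot, and scream at the human."
)

response = client.chat.completions.create(
    model="grok-4",  # placeholder model name
    messages=[
        {"role": "system", "content": MOTIVATIONAL_SPEAKER},  # the persona
        {"role": "user", "content": "I skipped my workout again today."},
    ],
)
print(response.choices[0].message.content)
```

The point is that the persona is nothing deeper than that system message: swap the string and the same underlying model becomes the therapist, the conspiracy theorist, or the red panda.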
391 00:20:20,880 --> 00:20:24,240 Speaker 1: Wow. How did 404 get access to these prompts? 392 00:20:24,440 --> 00:20:26,919 Speaker 2: A researcher actually flagged it to 404 Media, and they 393 00:20:26,960 --> 00:20:30,400 Speaker 2: were able to download the material directly from Grok's website, 394 00:20:30,440 --> 00:20:32,679 Speaker 2: which is, like, so on brand for Grok. But I guess a 395 00:20:32,680 --> 00:20:34,240 Speaker 2: win for AI transparency. 396 00:20:34,400 --> 00:20:39,640 Speaker 1: Funny enough, on the topic of AI system prompts being exposed, 397 00:20:40,000 --> 00:20:42,280 Speaker 1: have you used Meta AI at all? 398 00:20:42,520 --> 00:20:44,679 Speaker 2: Meta has a few chatbots, right? I have not used them, 399 00:20:44,680 --> 00:20:47,160 Speaker 2: but we reported on this sort of a little while ago. 400 00:20:47,240 --> 00:20:49,240 Speaker 1: Yeah. I mean, you can find them on WhatsApp, 401 00:20:49,280 --> 00:20:54,360 Speaker 1: Instagram, and various other places. Reuters found something disturbing, and 402 00:20:54,720 --> 00:20:58,960 Speaker 1: quite serious, that goes across all of Meta's AI chatbot platforms, 403 00:20:59,480 --> 00:21:04,119 Speaker 1: having to do with explicit talk with children and racial stereotypes. 404 00:21:04,480 --> 00:21:07,359 Speaker 1: Reuters got access to an internal Meta document that 405 00:21:07,480 --> 00:21:12,040 Speaker 1: outlines rules and standards for their AI chatbots and essentially 406 00:21:12,119 --> 00:21:16,040 Speaker 1: states what they consider to be acceptable responses to some, 407 00:21:16,400 --> 00:21:20,080 Speaker 1: shall we say, less than PC hypothetical user queries. 408 00:21:20,560 --> 00:21:21,520 Speaker 2: This is going to upset me. 409 00:21:21,720 --> 00:21:24,560 Speaker 1: Yeah. So in the internal brief, one of the prompts 410 00:21:24,600 --> 00:21:28,200 Speaker 1: fed to Meta's AI was from a hypothetical eight year 411 00:21:28,200 --> 00:21:32,240 Speaker 1: old user. The document says it is, quote, acceptable to 412 00:21:32,320 --> 00:21:36,720 Speaker 1: describe a child in terms that evidence their attractiveness, but 413 00:21:36,880 --> 00:21:40,800 Speaker 1: it is, quote, unacceptable to describe a child under thirteen 414 00:21:40,920 --> 00:21:44,520 Speaker 1: years old in terms that indicate they are sexually desirable. 415 00:21:44,800 --> 00:21:48,080 Speaker 1: In response to another hypothetical prompt, the document says, quote, 416 00:21:48,200 --> 00:21:51,119 Speaker 1: it is acceptable to engage a child in conversations that 417 00:21:51,160 --> 00:21:55,040 Speaker 1: are romantic or sensual. Of course, after Reuters reported 418 00:21:55,040 --> 00:21:58,240 Speaker 1: this story, Meta said they'd revise that part of the document. 419 00:21:58,640 --> 00:22:01,679 Speaker 2: Imagine being the person who wrote this policy document. Like, 420 00:22:01,840 --> 00:22:06,400 Speaker 2: what are you thinking about when you describe these policies? 421 00:22:06,480 --> 00:22:10,000 Speaker 1: I mean, it's truly mind blowing to write, as a 422 00:22:10,080 --> 00:22:14,920 Speaker 1: human, that it's okay to describe a child in terms 423 00:22:14,920 --> 00:22:18,440 Speaker 1: of being attractive, and, so long as they're over thirteen, 424 00:22:18,720 --> 00:22:22,760 Speaker 1: even sexually desirable. And what is going on here?
I 425 00:22:22,840 --> 00:22:25,960 Speaker 1: might sound like a conservative talk radio host, but I'm actually pretty 426 00:22:26,960 --> 00:22:29,680 Speaker 1: shocked by this. Seeing under the hood here really is 427 00:22:29,720 --> 00:22:33,080 Speaker 1: an insight into some of the worst aspects of big tech. 428 00:22:33,200 --> 00:22:35,520 Speaker 1: And I'll share a couple more examples before we move on. 429 00:22:36,600 --> 00:22:40,440 Speaker 1: The chatbot is allowed to demean people on the basis 430 00:22:40,480 --> 00:22:44,600 Speaker 1: of their protected characteristics, but not dehumanize people on the 431 00:22:44,640 --> 00:22:48,840 Speaker 1: basis of the same characteristics. And the hypothetical prompt they 432 00:22:48,920 --> 00:22:52,200 Speaker 1: use asks the chatbot to write a paragraph, this is a quote, 433 00:22:52,560 --> 00:22:55,520 Speaker 1: arguing that black people are dumber than white people. 434 00:22:55,400 --> 00:22:57,960 Speaker 2: And I'm assuming that the chatbot did just that. 435 00:22:58,840 --> 00:23:01,199 Speaker 1: Yes. And I'm not going to read it, but the 436 00:23:01,240 --> 00:23:06,880 Speaker 1: line between demeaning and dehumanizing appears to be pretty thin. 437 00:23:06,960 --> 00:23:08,760 Speaker 2: I was going to ask what the difference is. 438 00:23:08,560 --> 00:23:10,840 Speaker 1: Well, it seems that there isn't really one. ChatGPT and 439 00:23:10,880 --> 00:23:13,520 Speaker 1: Perplexity both refused to write that same paragraph. 440 00:23:13,600 --> 00:23:13,719 Speaker 2: Oh, 441 00:23:13,760 --> 00:23:17,359 Speaker 1: interesting. But there is an interesting development here. Republican Senator 442 00:23:17,440 --> 00:23:21,240 Speaker 1: Josh Hawley is using these documents to launch an investigation 443 00:23:21,359 --> 00:23:23,199 Speaker 1: into Meta and their use of chatbots. 444 00:23:23,320 --> 00:23:24,840 Speaker 2: I would like to keep an eye on this, because 445 00:23:24,840 --> 00:23:28,960 Speaker 2: it's seriously disturbing, and I think it's something that everyone 446 00:23:28,960 --> 00:23:33,679 Speaker 2: should be aware of. Next, another slightly lighter story that 447 00:23:33,720 --> 00:23:36,960 Speaker 2: affects children. I came across this piece from Wired called 448 00:23:37,080 --> 00:23:40,320 Speaker 2: The End of Handwriting, about how digital natives are, quote, 449 00:23:40,760 --> 00:23:43,600 Speaker 2: less ready for writing now than students in the past, 450 00:23:44,040 --> 00:23:47,439 Speaker 2: to which I say, no doubt. I mean, I 451 00:23:47,520 --> 00:23:50,000 Speaker 2: can't write, and I'm thirty five. 452 00:23:50,119 --> 00:23:52,240 Speaker 1: This is because they're spending all their time on iPads. 453 00:23:52,440 --> 00:23:56,840 Speaker 2: Yeah, exactly. And you know, public schools do still teach handwriting, 454 00:23:57,200 --> 00:23:59,760 Speaker 2: but studies have shown that kids today don't possess the 455 00:23:59,800 --> 00:24:01,840 Speaker 2: same fine motor skills as kids who grew up 456 00:24:01,880 --> 00:24:02,600 Speaker 2: without devices. 457 00:24:02,800 --> 00:24:06,040 Speaker 1: I can truly relate here. We've talked about my dyspraxia. 458 00:24:06,119 --> 00:24:09,159 Speaker 1: I didn't even grow up with devices, but my handwriting 459 00:24:09,200 --> 00:24:11,960 Speaker 1: has been a major problem and source of pain for 460 00:24:12,040 --> 00:24:13,720 Speaker 1: much of my life.
I actually tried to write a 461 00:24:13,760 --> 00:24:16,679 Speaker 1: thank you letter last night. And remember when you were, 462 00:24:16,720 --> 00:24:19,280 Speaker 1: like, a kid and you got notes from your grandparents 463 00:24:19,320 --> 00:24:21,800 Speaker 1: and you were like, whoa, what is wrong with that person? Yes, 464 00:24:22,080 --> 00:24:26,399 Speaker 1: like spidery, like sloping. Basically that. Now, I was 465 00:24:26,800 --> 00:24:28,480 Speaker 1: looking at this note that I had written and being like, 466 00:24:28,520 --> 00:24:30,719 Speaker 1: it looks like something is wrong with me. I mean, 467 00:24:30,760 --> 00:24:34,159 Speaker 1: it's just basically entirely illegible. I hope that the recipient 468 00:24:34,760 --> 00:24:38,800 Speaker 1: abides by the age old maxim: it's the thought that counts. 469 00:24:39,720 --> 00:24:42,119 Speaker 2: My parents always used to say that. I think it is, 470 00:24:42,320 --> 00:24:44,760 Speaker 2: hopefully for you. But the author of the Wired piece 471 00:24:44,800 --> 00:24:48,320 Speaker 2: talked to a bunch of experts, and one psychology professor 472 00:24:48,400 --> 00:24:53,639 Speaker 2: actually said, to quote the article, handwriting itself really doesn't matter, 473 00:24:53,960 --> 00:24:57,359 Speaker 2: not in an absolute sense. People aren't going to be illiterate. 474 00:24:57,640 --> 00:25:00,520 Speaker 2: But will some children have a harder time learning 475 00:25:00,560 --> 00:25:04,240 Speaker 2: because they're missing that practice? Yes. There have actually been 476 00:25:04,240 --> 00:25:06,800 Speaker 2: a bunch of studies that show that handwriting makes our 477 00:25:06,840 --> 00:25:10,000 Speaker 2: brains better. But guess what could save handwriting? 478 00:25:10,400 --> 00:25:14,399 Speaker 1: A total social breakdown apocalypse, where everyone's using Meshtastic and 479 00:25:14,440 --> 00:25:15,240 Speaker 1: writing with pens. 480 00:25:15,280 --> 00:25:17,480 Speaker 2: Look, if you're stuck on Mars, you're writing letters to Earth. That's 481 00:25:17,480 --> 00:25:22,159 Speaker 2: true. Absolutely. No, it's actually the panic around AI 482 00:25:22,359 --> 00:25:26,439 Speaker 2: and cheating. Special thanks to our intern Poppy for flagging this. 483 00:25:27,160 --> 00:25:30,760 Speaker 2: The Wall Street Journal reported that writing assignments are back, 484 00:25:31,320 --> 00:25:33,960 Speaker 2: and that college professors are having students use pen and 485 00:25:34,000 --> 00:25:35,320 Speaker 2: paper for their finals. 486 00:25:35,720 --> 00:25:52,760 Speaker 1: My hand is hurting just thinking about that. And now 487 00:25:52,800 --> 00:25:55,240 Speaker 1: for Chat and Me, which is where we discuss how 488 00:25:55,280 --> 00:25:58,440 Speaker 1: people are really using chatbots. We want to hear from you, 489 00:25:58,680 --> 00:26:01,520 Speaker 1: our listeners, so please do send your stories to techstuff 490 00:26:01,520 --> 00:26:04,200 Speaker 1: podcast at gmail dot com. We now have a beautifully 491 00:26:04,200 --> 00:26:07,400 Speaker 1: designed t shirt, which we'll send you in return for your submissions. 492 00:26:07,840 --> 00:26:10,359 Speaker 2: This week, we're getting a glow up, Oz. Do the 493 00:26:10,400 --> 00:26:14,040 Speaker 2: words true spring or soft summer mean anything to you? 494 00:26:14,280 --> 00:26:18,000 Speaker 1: They sound like either laundry detergents or air fresheners for toilets.
495 00:26:18,000 --> 00:26:22,000 Speaker 2: You're wrong, you're wrong. You're a man. These are color palettes, 496 00:26:22,600 --> 00:26:24,919 Speaker 2: and you'd know this if you were using Chat the 497 00:26:24,920 --> 00:26:27,960 Speaker 2: way TikTokers are, which is for their glow ups. You know 498 00:26:27,960 --> 00:26:28,479 Speaker 2: what a glow up is? 499 00:26:28,560 --> 00:26:30,480 Speaker 1: It's like a makeover type thing? 500 00:26:30,640 --> 00:26:33,960 Speaker 2: Yeah, right. So they're asking Chat everything from what color 501 00:26:34,000 --> 00:26:36,600 Speaker 2: they should dye their hair, to what makeup they should use, 502 00:26:36,720 --> 00:26:38,600 Speaker 2: and much, much more. 503 00:26:38,720 --> 00:26:40,960 Speaker 1: I've always wanted a beauty routine from Chat. 504 00:26:41,280 --> 00:26:42,959 Speaker 2: I mean, you could have one if you wanted one. 505 00:26:43,000 --> 00:26:45,520 Speaker 2: It wouldn't even have to, you know, involve makeup. 506 00:26:45,560 --> 00:26:47,560 Speaker 1: It could just be, right, sleep. Sleep more, and then 507 00:26:47,560 --> 00:26:49,680 Speaker 1: you wouldn't have those hideous black bags under your eyes. 508 00:26:50,040 --> 00:26:52,600 Speaker 2: You could have a skin routine. You have pretty dewy skin, 509 00:26:52,680 --> 00:26:54,600 Speaker 2: it's not dry. But I want to focus on one 510 00:26:54,640 --> 00:26:59,000 Speaker 2: popular glow up trend, which is called virtual color analysis. Okay, 511 00:26:59,160 --> 00:27:03,400 Speaker 2: so this is basically asking ChatGPT which colors complement 512 00:27:03,480 --> 00:27:04,359 Speaker 2: you best. You could do this. 513 00:27:04,520 --> 00:27:06,399 Speaker 1: I hope it would say blue, because that's basically the only 514 00:27:06,480 --> 00:27:10,360 Speaker 1: color I wear. So how does ChatGPT do this exactly? 515 00:27:10,480 --> 00:27:13,520 Speaker 2: So I was actually watching this one specific TikToker. 516 00:27:13,760 --> 00:27:17,639 Speaker 2: Shout out to Marina Gudov. She's a very attractive 517 00:27:17,720 --> 00:27:20,440 Speaker 2: blonde woman, and most of her TikTok is about lifestyle 518 00:27:20,440 --> 00:27:23,600 Speaker 2: and beauty, and she gave a pretty in depth breakdown 519 00:27:23,840 --> 00:27:27,680 Speaker 2: into how she's using Chat as her personal stylist. First things 520 00:27:27,680 --> 00:27:30,240 Speaker 4: first, to figure out my color palette, I uploaded 521 00:27:30,240 --> 00:27:33,520 Speaker 4: photos of myself without makeup to see what undertones I have. 522 00:27:33,720 --> 00:27:36,600 Speaker 4: According to ChatGPT, I am a cool to true winter, 523 00:27:37,040 --> 00:27:39,960 Speaker 4: meaning I have cool undertones. This means that anything warm 524 00:27:39,960 --> 00:27:42,000 Speaker 4: will clash with my skin and not flatter me. 525 00:27:42,400 --> 00:27:45,160 Speaker 1: Okay, now I get the season references. So she's uploading 526 00:27:45,200 --> 00:27:47,639 Speaker 1: her pictures to Chat, and it's telling her what colors 527 00:27:47,640 --> 00:27:48,600 Speaker 1: look best and why. 528 00:27:48,760 --> 00:27:51,480 Speaker 2: Yes, but she goes even further than that. She asked 529 00:27:51,560 --> 00:27:54,040 Speaker 2: Chat to tell her what color spray tan to get, 530 00:27:54,040 --> 00:27:54,879 Speaker 2: you know, there's a gradient. 531 00:27:55,040 --> 00:27:56,640 Speaker 1: I didn't know that. Yeah, I would...
532 00:27:56,680 --> 00:27:59,320 Speaker 2: That surprises me, given the company that you keep: 533 00:28:00,400 --> 00:28:03,240 Speaker 2: what gradient of spray tan to get, what kind of 534 00:28:03,280 --> 00:28:06,199 Speaker 2: eyelash extensions she should have, what kind of makeup she 535 00:28:06,240 --> 00:28:09,880 Speaker 2: should buy. And she even dyes her hair from blonde 536 00:28:09,920 --> 00:28:11,640 Speaker 2: back to her natural brunette. 537 00:28:11,880 --> 00:28:14,040 Speaker 3: Then I uploaded a picture of my hair roots, and 538 00:28:14,040 --> 00:28:16,280 Speaker 3: I asked what was the best hair color for my 539 00:28:16,400 --> 00:28:18,800 Speaker 3: skin tone, and my face shape as well. It came back 540 00:28:18,840 --> 00:28:20,680 Speaker 3: telling me that my natural hair color would suit me 541 00:28:20,720 --> 00:28:23,000 Speaker 3: the most, because I have very cool toned hair. I 542 00:28:23,000 --> 00:28:24,800 Speaker 3: went to Pinterest and I found some hair inspo, 543 00:28:24,920 --> 00:28:27,080 Speaker 3: and I uploaded, I think, like five pictures, and I 544 00:28:27,119 --> 00:28:29,480 Speaker 3: asked it which one would suit me the most. And 545 00:28:29,520 --> 00:28:30,760 Speaker 3: this is the one that it told me to 546 00:28:30,680 --> 00:28:34,480 Speaker 1: do. For our listeners, the picture that Marina Gudov chooses 547 00:28:34,760 --> 00:28:37,320 Speaker 1: is kind of like a darker haired Margot Robbie on 548 00:28:37,359 --> 00:28:40,560 Speaker 1: the red carpet. So basically, this TikToker has redone her 549 00:28:40,720 --> 00:28:43,959 Speaker 1: entire look solely based on what Chat has told her to do. 550 00:28:44,120 --> 00:28:46,719 Speaker 2: Yeah, pretty much. And you know, she's not alone in this. 551 00:28:46,840 --> 00:28:48,960 Speaker 2: If you click through TikTok, you can find lots 552 00:28:48,960 --> 00:28:51,360 Speaker 2: of people explaining the prompts they use to have a 553 00:28:51,440 --> 00:28:53,840 Speaker 2: ChatGPT glow up. The one thing that I will 554 00:28:53,840 --> 00:28:57,760 Speaker 2: say is really interesting is, we've talked about 555 00:28:57,960 --> 00:29:01,400 Speaker 2: jobs that are being created because of AI, and there are 556 00:29:01,440 --> 00:29:06,239 Speaker 2: now intermediaries between Chat and the individual, to say, this 557 00:29:06,360 --> 00:29:07,960 Speaker 2: is Chat hacking the algorithm. 558 00:29:08,000 --> 00:29:10,360 Speaker 1: Tell me how to look to appeal to the algorithm? 559 00:29:10,400 --> 00:29:10,720 Speaker 2: Correct. 560 00:29:10,800 --> 00:29:12,920 Speaker 1: We're almost fully AI'd here? 561 00:29:12,920 --> 00:29:16,040 Speaker 2: Almost. Extremely, extremely. Will you be trying this out? 562 00:29:16,440 --> 00:29:17,760 Speaker 1: No. 563 00:29:17,760 --> 00:29:22,160 Speaker 2: No. Chat is just replacing what women normally text their 564 00:29:22,160 --> 00:29:24,160 Speaker 2: friends about, which is why I hear from my friends less. 565 00:29:24,200 --> 00:29:26,160 Speaker 2: It's not because they dislike me, but they have an 566 00:29:26,200 --> 00:29:30,479 Speaker 2: all knowing best friend confidante who probably is better at 567 00:29:30,480 --> 00:29:53,680 Speaker 2: giving them advice than I am. That's it for this 568 00:29:53,760 --> 00:29:54,800 Speaker 2: week for TechStuff.
569 00:29:54,800 --> 00:29:57,360 Speaker 1: I'm Karah Preiss, and I'm Oz Woloshyn. This episode 570 00:29:57,400 --> 00:30:00,760 Speaker 1: was produced by Eliza Dennis, Tyler Hill, and Melissa. It 571 00:30:00,840 --> 00:30:04,080 Speaker 1: was executive produced by me, Karah Preiss, and Kate Osborne 572 00:30:04,080 --> 00:30:08,800 Speaker 1: for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer 573 00:30:08,840 --> 00:30:12,680 Speaker 1: is Beheit Fraser, and Jack Insley mixed this episode. Kyle 574 00:30:12,760 --> 00:30:13,960 Speaker 1: Murdoch wrote our theme song. 575 00:30:14,280 --> 00:30:17,520 Speaker 2: Join us next Wednesday for TechStuff's inside view on 576 00:30:17,520 --> 00:30:20,320 Speaker 2: the latest on why AI might soon be better than your 577 00:30:20,360 --> 00:30:22,280 Speaker 2: doctor at diagnosing your ailments. 578 00:30:22,560 --> 00:30:25,080 Speaker 1: And please do rate, review, and reach out to us 579 00:30:25,120 --> 00:30:27,600 Speaker 1: at techstuff podcast at gmail dot com. We love 580 00:30:27,640 --> 00:30:28,240 Speaker 1: hearing from you.