1 00:00:13,760 --> 00:00:16,880 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm 2 00:00:16,920 --> 00:00:18,720 Speaker 1: Oz Voloshyn, and I'm Kara Price. 3 00:00:19,079 --> 00:00:22,880 Speaker 2: Today we've got two big stories to break down for you. First, 4 00:00:22,960 --> 00:00:27,200 Speaker 2: it's Silicon Valley's Westminster Dog Show. Journalists, tech nerds, and industry 5 00:00:27,240 --> 00:00:30,120 Speaker 2: insiders are in Vegas this week to check out the 6 00:00:30,160 --> 00:00:34,760 Speaker 2: Consumer Electronics Show. Then, is your grocery store spying on you? 7 00:00:35,479 --> 00:00:37,520 Speaker 3: And then a few other things that caught our eye 8 00:00:37,560 --> 00:00:40,239 Speaker 3: this week, including Silicon Valley's 9 00:00:39,880 --> 00:00:41,760 Speaker 4: new rave culture fueled by 10 00:00:41,760 --> 00:00:46,800 Speaker 3: Chinese peptides, and the exceptionally creative influencers and OnlyFans models 11 00:00:47,080 --> 00:00:48,400 Speaker 3: getting US visas. 12 00:00:48,880 --> 00:00:52,360 Speaker 2: All of that on the Week in Tech. It's Friday, January ninth. 13 00:00:57,040 --> 00:01:00,400 Speaker 2: Hello Kara. Hi Oz, how are you? Very good. 14 00:01:00,440 --> 00:01:02,920 Speaker 4: Happy New Year. Happy New Year to you too. 15 00:01:03,320 --> 00:01:05,720 Speaker 1: So it's our first Week in Tech of the year. 16 00:01:06,280 --> 00:01:07,960 Speaker 4: It is. Very exciting. 17 00:01:08,080 --> 00:01:09,000 Speaker 1: Right before the holidays, 18 00:01:09,000 --> 00:01:12,280 Speaker 2: people like to do roundups of the best of the year 19 00:01:12,280 --> 00:01:14,080 Speaker 2: that went by. At the start of the year, 20 00:01:14,120 --> 00:01:15,039 Speaker 1: people like to do 21 00:01:15,440 --> 00:01:16,240 Speaker 4: what's going to happen? 22 00:01:16,319 --> 00:01:17,119 Speaker 1: What's going to happen. 23 00:01:17,480 --> 00:01:19,800 Speaker 3: You know what my tech prediction is? What I'm 24 00:01:19,840 --> 00:01:20,960 Speaker 3: going to be addicted to: my foster dog. 25 00:01:21,160 --> 00:01:24,080 Speaker 1: Yeah, that's not a prediction. That's just a guarantee. 26 00:01:24,160 --> 00:01:24,880 Speaker 4: That's a prophecy. 27 00:01:25,200 --> 00:01:27,520 Speaker 1: I do actually have one big prediction for you. Tell 28 00:01:27,560 --> 00:01:28,640 Speaker 1: me. So, 29 00:01:28,760 --> 00:01:30,479 Speaker 2: I think that people are going to look back at 30 00:01:30,520 --> 00:01:35,000 Speaker 2: twenty twenty six as the year that AI really entered 31 00:01:35,319 --> 00:01:38,440 Speaker 2: the physical world. And I think that became quite clear 32 00:01:38,600 --> 00:01:41,320 Speaker 2: this week at the Consumer Electronics Show. But I actually 33 00:01:41,319 --> 00:01:44,720 Speaker 2: wanted to frame this conversation about CES with an 34 00:01:44,840 --> 00:01:48,920 Speaker 2: essay I read late last year by Fei-Fei Li. 35 00:01:49,080 --> 00:01:49,800 Speaker 4: Tell me about it. 36 00:01:49,800 --> 00:01:53,080 Speaker 1: Fei-Fei Li is the godmother of AI. She 37 00:01:53,480 --> 00:01:54,680 Speaker 1: wrote this blog post. 38 00:01:54,880 --> 00:02:00,560 Speaker 2: The title is From Words to Worlds: Spatial Intelligence Is 39 00:02:00,640 --> 00:02:03,440 Speaker 2: AI's Next Frontier. Do you have any idea what that means? I
40 00:02:03,720 --> 00:02:07,440 Speaker 3: mean, I can guess. Spatial intelligence meaning like that our 41 00:02:07,520 --> 00:02:09,760 Speaker 3: whole world is sort of mapped with AI. 42 00:02:10,000 --> 00:02:13,040 Speaker 2: Pretty much. So it's machines in the world with sensors, 43 00:02:13,120 --> 00:02:16,280 Speaker 2: and it's creating fully simulated worlds that 44 00:02:16,240 --> 00:02:20,239 Speaker 1: obey the laws of physics. Basically, Fei-Fei Li made 45 00:02:20,240 --> 00:02:24,680 Speaker 1: the case that LLMs, which power chatbots, are amazing, but 46 00:02:25,680 --> 00:02:29,040 Speaker 1: they're kind of almost like trapped in Plato's cave in 47 00:02:29,080 --> 00:02:32,560 Speaker 1: a weird way, because they're not actually seeing the world. 48 00:02:33,000 --> 00:02:37,000 Speaker 1: They're reading text and data that humans or other machines 49 00:02:37,040 --> 00:02:40,720 Speaker 1: have recorded about the world and then extrapolating what the 50 00:02:40,760 --> 00:02:44,079 Speaker 1: world is like from records of the world, rather than 51 00:02:44,360 --> 00:02:47,600 Speaker 1: looking at the world, ingesting it, and understanding its rules 52 00:02:48,120 --> 00:02:53,040 Speaker 1: for themselves like humans do. So this is basically what 53 00:02:53,200 --> 00:02:53,880 Speaker 1: Fei-Fei Li 54 00:02:54,000 --> 00:02:56,320 Speaker 2: and a lot of others think might be kind of 55 00:02:56,360 --> 00:02:59,600 Speaker 2: the next revolution in computing. It's computers that can actually 56 00:03:00,080 --> 00:03:03,720 Speaker 2: understand the world for themselves rather than interpret it based 57 00:03:03,720 --> 00:03:07,320 Speaker 2: on descriptions of the world. And she basically has argued 58 00:03:07,360 --> 00:03:09,720 Speaker 2: that this will be the next frontier in computing, and that 59 00:03:09,800 --> 00:03:13,440 Speaker 2: to really deliver on the Alan Turing thinking machine promise, 60 00:03:13,720 --> 00:03:15,720 Speaker 2: it can't just be words and numbers. It has to 61 00:03:15,760 --> 00:03:18,680 Speaker 2: be the laws of physics. How gravity works, how 62 00:03:18,880 --> 00:03:22,000 Speaker 2: pressure is different underwater versus on the surface, how a 63 00:03:22,120 --> 00:03:24,320 Speaker 2: chair moves around a room when you push it, depending 64 00:03:24,400 --> 00:03:27,160 Speaker 2: on the fabric of the carpet or the slipperiness of 65 00:03:27,200 --> 00:03:29,359 Speaker 2: the floor. Like, these are things that can now only 66 00:03:29,360 --> 00:03:32,160 Speaker 2: be understood in words. But ultimately she wants to build 67 00:03:32,200 --> 00:03:35,560 Speaker 2: machines and algorithms that can understand it for themselves, in 68 00:03:35,600 --> 00:03:38,400 Speaker 2: which case this might be the year, I think, when 69 00:03:38,680 --> 00:03:42,040 Speaker 2: thinking machines truly enter the room. 70 00:03:42,120 --> 00:03:42,720 Speaker 4: Well, I was just thinking. 71 00:03:42,720 --> 00:03:45,320 Speaker 3: An incredible thing to think about is, I've talked to 72 00:03:45,320 --> 00:03:48,320 Speaker 3: you ad nauseam for the last day or two about 73 00:03:48,320 --> 00:03:50,040 Speaker 3: my dog that I've been fostering. 74 00:03:50,200 --> 00:03:53,520 Speaker 4: Yeah, and I just think about a new concern 75 00:03:53,120 --> 00:03:55,920 Speaker 3: that I have all day long, which is like, what's going 76 00:03:55,960 --> 00:03:57,520 Speaker 3: on with this dog when I'm not with this dog?
77 00:03:57,960 --> 00:04:00,360 Speaker 3: But like, what if I could have a robot that 78 00:04:00,400 --> 00:04:04,400 Speaker 3: could tend to this dog as like a shadow? And 79 00:04:04,440 --> 00:04:06,560 Speaker 3: you're thinking about, like, oh, how does the physical world 80 00:04:06,600 --> 00:04:08,120 Speaker 3: interact with the robot world? 81 00:04:08,520 --> 00:04:10,800 Speaker 4: Like, imagine the dog is like, I have my mom 82 00:04:10,960 --> 00:04:13,040 Speaker 4: or I have my dad, and then I have 83 00:04:13,080 --> 00:04:15,920 Speaker 3: my robot alpha, who is like the robot self that 84 00:04:15,960 --> 00:04:17,080 Speaker 3: takes care of the dog too. 85 00:04:17,080 --> 00:04:19,480 Speaker 1: And who's the real boss, you or the robot? 86 00:04:19,560 --> 00:04:23,080 Speaker 3: Yeah, that's when robots take over. When dogs are 87 00:04:22,960 --> 00:04:25,120 Speaker 1: listening to the robots. You and the robot are standing 88 00:04:25,200 --> 00:04:26,159 Speaker 1: next to each other and you're like, 89 00:04:26,160 --> 00:04:29,560 Speaker 2: come here, come here, pal, and the dog goes up 90 00:04:29,600 --> 00:04:30,200 Speaker 2: to the robot. 91 00:04:30,480 --> 00:04:31,320 Speaker 1: That's, that's the future. 92 00:04:31,520 --> 00:04:36,680 Speaker 3: That is. That's when... by the way, dogs answering 93 00:04:36,720 --> 00:04:38,960 Speaker 3: to robots is when humans will really be like, we've 94 00:04:39,000 --> 00:04:40,440 Speaker 3: got to get these robots out of the equation. 95 00:04:40,520 --> 00:04:43,240 Speaker 1: I think that's right. What about you, do you have 96 00:04:43,279 --> 00:04:44,040 Speaker 1: any twenty twenty 97 00:04:44,080 --> 00:04:48,360 Speaker 3: six predictions? Well, this is not a prediction that I've invented. 98 00:04:48,400 --> 00:04:50,039 Speaker 3: This is a prediction that's based on a lot of 99 00:04:50,040 --> 00:04:52,880 Speaker 3: reading that I've done on Business Insider and the FT, and 100 00:04:53,040 --> 00:04:53,640 Speaker 1: showing off. 101 00:04:55,480 --> 00:04:58,000 Speaker 4: So, Tim Cook I think is going to step down 102 00:04:58,040 --> 00:04:58,560 Speaker 4: from Apple. 103 00:04:58,760 --> 00:05:01,240 Speaker 2: I saw a headline about that. I didn't dive too 104 00:05:01,240 --> 00:05:02,920 Speaker 2: deep on it. What are people saying? 105 00:05:02,800 --> 00:05:05,960 Speaker 3: He's nearing retirement age, which, you know... it's not something 106 00:05:06,000 --> 00:05:09,280 Speaker 3: that's going to be like a hostile takeover, you know. 107 00:05:09,320 --> 00:05:10,800 Speaker 3: I think it's not going to be something that the 108 00:05:10,839 --> 00:05:14,440 Speaker 3: average person like finds perceptible as far as change. I 109 00:05:14,480 --> 00:05:17,920 Speaker 3: think that he will step down and kind of lead 110 00:05:18,000 --> 00:05:21,479 Speaker 3: from behind the scenes, à la Jeff Bezos. I 111 00:05:21,480 --> 00:05:22,680 Speaker 3: mean, it's a hard job. 112 00:05:23,000 --> 00:05:25,359 Speaker 2: I guess. Apple's obviously a behemoth, and, you know, the 113 00:05:25,400 --> 00:05:28,440 Speaker 2: iPhone sales have been robust. The other stuff they've tried, 114 00:05:28,480 --> 00:05:30,880 Speaker 2: like the Vision Pro goggles, they've stepped back from; the 115 00:05:30,920 --> 00:05:32,080 Speaker 2: car never happened. 116 00:05:32,640 --> 00:05:34,440 Speaker 1: The TV thing is kind of... I mean, it's not.
117 00:05:34,839 --> 00:05:37,520 Speaker 3: You know what, I was actually thinking about it. Apple's 118 00:05:37,560 --> 00:05:40,600 Speaker 3: TV has worked, like if you look at the Emmys 119 00:05:40,640 --> 00:05:42,440 Speaker 3: and the Critics' Choice Awards and the Golden Globes, like 120 00:05:42,640 --> 00:05:44,920 Speaker 3: they've kind of dominated this year. True. In a way 121 00:05:44,920 --> 00:05:47,120 Speaker 3: that, I mean, I don't know if you would attribute 122 00:05:47,120 --> 00:05:48,880 Speaker 3: that to Tim Cook necessarily. 123 00:05:48,400 --> 00:05:51,120 Speaker 2: But I guess, if he could choose: an Emmy, or 124 00:05:51,160 --> 00:05:56,160 Speaker 2: having invented ChatGPT? 125 00:05:54,120 --> 00:05:56,360 Speaker 1: Respectfully, to our colleagues on the West Coast. 126 00:05:56,520 --> 00:05:58,960 Speaker 4: That's absolutely true. That's absolutely true. 127 00:05:59,120 --> 00:06:01,440 Speaker 3: You know, the other thing is that, speaking of our 128 00:06:01,480 --> 00:06:04,080 Speaker 3: friends in Hollywood, the thing that I've noticed a lot 129 00:06:04,120 --> 00:06:08,520 Speaker 3: of retired men do is they start production companies, because 130 00:06:08,520 --> 00:06:10,520 Speaker 3: they like to be in the entertainment business. So who knows, 131 00:06:10,520 --> 00:06:14,200 Speaker 3: maybe Tim Apple will start a production company with Apple Studios. 132 00:06:14,200 --> 00:06:15,000 Speaker 4: I have no idea. 133 00:06:15,080 --> 00:06:17,560 Speaker 1: So yes, my prediction is the robot will enter the 134 00:06:17,640 --> 00:06:20,440 Speaker 1: room, and yours is Tim Apple will exit the room. 135 00:06:20,640 --> 00:06:22,280 Speaker 4: That's right. Okay, perfect. 136 00:06:23,160 --> 00:06:26,520 Speaker 2: One tech CEO I don't think is going anywhere is Jensen. 137 00:06:26,800 --> 00:06:27,840 Speaker 4: He's not going back to Denny's. 138 00:06:27,839 --> 00:06:29,240 Speaker 1: He's not going back to Denny's. 139 00:06:29,480 --> 00:06:31,760 Speaker 4: I wonder if he passes Denny's and he's like, I can't. 140 00:06:32,000 --> 00:06:32,640 Speaker 1: No, I'm sure. 141 00:06:32,800 --> 00:06:34,560 Speaker 2: I feel like it's a big part of his mythology. 142 00:06:35,240 --> 00:06:38,080 Speaker 2: He was at the Consumer Electronics Show this week, of 143 00:06:38,120 --> 00:06:42,039 Speaker 2: course, delivering a keynote, and I came up with my twenty 144 00:06:42,080 --> 00:06:45,040 Speaker 2: twenty six prediction before I read his remarks, which is 145 00:06:45,080 --> 00:06:49,040 Speaker 2: probably quite dumb, but luckily his remarks were 146 00:06:49,040 --> 00:06:51,320 Speaker 2: in line with my twenty twenty six prediction. He said 147 00:06:51,640 --> 00:06:53,839 Speaker 2: this will be the year when there's a ChatGPT 148 00:06:54,080 --> 00:06:58,000 Speaker 2: moment for physical AI. This is not a disinterested statement, 149 00:06:58,040 --> 00:07:01,000 Speaker 2: because Nvidia also announced they're getting into the self driving 150 00:07:01,000 --> 00:07:01,640 Speaker 2: car business. 151 00:07:01,720 --> 00:07:03,640 Speaker 1: Yes, I saw that, and this is 152 00:07:03,560 --> 00:07:08,880 Speaker 2: pretty interesting, because previously you had Tesla's Full Self-Driving and Waymo 153 00:07:09,160 --> 00:07:14,480 Speaker 2: as the dominant players in US self driving robotaxis, and 154 00:07:14,520 --> 00:07:17,360 Speaker 2: it seemed like a two horse race.
But part of 155 00:07:17,360 --> 00:07:20,080 Speaker 2: Nvidia's announcement was a partnership with Mercedes, where they're 156 00:07:20,080 --> 00:07:23,239 Speaker 2: basically going to plug their software into other carmakers. 157 00:07:23,280 --> 00:07:24,320 Speaker 4: They're in the car business? 158 00:07:24,360 --> 00:07:25,960 Speaker 2: They're not in the car business, but they're all of 159 00:07:25,960 --> 00:07:29,240 Speaker 2: a sudden kind of a competitor to their clients, right, 160 00:07:29,280 --> 00:07:30,360 Speaker 2: which is really interesting. 161 00:07:30,440 --> 00:07:32,840 Speaker 4: That is very interesting. Do you think it's something that's 162 00:07:32,840 --> 00:07:33,280 Speaker 4: going to work? 163 00:07:34,600 --> 00:07:35,080 Speaker 1: I wouldn't... 164 00:07:35,160 --> 00:07:38,680 Speaker 2: I mean, you, you're, you're the Nvidia shareholder. That's against 165 00:07:38,680 --> 00:07:39,360 Speaker 2: your dear leader. 166 00:07:39,480 --> 00:07:40,520 Speaker 1: No, no, I wouldn't. 167 00:07:40,880 --> 00:07:42,920 Speaker 3: And I think it's very smart to not actually be 168 00:07:42,960 --> 00:07:45,480 Speaker 3: in the car design business. I think it's very smart 169 00:07:45,480 --> 00:07:47,720 Speaker 3: to like put your software in a pre existing car. 170 00:07:47,880 --> 00:07:50,400 Speaker 2: Probably, if you're, if you're Mercedes, who are doing the partnership, 171 00:07:50,480 --> 00:07:52,360 Speaker 2: or other automakers, you're like, oh, this is like a Hail 172 00:07:52,440 --> 00:07:55,120 Speaker 2: Mary. Well, I think we have a chance of plugging 173 00:07:55,120 --> 00:07:57,000 Speaker 2: this in and catching up, which is, which is really, 174 00:07:57,000 --> 00:08:00,000 Speaker 2: really interesting. But he wasn't just talking about this self 175 00:08:00,040 --> 00:08:02,080 Speaker 2: driving car partnership. He was pointing at this kind of 176 00:08:02,560 --> 00:08:05,280 Speaker 2: robotic revolution. He likes hanging out with robots. We talked 177 00:08:05,320 --> 00:08:07,200 Speaker 2: about it early last year. Remember he got the high five 178 00:08:07,240 --> 00:08:08,960 Speaker 2: from the humanoid robot on his trip. 179 00:08:08,720 --> 00:08:10,400 Speaker 4: I do, I do. 180 00:08:11,360 --> 00:08:14,559 Speaker 2: And another king of the self driving car world, Elon 181 00:08:14,680 --> 00:08:17,360 Speaker 2: himself, is kind of seemingly more interested in 182 00:08:17,400 --> 00:08:20,440 Speaker 1: humanoid robots than self driving cars now. He's moved on, 183 00:08:20,520 --> 00:08:21,080 Speaker 1: he's moved on. 184 00:08:21,160 --> 00:08:24,400 Speaker 2: He's definitely... This time last year, the Optimus robot was hanging 185 00:08:24,400 --> 00:08:27,320 Speaker 2: out with Kim, and it seemed last year like a 186 00:08:27,360 --> 00:08:29,640 Speaker 2: pure gimmick. And now, twelve months later, there's a big 187 00:08:29,640 --> 00:08:32,080 Speaker 2: Wall Street Journal story saying that Elon is pivoting the 188 00:08:32,080 --> 00:08:34,040 Speaker 2: whole business to humanoid Optimus robots. 189 00:08:34,040 --> 00:08:36,920 Speaker 1: So it is interesting. This year I'd like to 190 00:08:36,920 --> 00:08:40,079 Speaker 1: meet Kim. This year? We'll see. 191 00:08:40,240 --> 00:08:44,280 Speaker 2: So humanoid robots stole the show at CES this year. 192 00:08:44,559 --> 00:08:45,720 Speaker 1: Did you see any of the videos? 193 00:08:45,840 --> 00:08:46,160 Speaker 4: Yes.
194 00:08:46,320 --> 00:08:47,680 Speaker 1: What did you, what did you see? 195 00:08:48,400 --> 00:08:50,800 Speaker 4: I think it was... was it Tesla that I saw? 196 00:08:50,880 --> 00:08:53,360 Speaker 1: You mentioned Tesla. I'm going to show you Boston Dynamics. 197 00:08:53,720 --> 00:08:56,920 Speaker 3: Dynamics, yes. Show me. Spot the dog, famously, they did. 198 00:08:57,000 --> 00:09:00,000 Speaker 1: They've now got Atlas, the person. 199 00:09:00,000 --> 00:09:00,640 Speaker 4: A beauty, huh? 200 00:09:01,040 --> 00:09:01,600 Speaker 1: What are you seeing? 201 00:09:02,240 --> 00:09:05,280 Speaker 4: So I'm seeing the biggest nerd on the planet. 202 00:09:05,400 --> 00:09:08,040 Speaker 3: It kind of reminds me of Elizabeth Holmes, when something 203 00:09:08,040 --> 00:09:11,520 Speaker 3: good happened with Theranos, coming out and doing like one 204 00:09:11,559 --> 00:09:13,640 Speaker 3: of those like kick dances. 205 00:09:13,840 --> 00:09:18,520 Speaker 2: It's an extremely flexible humanoid robot doing cartwheels, doing Russian 206 00:09:18,559 --> 00:09:19,520 Speaker 2: Cossack dancing. 207 00:09:19,960 --> 00:09:21,640 Speaker 1: That is what it is, Cossack dancing. 208 00:09:21,960 --> 00:09:24,520 Speaker 4: Yeah, it's incredible that they've trained them to do this. 209 00:09:24,840 --> 00:09:29,000 Speaker 2: It is. And so Hyundai, who actually bought Boston Dynamics, 210 00:09:29,080 --> 00:09:30,720 Speaker 2: launched this robot Atlas. 211 00:09:31,160 --> 00:09:33,520 Speaker 1: It can lift up to one hundred and ten pounds. 212 00:09:33,640 --> 00:09:36,760 Speaker 2: It can work in temperatures between minus four degrees Fahrenheit 213 00:09:36,840 --> 00:09:38,440 Speaker 2: up to one hundred and four degrees. 214 00:09:38,160 --> 00:09:40,000 Speaker 4: Sort of solves the problem of needing cooling centers. 215 00:09:40,200 --> 00:09:42,600 Speaker 1: You don't, and actually the battery is pretty amazing as well. 216 00:09:42,760 --> 00:09:45,800 Speaker 2: Last time, like when humanoid robots were last on Sixty Minutes, 217 00:09:45,840 --> 00:09:47,760 Speaker 2: they had these huge Hunchback of Notre 218 00:09:47,640 --> 00:09:49,400 Speaker 1: Dame-style packs, battery packs. 219 00:09:49,400 --> 00:09:51,600 Speaker 2: But now it's just a pure human form essentially, which 220 00:09:51,600 --> 00:09:55,480 Speaker 2: is kind of interesting. They're planning to produce thirty thousand 221 00:09:55,679 --> 00:09:58,520 Speaker 2: of these robots per year, with the goal of them 222 00:09:58,679 --> 00:10:01,679 Speaker 2: working on factory floors by twenty twenty eight. 223 00:10:01,679 --> 00:10:04,080 Speaker 4: They're like, humans can't work in hard enough environments. We 224 00:10:04,120 --> 00:10:04,920 Speaker 4: need to make robots. 225 00:10:05,040 --> 00:10:05,880 Speaker 1: Well, I mean, I guess 226 00:10:05,760 --> 00:10:07,120 Speaker 4: it's a... that's a good thing. 227 00:10:07,679 --> 00:10:08,480 Speaker 1: It could be a good thing. 228 00:10:08,520 --> 00:10:10,160 Speaker 2: I mean, look, I don't know if these are 229 00:10:10,160 --> 00:10:13,000 Speaker 2: actually going to replace humans on the factory floor, but 230 00:10:13,120 --> 00:10:16,280 Speaker 2: certainly they've got everyone talking about Hyundai for the first 231 00:10:16,280 --> 00:10:19,080 Speaker 2: time in a while. I don't usually talk about Hyundai. So 232 00:10:19,559 --> 00:10:23,640 Speaker 2: those are humanoid robots designed by Hyundai to be factory workers.
LG, 233 00:10:24,440 --> 00:10:28,679 Speaker 2: another South Korean tech firm, also made waves at CES 234 00:10:28,679 --> 00:10:34,280 Speaker 2: this year with its robot called CLOi. Cloyd? Cloud? 235 00:10:34,480 --> 00:10:37,920 Speaker 2: Lloyd? CLOi, exactly. Which ladders up to a vision. 236 00:10:37,920 --> 00:10:38,760 Speaker 1: Do you know what the vision is? 237 00:10:38,800 --> 00:10:39,480 Speaker 4: What is the vision? 238 00:10:39,880 --> 00:10:42,120 Speaker 1: The zero labor home. That's what I'm looking for. 239 00:10:42,120 --> 00:10:44,800 Speaker 3: I was just thinking, if I have to do another 240 00:10:44,840 --> 00:10:47,439 Speaker 3: load of laundry, like, it's Sisyphean. 241 00:10:47,480 --> 00:10:49,640 Speaker 1: CLOi literally, literally does the laundry. 242 00:10:49,800 --> 00:10:50,520 Speaker 4: Can I try this? 243 00:10:50,800 --> 00:10:52,480 Speaker 1: Yeah, of course, Eliza. 244 00:10:53,160 --> 00:10:56,480 Speaker 2: CLOi can remove dishes from the dishwasher, heat up 245 00:10:56,520 --> 00:10:58,680 Speaker 2: food in the oven. This is, this is not a 246 00:10:58,720 --> 00:11:01,040 Speaker 2: domestic robot now, this is a domestic laborer. 247 00:11:01,200 --> 00:11:01,880 Speaker 4: Oh my god. 248 00:11:02,280 --> 00:11:05,400 Speaker 2: It can interact seamlessly with other LG devices, like your 249 00:11:05,440 --> 00:11:07,000 Speaker 2: fridge and your cooker. 250 00:11:07,320 --> 00:11:09,240 Speaker 3: So it is the connected home that then has the 251 00:11:09,360 --> 00:11:10,960 Speaker 3: robot that's running your connected home. 252 00:11:10,760 --> 00:11:13,680 Speaker 2: The robot is the conductor of the symphony 253 00:11:14,160 --> 00:11:15,080 Speaker 2: of your connected home. 254 00:11:15,160 --> 00:11:16,880 Speaker 4: Not particularly opposed to this, I've got to 255 00:11:16,840 --> 00:11:19,800 Speaker 2: say. These weren't the only humanoid robots that took 256 00:11:20,400 --> 00:11:24,080 Speaker 2: CES by storm. There was also the SwitchBot 257 00:11:24,440 --> 00:11:27,520 Speaker 2: H1 that can fill a coffee machine, make breakfast, and 258 00:11:27,720 --> 00:11:31,679 Speaker 2: wash the windows. And then there's the 1X Neo robot. 259 00:11:31,880 --> 00:11:35,320 Speaker 2: This one is kind of interesting, because it doesn't have 260 00:11:35,400 --> 00:11:38,040 Speaker 2: the best fine motor control, which I think is true 261 00:11:38,040 --> 00:11:40,160 Speaker 2: of all of these robots. So if you have a 262 00:11:40,200 --> 00:11:42,960 Speaker 2: 1X Neo robot in your home, you have to 263 00:11:43,040 --> 00:11:47,200 Speaker 2: consent to the fact that it will sometimes switch into 264 00:11:47,240 --> 00:11:50,600 Speaker 2: a mode where it's being human controlled, so there will 265 00:11:50,679 --> 00:11:54,959 Speaker 2: be somebody looking through your robot into your home, wearing 266 00:11:55,080 --> 00:11:58,480 Speaker 2: some kind of like haptic suit, doing what the robot 267 00:11:58,600 --> 00:11:59,120 Speaker 2: needs to do. 268 00:12:00,480 --> 00:12:01,240 Speaker 1: It's like Avatar. 269 00:12:01,720 --> 00:12:03,959 Speaker 2: Can you imagine buying your robot and then having somebody 270 00:12:04,040 --> 00:12:07,840 Speaker 2: sitting four thousand miles away in your house, yes, doing 271 00:12:07,960 --> 00:12:09,120 Speaker 2: your domestic 272 00:12:10,600 --> 00:12:10,960 Speaker 4: robots?
273 00:12:11,520 --> 00:12:14,520 Speaker 2: The jobs created by robots are people working thousands of 274 00:12:14,559 --> 00:12:16,920 Speaker 2: miles away to operate robots in your home. 275 00:12:17,440 --> 00:12:19,239 Speaker 1: It's bizarre. Also, 276 00:12:19,040 --> 00:12:23,320 Speaker 2: like, I wouldn't really want a camera in my home 277 00:12:23,360 --> 00:12:24,439 Speaker 2: that somebody external can see through. 278 00:12:24,160 --> 00:12:25,719 Speaker 3: And we'll talk about this in my next story. But 279 00:12:25,760 --> 00:12:27,600 Speaker 3: there are also places where I don't want cameras. 280 00:12:27,840 --> 00:12:28,959 Speaker 1: Yeah, I want to hear about that. 281 00:12:29,400 --> 00:12:31,800 Speaker 2: Just before we turn away from CES, there were some 282 00:12:31,880 --> 00:12:35,440 Speaker 2: other robots which are pretty cool but not humanoid. Tell 283 00:12:35,480 --> 00:12:42,199 Speaker 2: me. There's the Anker Eufy S2 vacuum. It's a Roomba 284 00:12:42,520 --> 00:12:45,760 Speaker 2: with a mop, and it can travel between carpeted areas, 285 00:12:46,360 --> 00:12:48,280 Speaker 2: then go to the hard floor to mop it, come 286 00:12:48,320 --> 00:12:51,640 Speaker 2: back to the carpeted area. No water spilled, anything like that. 287 00:12:51,720 --> 00:12:55,319 Speaker 2: And, Kara, it has an aromatherapy 288 00:12:54,559 --> 00:12:56,520 Speaker 4: diffuser? Oh my god. 289 00:12:56,800 --> 00:13:00,600 Speaker 2: And then finally, smart fridges, which can scan your empty 290 00:13:00,640 --> 00:13:03,959 Speaker 2: packages and add them to a shopping list for the 291 00:13:04,000 --> 00:13:06,680 Speaker 2: next grocery run. So it's a camera inside the fridge 292 00:13:07,160 --> 00:13:09,480 Speaker 2: that accesses an app while shopping, so 293 00:13:09,520 --> 00:13:11,559 Speaker 1: it can kind of remind you, either online or in the 294 00:13:11,600 --> 00:13:12,960 Speaker 1: real world, yeah, what you need to buy. 295 00:13:13,000 --> 00:13:17,360 Speaker 3: I mean, it is, like, on a macro level, pretty 296 00:13:17,360 --> 00:13:19,920 Speaker 3: incredible the way in which technology is like filling the 297 00:13:19,960 --> 00:13:21,880 Speaker 3: gaps in what humans can do, and also just like 298 00:13:22,240 --> 00:13:24,760 Speaker 3: taking care of time. Like, all these things are very 299 00:13:24,840 --> 00:13:27,320 Speaker 3: time intensive. The question is, is it time 300 00:13:27,360 --> 00:13:29,800 Speaker 3: that we want to eradicate? Like, do we want to not 301 00:13:29,720 --> 00:13:32,679 Speaker 4: go to the grocery store? What are we supposed to 302 00:13:32,679 --> 00:13:33,640 Speaker 4: be doing instead? 303 00:13:34,480 --> 00:13:35,400 Speaker 1: Working on our computer? 304 00:13:36,040 --> 00:13:36,559 Speaker 4: Exactly. 305 00:13:36,679 --> 00:13:39,280 Speaker 2: That is the conundrum of the age: I have to work 306 00:13:39,360 --> 00:13:42,240 Speaker 2: all the time, so then when do I do things 307 00:13:42,240 --> 00:13:45,960 Speaker 2: which are non compensated labor but very important, like, yeah, 308 00:13:46,000 --> 00:13:47,880 Speaker 2: the upkeep of the house. That's it, yes. So I 309 00:13:48,000 --> 00:13:49,800 Speaker 2: never, I never do the things that I need to 310 00:13:49,840 --> 00:13:52,000 Speaker 2: do which are not directly work related. Like, I have 311 00:13:52,080 --> 00:13:54,400 Speaker 2: such a long list of annoying personal things.
312 00:13:54,520 --> 00:13:56,320 Speaker 4: We call it personal admin, personal admin. 313 00:13:56,440 --> 00:13:58,640 Speaker 1: Yes, I never get through it. Yes. I don't know, 314 00:13:58,640 --> 00:14:00,640 Speaker 1: but it doesn't tend to be stuff that the robot 315 00:14:00,679 --> 00:14:02,240 Speaker 1: could do. Not yet, not yet. 316 00:14:02,280 --> 00:14:04,680 Speaker 3: I mean, I still think the idea of, like, I 317 00:14:04,760 --> 00:14:07,080 Speaker 3: live in an apartment building where not only humans live, 318 00:14:07,120 --> 00:14:08,360 Speaker 3: but a humanoid robot lives. 319 00:14:08,400 --> 00:14:09,480 Speaker 4: Like, how far out are we on 320 00:14:09,440 --> 00:14:12,120 Speaker 2: that? You will go to somebody's house this year, I bet, 321 00:14:12,160 --> 00:14:13,040 Speaker 2: and they'll have a robot. 322 00:14:13,160 --> 00:14:18,800 Speaker 4: That's crazy. Okay, we'll see. I obviously did not get 323 00:14:18,800 --> 00:14:23,160 Speaker 4: to CES this week. I did, though, get to Wegmans. 324 00:14:22,800 --> 00:14:24,960 Speaker 1: The grocery store, yes, the grocery 325 00:14:24,520 --> 00:14:27,560 Speaker 3: store, because I read this article in the Gothamist that 326 00:14:27,600 --> 00:14:30,800 Speaker 3: the grocery store chain was using facial recognition software in 327 00:14:30,840 --> 00:14:32,320 Speaker 3: their NYC location. 328 00:14:32,560 --> 00:14:34,640 Speaker 1: Well, wow, what did the Gothamist find out about this? 329 00:14:35,080 --> 00:14:39,000 Speaker 3: So actually, a city law passed in twenty twenty one requires 330 00:14:39,000 --> 00:14:40,960 Speaker 3: stores to put up a sign if they use facial 331 00:14:41,000 --> 00:14:46,080 Speaker 3: recognition software, so Wegmans complied. The funny thing is, I 332 00:14:46,160 --> 00:14:48,760 Speaker 3: went to Wegmans, and I really had to search for 333 00:14:48,800 --> 00:14:52,120 Speaker 3: this sign. And Wegmans might disagree with me, because there 334 00:14:52,200 --> 00:14:55,520 Speaker 3: was a sign on the door. But the way in 335 00:14:55,560 --> 00:14:57,800 Speaker 3: which I had to look for it, because it was 336 00:14:57,840 --> 00:15:00,800 Speaker 3: like sliding in and out of view on the automatic door, 337 00:15:01,000 --> 00:15:01,960 Speaker 3: really made me laugh. 338 00:15:02,240 --> 00:15:05,000 Speaker 2: So you went to Wegmans looking for the sign. If you 339 00:15:05,000 --> 00:15:06,960 Speaker 2: hadn't been looking for the sign, you wouldn't have seen it. 340 00:15:07,080 --> 00:15:09,720 Speaker 3: So there were people that I talked to, and I 341 00:15:09,840 --> 00:15:11,880 Speaker 3: pointed out the sign to them when I went to 342 00:15:11,880 --> 00:15:13,080 Speaker 3: the store, and they were like, no, I 343 00:15:13,000 --> 00:15:13,600 Speaker 4: didn't notice that. 344 00:15:13,600 --> 00:15:14,400 Speaker 1: I didn't recognize that. 345 00:15:14,400 --> 00:15:15,080 Speaker 4: I didn't see that. 346 00:15:15,520 --> 00:15:17,880 Speaker 3: Because if you're not looking for it... I think we're 347 00:15:17,960 --> 00:15:20,880 Speaker 3: so used to just seeing signs up, and if they 348 00:15:20,920 --> 00:15:23,440 Speaker 3: don't apply... Like, I've never looked at pet signs until recently. 349 00:15:23,760 --> 00:15:25,440 Speaker 1: So you found the sign. What did it say? 350 00:15:25,760 --> 00:15:31,200 Speaker 3: It said, quote, biometric identifier information collected at this location.
Then, 351 00:15:31,240 --> 00:15:39,800 Speaker 3: in much finer print, quote, Wegmans Food Markets, Inc. collects, retains, converts, stores, 352 00:15:39,960 --> 00:15:45,840 Speaker 3: or shares customers' biometric identifier information, which may include facial recognition, 353 00:15:46,280 --> 00:15:47,920 Speaker 3: eye scans, and voice prints. 354 00:15:48,240 --> 00:15:52,280 Speaker 2: Of those verbs, converts and shares stand out to 355 00:15:52,280 --> 00:15:55,680 Speaker 2: me. Converts, I'm like, converts into what? Shares, with whom? 356 00:15:55,960 --> 00:15:58,920 Speaker 3: So Gothamist actually asked Wegmans about how they were storing 357 00:15:58,960 --> 00:16:01,240 Speaker 3: the data, how long they were storing it for, and 358 00:16:01,280 --> 00:16:04,120 Speaker 3: how the company would share this data with law enforcement. 359 00:16:05,400 --> 00:16:07,360 Speaker 4: Surprise, surprise, Wegmans did 360 00:16:07,240 --> 00:16:11,240 Speaker 2: not reply. Okay, but you've been burying the lede 361 00:16:11,320 --> 00:16:12,359 Speaker 2: for a while now. 362 00:16:12,440 --> 00:16:14,440 Speaker 1: What is this? Is this a 363 00:16:14,520 --> 00:16:17,160 Speaker 1: customer retention thing or a safety thing? Safety? 364 00:16:17,320 --> 00:16:21,960 Speaker 3: Yes. So the sign says, quote, we use facial recognition 365 00:16:22,000 --> 00:16:25,280 Speaker 3: technology to protect the safety and security of our patrons 366 00:16:25,320 --> 00:16:30,080 Speaker 3: and employees, and do not lease, trade, or otherwise profit 367 00:16:30,360 --> 00:16:33,280 Speaker 3: from the transfer of biometric identifier information. 368 00:16:34,160 --> 00:16:37,080 Speaker 2: It is interesting, because, like, in New York they now 369 00:16:37,160 --> 00:16:39,560 Speaker 2: have, like, you know, these, like, plastic 370 00:16:39,160 --> 00:16:41,200 Speaker 4: covers on everything in CVS. 371 00:16:41,000 --> 00:16:43,080 Speaker 1: So that people can't steal them, because people shoplift. 372 00:16:43,320 --> 00:16:46,880 Speaker 4: I mean, this is just replacing the old, have you seen 373 00:16:46,720 --> 00:16:48,240 Speaker 1: this person, sign, right? 374 00:16:48,240 --> 00:16:48,720 Speaker 4: You know what I mean? 375 00:16:48,760 --> 00:16:51,880 Speaker 3: Like, you go into any bodega or any sort of 376 00:16:52,000 --> 00:16:55,000 Speaker 3: like smoke shop and there's been a shoplifter, and they 377 00:16:55,040 --> 00:16:57,840 Speaker 3: have a photo from the camera that is so blurry 378 00:16:57,840 --> 00:17:00,960 Speaker 3: that you can't even see if this person is a person. Now, 379 00:17:01,000 --> 00:17:03,240 Speaker 3: what we have is a company that has a lot 380 00:17:03,240 --> 00:17:07,800 Speaker 3: of money to invest in software, and basically they can 381 00:17:07,920 --> 00:17:10,520 Speaker 3: capture any face and cross reference that face if that 382 00:17:10,560 --> 00:17:12,600 Speaker 3: face has been known to shoplift. 383 00:17:12,800 --> 00:17:15,959 Speaker 2: Yeah, but it's also like, if you haven't been shoplifting, 384 00:17:16,040 --> 00:17:19,720 Speaker 2: like, the idea that you can be personally identified every 385 00:17:19,800 --> 00:17:21,639 Speaker 2: time you go into a shop and cross referenced is 386 00:17:21,720 --> 00:17:22,560 Speaker 2: all a bit weird. 387 00:17:22,760 --> 00:17:25,280 Speaker 1: It's like, you know, you didn't just leave your wallet in 388 00:17:25,320 --> 00:17:25,640 Speaker 1: the shop.
389 00:17:25,680 --> 00:17:28,800 Speaker 2: You left your whole identity there, that's right, to be shared, converted, 390 00:17:28,880 --> 00:17:30,160 Speaker 2: and whatever else. 391 00:17:30,240 --> 00:17:33,040 Speaker 3: It's just a very strange phenomenon. So I actually talked 392 00:17:33,080 --> 00:17:35,679 Speaker 3: to a few people, and luckily I found a very 393 00:17:35,720 --> 00:17:38,680 Speaker 3: intelligent woman who had seen the same Gothamist article I had. 394 00:17:39,400 --> 00:17:42,919 Speaker 3: I approached her and said to her, did you 395 00:17:43,040 --> 00:17:46,200 Speaker 3: know that facial recognition technology was being used at this Wegmans? 396 00:17:46,320 --> 00:17:48,439 Speaker 3: She said, I knew it was being used at Wegmans, 397 00:17:48,440 --> 00:17:50,840 Speaker 3: I thought it was at the Wegmans in Brooklyn, and 398 00:17:50,920 --> 00:17:53,520 Speaker 3: that she actually hadn't clocked the sign in the Manhattan store. 399 00:17:53,640 --> 00:17:55,560 Speaker 3: The only reason she knew about it was because she 400 00:17:55,600 --> 00:17:56,760 Speaker 3: had read the Gothamist article. 401 00:17:56,760 --> 00:17:57,639 Speaker 1: I want to hear the tape. 402 00:17:58,040 --> 00:18:00,639 Speaker 3: Do you feel comfortable with your biometric data being collected by 403 00:18:00,640 --> 00:18:01,639 Speaker 3: a large grocery store? 404 00:18:01,640 --> 00:18:03,359 Speaker 5: I know, I actually had the thought right as I 405 00:18:03,359 --> 00:18:05,360 Speaker 5: walked in here. I was like, maybe I should put 406 00:18:05,359 --> 00:18:07,200 Speaker 5: my little scarf around my face. 407 00:18:07,000 --> 00:18:08,520 Speaker 4: You're wearing a really cute scarf right now. 408 00:18:08,880 --> 00:18:10,760 Speaker 5: Maybe you should use it, or like a giant pair 409 00:18:10,760 --> 00:18:14,840 Speaker 5: of sunglasses. Yeah, no, I mean, I guess the way 410 00:18:14,840 --> 00:18:18,760 Speaker 5: I feel about it is, like, I'm not sure that 411 00:18:18,800 --> 00:18:21,880 Speaker 5: it is going to impact me personally in any real way, 412 00:18:21,920 --> 00:18:23,720 Speaker 5: but I think there are certain people for whom it's 413 00:18:23,760 --> 00:18:27,120 Speaker 5: really bad. Yeah, you know, I'm like a white lady who 414 00:18:26,960 --> 00:18:27,320 Speaker 1: is, like, 415 00:18:28,920 --> 00:18:33,359 Speaker 5: economically mobile, and so I'm sort of like not worried 416 00:18:33,400 --> 00:18:35,840 Speaker 5: about it for myself, which I guess is why I 417 00:18:35,840 --> 00:18:38,200 Speaker 5: still came in here. But on principle, yeah, I think 418 00:18:38,200 --> 00:18:38,960 Speaker 5: it's bad. 419 00:18:39,160 --> 00:18:41,440 Speaker 3: You know, it's actually interesting, she was wearing this kind 420 00:18:41,440 --> 00:18:45,000 Speaker 3: of cravat, this little cashmere brown scarf, and 421 00:18:46,080 --> 00:18:48,520 Speaker 3: her idea to cover her face with a scarf is 422 00:18:48,520 --> 00:18:51,080 Speaker 3: actually a good one. 404 Media recently published an 423 00:18:51,119 --> 00:18:53,520 Speaker 3: article about anti surveillance design, which you and I have 424 00:18:53,520 --> 00:18:55,960 Speaker 3: talked about, remember, in terms of 3M and the 425 00:18:56,600 --> 00:18:57,560 Speaker 3: Hong Kong protests. 426 00:18:57,640 --> 00:18:57,840 Speaker 1: Yeah. 427 00:18:57,880 --> 00:19:00,959 Speaker 3: 404's
point is, facial recognition tech is complicated, 428 00:19:00,960 --> 00:19:04,040 Speaker 3: but actually tricking it is very easy, and the most 429 00:19:04,040 --> 00:19:06,880 Speaker 3: effective tool you can use to trick facial recognition technology 430 00:19:06,960 --> 00:19:07,760 Speaker 3: is actually a mask. 431 00:19:08,000 --> 00:19:09,800 Speaker 1: Is Wegmans an outlier here, or the others? I mean, 432 00:19:09,880 --> 00:19:10,760 Speaker 1: is this ubiquitous? 433 00:19:10,800 --> 00:19:12,399 Speaker 2: I know, for example, when you go to the airport, 434 00:19:12,440 --> 00:19:14,119 Speaker 2: you can sometimes shop with your face. 435 00:19:14,000 --> 00:19:17,959 Speaker 4: I'd opt the f out. Actually, no, it's not just Wegmans. 436 00:19:18,040 --> 00:19:21,320 Speaker 3: Stores like Albertsons, Walmart, and Kroger are all using 437 00:19:21,320 --> 00:19:23,040 Speaker 3: it in some of their locations. 438 00:19:23,280 --> 00:19:28,359 Speaker 4: I think also Home Depot, Lowe's, Macy's. I mean, I 439 00:19:28,359 --> 00:19:29,560 Speaker 4: think a lot of people won't care. 440 00:19:29,960 --> 00:19:32,360 Speaker 3: Yeah, and a lot of people will just say, I've 441 00:19:32,359 --> 00:19:35,080 Speaker 3: been surveilled. What is, as we used to say, 442 00:19:35,119 --> 00:19:38,320 Speaker 3: a little sprinkle of surveillance going to do to my 443 00:19:38,400 --> 00:19:40,119 Speaker 3: day to day, if I'm not someone who should be 444 00:19:40,160 --> 00:19:42,520 Speaker 3: worried about shoplifting? And to that, I say, what if 445 00:19:42,560 --> 00:19:44,040 Speaker 3: it's not forever about shoplifting? 446 00:19:44,119 --> 00:19:50,879 Speaker 4: Right after the break, the P 447 00:19:51,240 --> 00:19:54,680 Speaker 3: in GLP-1s is having a moment, and OnlyFans 448 00:19:54,720 --> 00:19:57,560 Speaker 3: models are getting very special US visas. 449 00:19:57,760 --> 00:20:07,000 Speaker 1: Stay with us. And we're back. Does the word peptide 450 00:20:07,400 --> 00:20:08,159 Speaker 1: mean anything to you? 451 00:20:08,240 --> 00:20:10,439 Speaker 3: My friends are always like, the peptides, the peptides in my 452 00:20:10,480 --> 00:20:14,800 Speaker 3: skincare, and the peptide... I think they use skincare 453 00:20:14,880 --> 00:20:17,359 Speaker 3: products that have like peptides that tighten their face. 454 00:20:17,720 --> 00:20:19,560 Speaker 1: Have they been to peptide raves, perhaps? 455 00:20:19,600 --> 00:20:23,440 Speaker 3: I mean, unless you're talking about their bathroom, no. 456 00:20:23,920 --> 00:20:24,040 Speaker 1: Well, 457 00:20:24,080 --> 00:20:25,640 Speaker 2: the New York Times had a great story this week 458 00:20:25,680 --> 00:20:29,720 Speaker 2: with the headline Chinese Peptides Are the Latest Biohacking Trend 459 00:20:30,119 --> 00:20:33,160 Speaker 2: in the Tech World. It had one source saying, quote, 460 00:20:33,280 --> 00:20:37,439 Speaker 2: the elites all have a Chinese peptide dealer, and it 461 00:20:37,520 --> 00:20:41,320 Speaker 2: follows several Silicon Valley workers, all between twenty and forty, 462 00:20:41,800 --> 00:20:45,600 Speaker 2: who are buying unregulated peptides from China with the hope 463 00:20:45,600 --> 00:20:48,280 Speaker 2: that it will improve their health, their sleep, their fitness, 464 00:20:48,400 --> 00:20:49,600 Speaker 2: and their focus. 465 00:20:49,880 --> 00:20:50,440 Speaker 1: But what is it? 466 00:20:50,880 --> 00:20:52,960 Speaker 4: I mean,
I know peptides are like protein. 467 00:20:53,680 --> 00:20:58,040 Speaker 2: Peptides are amino acids that regulate hormones in 468 00:20:58,119 --> 00:21:01,439 Speaker 2: the body and reduce inflammation, which is why it's 469 00:21:01,440 --> 00:21:04,640 Speaker 2: a popular ingredient, I learned, in skin care products. It's 470 00:21:04,680 --> 00:21:08,200 Speaker 2: also, and I didn't know this, embarrassingly, the P in 471 00:21:08,320 --> 00:21:09,000 Speaker 2: GLP-1. 472 00:21:09,520 --> 00:21:12,359 Speaker 3: I didn't know that until just this very moment. So 473 00:21:13,280 --> 00:21:15,920 Speaker 3: the P... peptides improve health. 474 00:21:15,960 --> 00:21:19,159 Speaker 4: They're good, people like them. But why do they like them? Like, 475 00:21:19,200 --> 00:21:20,480 Speaker 4: what do people say that they do? 476 00:21:20,920 --> 00:21:24,600 Speaker 2: People are claiming that different types of peptides work for 477 00:21:24,720 --> 00:21:26,280 Speaker 2: different types of ailments. 478 00:21:26,400 --> 00:21:27,320 Speaker 1: Peptides is like 479 00:21:27,280 --> 00:21:31,119 Speaker 2: a catch-all term for these collections of amino acids, and 480 00:21:31,160 --> 00:21:34,159 Speaker 2: people are using all kinds of different peptides for different purposes. 481 00:21:34,400 --> 00:21:38,000 Speaker 2: So people say that these peptides can stimulate wound healing 482 00:21:38,080 --> 00:21:42,000 Speaker 2: by stimulating new blood vessel growth, better sleep, of course, 483 00:21:42,240 --> 00:21:45,840 Speaker 2: weight loss, increased focus. One of the weirdest things was 484 00:21:46,240 --> 00:21:50,200 Speaker 2: improving eye contact. Quote from the story: one OpenAI 485 00:21:50,280 --> 00:21:53,400 Speaker 2: researcher called it Ozempic for autism. There have been 486 00:21:53,440 --> 00:21:55,919 Speaker 2: some negative side effects. One woman in the New York 487 00:21:56,000 --> 00:21:59,240 Speaker 2: Times story had all her hair fall out. Bad. Two 488 00:21:59,320 --> 00:22:02,760 Speaker 2: women were hospitalized with swollen tongues and racing hearts 489 00:22:03,119 --> 00:22:06,480 Speaker 2: after shooting up on peptides at an anti aging festival 490 00:22:06,480 --> 00:22:06,879 Speaker 2: in Vegas. 491 00:22:07,040 --> 00:22:09,120 Speaker 4: That's like the worst case scenario use. 492 00:22:09,160 --> 00:22:12,400 Speaker 1: Well, the worst case scenario is probably you die. 493 00:22:12,720 --> 00:22:14,919 Speaker 3: Right. So I'm just saying, like, in terms of, like, 494 00:22:15,480 --> 00:22:18,840 Speaker 3: people who are using GLP-1s are not generally, like, going 495 00:22:18,880 --> 00:22:20,200 Speaker 3: to shoot them up at festivals. 496 00:22:20,280 --> 00:22:24,359 Speaker 2: Well, that's what this is about. These raves are like... 497 00:22:24,600 --> 00:22:28,879 Speaker 2: basically, people are going to these parties with DJs, where 498 00:22:29,160 --> 00:22:33,360 Speaker 2: all of these peptides... a DJ and a peptide... all 499 00:22:33,400 --> 00:22:37,560 Speaker 2: of these peptides are available, and somebody else is teaching 500 00:22:37,560 --> 00:22:41,240 Speaker 2: you how to make your own peptide cocktail. The Times 501 00:22:41,280 --> 00:22:43,000 Speaker 2: went to a few of these raves, and 502 00:22:43,080 --> 00:22:45,520 Speaker 2: there are pictures of the raves in the article.
People 503 00:22:45,600 --> 00:22:49,439 Speaker 2: are like, have a needle in their arm with tape, 504 00:22:50,000 --> 00:22:51,560 Speaker 2: with like a disco background. 505 00:22:51,840 --> 00:22:54,520 Speaker 4: Oh, so they can just, like, get fed, get fed peptides. 506 00:22:54,600 --> 00:22:57,560 Speaker 1: Yeah, yeah. Is this legal? It's kind of in a 507 00:22:57,640 --> 00:22:58,439 Speaker 1: gray area. 508 00:22:58,880 --> 00:23:03,840 Speaker 2: Personal use is legal, and buying directly from suppliers is legal, 509 00:23:04,400 --> 00:23:08,600 Speaker 2: but very few peptides are FDA approved, and so biohackers 510 00:23:08,720 --> 00:23:12,240 Speaker 2: are very much injecting at their own risk, as the 511 00:23:12,240 --> 00:23:15,600 Speaker 2: woman whose hair fell out may testify. What's kind of 512 00:23:15,600 --> 00:23:18,919 Speaker 2: interesting is this is not as niche as you might imagine. 513 00:23:19,080 --> 00:23:21,160 Speaker 4: The rave piece isn't? The rave piece 514 00:23:21,000 --> 00:23:21,520 Speaker 1: is pretty niche. 515 00:23:21,560 --> 00:23:23,000 Speaker 2: In fact, you look at the photos in the New 516 00:23:23,040 --> 00:23:25,000 Speaker 2: York Times, it's like only a New York Times journalist 517 00:23:25,000 --> 00:23:25,880 Speaker 2: could describe that as 518 00:23:25,760 --> 00:23:28,919 Speaker 1: a rave. It's like four people in a room with 519 00:23:29,040 --> 00:23:29,600 Speaker 1: purple light. 520 00:23:29,720 --> 00:23:31,720 Speaker 4: Yeah, yeah, yeah, yeah. 521 00:23:31,760 --> 00:23:35,119 Speaker 2: But as a party, I guess, at best. Imports of 522 00:23:35,160 --> 00:23:38,360 Speaker 2: peptides and hormones from China doubled. 523 00:23:38,520 --> 00:23:39,879 Speaker 4: I was going to ask why China, and then I 524 00:23:39,960 --> 00:23:40,840 Speaker 4: was like, I know, yes, they make them. 525 00:23:41,680 --> 00:23:43,920 Speaker 2: So in twenty twenty four, there were one hundred and 526 00:23:43,960 --> 00:23:47,960 Speaker 2: sixty four million dollars worth of Chinese peptides imported to 527 00:23:48,000 --> 00:23:50,480 Speaker 2: the US in the first three quarters of the year. 528 00:23:51,119 --> 00:23:54,560 Speaker 2: In twenty twenty five, three hundred and twenty eight million, 529 00:23:55,119 --> 00:23:59,399 Speaker 2: exactly double that, in Chinese peptides. 530 00:23:59,000 --> 00:24:02,080 Speaker 4: Which makes you wonder, does it work or is it 531 00:24:02,160 --> 00:24:02,639 Speaker 4: just hype? 532 00:24:03,080 --> 00:24:04,120 Speaker 1: I mean, GLP-1s work. 533 00:24:04,359 --> 00:24:05,560 Speaker 4: Have you ever had any? 534 00:24:07,520 --> 00:24:10,440 Speaker 2: Listen, honestly, reading the article, I was like, wow, if 535 00:24:10,480 --> 00:24:13,399 Speaker 2: I had a lot more time on my hands and 536 00:24:13,480 --> 00:24:17,040 Speaker 2: I was a little bit more crazed, I could easily 537 00:24:17,119 --> 00:24:18,520 Speaker 2: imagine going down this rabbit hole. 538 00:24:18,600 --> 00:24:21,359 Speaker 3: Yeah, you all kind of can, though, just with 539 00:24:21,480 --> 00:24:24,360 Speaker 3: like very cursory use of GLP-1, like a lot 540 00:24:24,400 --> 00:24:26,240 Speaker 3: of people, I mean basically everybody 541 00:24:26,240 --> 00:24:29,800 Speaker 1: I know. Yeah, I'm also exploring. You're exploring? I 542 00:24:29,760 --> 00:24:32,360 Speaker 4: am exploring. Wait, no, for mental health benefits.
543 00:24:32,400 --> 00:24:33,960 Speaker 2: So that was one of the things mentioned in the article, 544 00:24:34,040 --> 00:24:38,280 Speaker 2: microdosing GLP-1s for mental health. But, you know, it's like 545 00:24:38,400 --> 00:24:41,560 Speaker 2: the promise that, rather than, like, going to the gym 546 00:24:41,720 --> 00:24:45,200 Speaker 2: four times a week and meditating for forty five minutes 547 00:24:45,240 --> 00:24:47,960 Speaker 2: a day and going on a walk and looking at the 548 00:24:47,880 --> 00:24:50,960 Speaker 3: sky... It goes back to what you're saying, 549 00:24:50,960 --> 00:24:52,720 Speaker 3: which is, this list of all the stuff that you 550 00:24:52,800 --> 00:24:56,320 Speaker 3: have to do when you're not working is really hard 551 00:24:56,320 --> 00:24:59,520 Speaker 3: to do. Like, can I achieve it with this cocktail that 552 00:24:59,480 --> 00:25:01,880 Speaker 1: I can buy online? Yes, so it's tempting. 553 00:25:01,960 --> 00:25:04,199 Speaker 2: I actually... luckily, I don't have enough time to go 554 00:25:04,320 --> 00:25:06,240 Speaker 2: down the rabbit hole, but I, but I can imagine. 555 00:25:06,240 --> 00:25:07,879 Speaker 1: I can imagine. Maybe you could have a robot that 556 00:25:07,920 --> 00:25:09,000 Speaker 1: could... Stranger things. 557 00:25:09,240 --> 00:25:10,119 Speaker 4: Stranger things. 558 00:25:10,200 --> 00:25:16,480 Speaker 3: Absolutely. I want to pivot. But do you consider yourself 559 00:25:16,480 --> 00:25:17,760 Speaker 3: an exceptional creative? 560 00:25:18,320 --> 00:25:19,720 Speaker 1: Have you been reading my LinkedIn again? 561 00:25:20,040 --> 00:25:23,760 Speaker 3: I've been reading your thoughts, because I take peptides, and 562 00:25:23,800 --> 00:25:26,520 Speaker 3: I read this article that said influencers and OnlyFans 563 00:25:26,560 --> 00:25:30,720 Speaker 3: models dominate the US extraordinary artist visa, and I thought 564 00:25:30,720 --> 00:25:30,960 Speaker 3: of you. 565 00:25:31,200 --> 00:25:32,960 Speaker 1: I saw that story. I didn't read it, but I'm 566 00:25:32,960 --> 00:25:33,840 Speaker 1: glad you brought it up. 567 00:25:34,080 --> 00:25:35,520 Speaker 4: And you know why I thought of you? 568 00:25:35,560 --> 00:25:37,160 Speaker 1: Because I had an O-1 visa. 569 00:25:37,359 --> 00:25:39,359 Speaker 4: It was not because you were selling feet pics. 570 00:25:39,680 --> 00:25:42,400 Speaker 2: I was, for a time, deemed by the US 571 00:25:42,480 --> 00:25:44,560 Speaker 2: government an extraordinary artist. 572 00:25:44,640 --> 00:25:47,199 Speaker 4: So do you know where this... this I learned for 573 00:25:47,240 --> 00:25:49,480 Speaker 4: the first time reporting the story. Do you know where 574 00:25:49,480 --> 00:25:51,440 Speaker 4: the O-1 visa came from? No idea. 575 00:25:51,920 --> 00:25:54,840 Speaker 3: So these visas were created because in the nineteen seventies, 576 00:25:55,119 --> 00:25:58,560 Speaker 3: the Nixon administration was trying to deport John Lennon 577 00:25:58,800 --> 00:26:02,399 Speaker 3: over his politics. But Lennon was able to stay in 578 00:26:02,440 --> 00:26:05,399 Speaker 3: the US because he was deemed an outstanding person in 579 00:26:05,440 --> 00:26:06,280 Speaker 3: the arts and sciences. 580 00:26:06,320 --> 00:26:09,920 Speaker 1: So you and John Lennon. Wow. And now a lot 581 00:26:09,920 --> 00:26:10,720 Speaker 1: of OnlyFans
582 00:26:10,520 --> 00:26:13,720 Speaker 4: creators, and Boy George and Sinéad O'Connor. Wow. Yes. 583 00:26:14,080 --> 00:26:18,320 Speaker 3: The FT reports that currently an overwhelming number of people 584 00:26:18,320 --> 00:26:22,400 Speaker 3: getting approved for O-1 visas are influencers and OnlyFans models. 585 00:26:22,840 --> 00:26:26,800 Speaker 3: One lawyer who was interviewed called them scroll kings and 586 00:26:27,040 --> 00:26:27,920 Speaker 3: queens, and I love, I 587 00:26:27,920 --> 00:26:30,000 Speaker 4: really love, scroll kings and queens. 588 00:26:30,280 --> 00:26:34,439 Speaker 3: The number of scroll kings and queens has actually, like 589 00:26:34,480 --> 00:26:36,880 Speaker 3: peptide use, doubled since twenty ten. 590 00:26:37,040 --> 00:26:41,320 Speaker 2: Wow. And why are these applicants looked upon so favorably by 591 00:26:41,359 --> 00:26:44,640 Speaker 2: the case officers at the United States Citizenship and Immigration 592 00:26:44,800 --> 00:26:45,919 Speaker 2: Services? 593 00:26:46,320 --> 00:26:49,840 Speaker 3: Some immigration lawyers said that follower count makes a difference. 594 00:26:50,119 --> 00:26:53,840 Speaker 3: The visa process is complicated and requires a lot of paperwork. 595 00:26:53,880 --> 00:26:56,480 Speaker 2: No doubt true. But that's interesting, the follower count, 596 00:26:56,520 --> 00:27:01,080 Speaker 2: because much like with the large language models, where if 597 00:27:01,080 --> 00:27:04,480 Speaker 2: it doesn't exist in writing, it doesn't exist. Yeah, in your 598 00:27:05,000 --> 00:27:08,399 Speaker 2: O-1 visa application, it doesn't exist if you can't measure 599 00:27:08,440 --> 00:27:10,159 Speaker 2: it. Well, you need to come up with all of 600 00:27:10,200 --> 00:27:13,159 Speaker 2: these very concrete numbers to demonstrate reach and influence and 601 00:27:13,160 --> 00:27:14,840 Speaker 2: blah blah blah. So, and who 602 00:27:14,640 --> 00:27:17,840 Speaker 3: can demonstrate reach and influence better than an influencer or 603 00:27:17,880 --> 00:27:20,200 Speaker 3: someone with an OnlyFans account? No one, you know. 604 00:27:20,240 --> 00:27:22,520 Speaker 3: I think there are other people who have these visas 605 00:27:22,560 --> 00:27:24,719 Speaker 3: who might be able to say, okay, I'm in a 606 00:27:24,760 --> 00:27:27,760 Speaker 3: major film or I'm in theater productions, you know, and 607 00:27:27,760 --> 00:27:31,760 Speaker 3: get recommendations from experts in their field. These OnlyFans 608 00:27:31,760 --> 00:27:35,240 Speaker 3: models and influencers are like, look, I have twenty million followers, 609 00:27:35,440 --> 00:27:36,760 Speaker 3: let me stay in the country. 610 00:27:37,000 --> 00:27:40,800 Speaker 1: And how many exceptional talent visas are given out each year? 611 00:27:41,359 --> 00:27:42,080 Speaker 4: Are you fishing? 612 00:27:45,240 --> 00:27:48,440 Speaker 3: Only twenty thousand visas were granted in twenty twenty four, 613 00:27:48,520 --> 00:27:51,680 Speaker 3: and the majority of visas are still H-1Bs. Yeah, 614 00:27:51,960 --> 00:28:06,119 Speaker 3: you're exceptional. That's it for this week for Tech Stuff. 615 00:28:06,160 --> 00:28:07,120 Speaker 3: I'm Kara Price. 616 00:28:06,920 --> 00:28:07,880 Speaker 1: And I'm Oz Voloshyn. 617 00:28:07,960 --> 00:28:11,040 Speaker 2: This episode was produced by Eliza Dennis and Melissa Slaughter.
618 00:28:11,680 --> 00:28:14,600 Speaker 2: It was executive produced by me, Kara Price, Julian Nutta, 619 00:28:14,600 --> 00:28:17,520 Speaker 2: and Kate Osborne for Kaleidoscope, and Katrina 620 00:28:17,160 --> 00:28:18,920 Speaker 1: Norvell for iHeart Podcasts. 621 00:28:19,520 --> 00:28:23,520 Speaker 2: Our engineer is Mike Coscarelli, and Jack Insley mixed this episode. 622 00:28:23,960 --> 00:28:25,360 Speaker 1: Kyle Murdoch wrote our theme 623 00:28:25,200 --> 00:28:27,879 Speaker 3: song. Please rate, review, and reach out to us at 624 00:28:27,960 --> 00:28:29,480 Speaker 3: tech stuff podcast at 625 00:28:29,359 --> 00:28:31,560 Speaker 4: gmail dot com. We want to hear from 626 00:28:31,600 --> 00:28:31,800 Speaker 3: you.