1 00:00:13,440 --> 00:00:17,280 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:17,520 --> 00:00:19,919 Speaker 1: I'm Oz Woloshyn, and today Karah Preiss and I will 3 00:00:19,920 --> 00:00:23,040 Speaker 1: bring you the headlines this week, including the race of 4 00:00:23,120 --> 00:00:26,479 Speaker 1: the humanoid robots. Then on Tech Support, we'll talk to 5 00:00:26,560 --> 00:00:30,840 Speaker 1: Jeff Rosenthal about the role of private investment in building 6 00:00:30,840 --> 00:00:34,200 Speaker 1: out energy infrastructure in the US to meet the AI boom. 7 00:00:34,560 --> 00:00:37,000 Speaker 2: I would argue that this is one of the spaces 8 00:00:37,040 --> 00:00:40,360 Speaker 2: that you just see the future manifest regularly, like the 9 00:00:40,360 --> 00:00:42,280 Speaker 2: impossible become possible. 10 00:00:42,120 --> 00:00:45,360 Speaker 1: All of that on The Week in Tech. It's Friday, April 11 00:00:45,440 --> 00:00:50,760 Speaker 1: twenty fifth. Hello, hello, Karah. 12 00:00:51,159 --> 00:00:52,920 Speaker 3: Hi, Oz. 13 00:00:53,000 --> 00:0055,840 Speaker 1: Since we last saw each other, a big important holiday happened, 14 00:00:55,840 --> 00:00:59,600 Speaker 1: which was Easter. And did you see how it was celebrated 15 00:00:59,600 --> 00:01:00,080 Speaker 1: at the White House? 16 00:01:00,600 --> 00:01:03,080 Speaker 4: I did. And I can't say that I know many 17 00:01:03,120 --> 00:01:07,679 Speaker 4: people who still roll eggs around with wooden spoons, but 18 00:01:07,760 --> 00:01:09,320 Speaker 4: the White House likes its traditions. 19 00:01:09,720 --> 00:01:12,240 Speaker 1: Yeah. There was also the human-sized bunny who ended 20 00:01:12,319 --> 00:01:15,040 Speaker 1: up next to President Trump on the podium, who. 21 00:01:14,880 --> 00:01:17,920 Speaker 4: Looked a lot like Peter Rabbit, but suspiciously with a 22 00:01:17,959 --> 00:01:19,320 Speaker 4: different colored jacket. 23 00:01:19,760 --> 00:01:22,600 Speaker 1: This was, in fact, Peter the Rabbit, who's been subtly 24 00:01:22,840 --> 00:01:28,520 Speaker 1: renamed and restyled. Exactly, he has, not to infringe 25 00:01:28,640 --> 00:01:32,880 Speaker 1: on Beatrix Potter's copyright. This wasn't the only slightly weird 26 00:01:33,040 --> 00:01:36,959 Speaker 1: sight to behold at the celebration. There were also sponsorships 27 00:01:37,000 --> 00:01:40,360 Speaker 1: and branding from the big tech companies. Meta had its 28 00:01:40,400 --> 00:01:42,880 Speaker 1: own tent on the White House lawn, complete with an 29 00:01:42,920 --> 00:01:47,080 Speaker 1: AI-powered photo op. YouTube had a bunny hop stage, 30 00:01:47,400 --> 00:01:50,480 Speaker 1: and a reading nook was provided by Amazon, complete with 31 00:01:50,480 --> 00:01:53,360 Speaker 1: a couch and some colorful flowers with the Amazon logo 32 00:01:53,520 --> 00:01:56,320 Speaker 1: in pride of place. This was all on the lawn 33 00:01:56,480 --> 00:01:59,880 Speaker 1: of the White House at an event traditionally sponsored by 34 00:02:00,080 --> 00:02:00,960 Speaker 1: the American Egg Board. 35 00:02:01,320 --> 00:02:03,760 Speaker 4: I should have done this for my bat mitzvah. Actually, 36 00:02:04,400 --> 00:02:07,040 Speaker 4: just have different tech companies cover different tables.
37 00:02:07,120 --> 00:02:09,440 Speaker 1: Well, future bar and bat mitzvahs for readers of The Art of the 38 00:02:09,440 --> 00:02:11,840 Speaker 1: Deal will no doubt be able to defray the costs 39 00:02:11,840 --> 00:02:12,640 Speaker 1: in years to come. 40 00:02:12,800 --> 00:02:15,400 Speaker 4: Absolutely. You know, nothing says Easter to me quite like 41 00:02:15,440 --> 00:02:18,040 Speaker 4: a pink and blue sign with the words expand your 42 00:02:18,080 --> 00:02:21,040 Speaker 4: world with Meta AI on it, which is just so 43 00:02:21,120 --> 00:02:23,720 Speaker 4: funny to me because Meta execs past and present are 44 00:02:23,760 --> 00:02:26,640 Speaker 4: literally testifying right down the street as this is going on. 45 00:02:26,720 --> 00:02:29,040 Speaker 1: I know. I mean, it's surely incredible what's going 46 00:02:29,080 --> 00:02:30,680 Speaker 1: on in Washington right now. On the one hand, you have 47 00:02:30,800 --> 00:02:33,120 Speaker 1: Easter at the White House with all of these corporate 48 00:02:33,160 --> 00:02:35,640 Speaker 1: sponsors from the tech industry, and on the other hand, 49 00:02:35,680 --> 00:02:38,760 Speaker 1: you have the government's lawyers arguing to break up Google, 50 00:02:39,280 --> 00:02:43,200 Speaker 1: and Meta on trial. It's this, you know, rather delicious-seeming 51 00:02:43,280 --> 00:02:46,799 Speaker 1: irony that you can't buy friendship, at least not from 52 00:02:46,800 --> 00:02:47,839 Speaker 1: President Trump. 53 00:02:47,680 --> 00:02:52,200 Speaker 4: But you can buy a couch at an Easter party. Exactly, 54 00:02:52,440 --> 00:02:53,880 Speaker 4: you know what I mean? You know, it does make 55 00:02:53,919 --> 00:02:55,760 Speaker 4: me wonder where this trend will go. I can sort 56 00:02:55,760 --> 00:03:00,000 Speaker 4: of imagine a huge tent around Thanksgiving saying the annual 57 00:03:00,360 --> 00:03:04,119 Speaker 4: pardoning of the Thanksgiving turkey, sponsored by TikTok. 58 00:03:04,720 --> 00:03:08,680 Speaker 1: Speaking of strange and slightly jarring tech encroachment, I've got 59 00:03:08,680 --> 00:03:11,000 Speaker 1: a headline to start us off. The Wall Street Journal 60 00:03:11,040 --> 00:03:14,679 Speaker 1: reports that a recent half marathon race in Beijing had 61 00:03:14,720 --> 00:03:20,280 Speaker 1: some unusual participants: humanoid robots. This thirteen-mile race featured 62 00:03:20,320 --> 00:03:24,840 Speaker 1: thousands of humans and twenty-one robots running. It was a 63 00:03:24,919 --> 00:03:28,120 Speaker 1: chance for China to show off its progress in humanoid robotics. 64 00:03:28,480 --> 00:03:30,679 Speaker 1: You can check out the video online. It's pretty deep 65 00:03:30,720 --> 00:03:34,560 Speaker 1: in the uncanny valley to see robots jogging alongside humans. 66 00:03:34,840 --> 00:03:37,520 Speaker 4: This is my favorite kind of story, I think, because 67 00:03:37,520 --> 00:03:40,000 Speaker 4: of the uncanny valleyness of it, and also because there 68 00:03:40,040 --> 00:03:42,320 Speaker 4: was... it wasn't perfect, let's just put it that way. 69 00:03:42,520 --> 00:03:44,080 Speaker 1: We'll get to some of the winners and losers. 70 00:03:44,120 --> 00:03:46,640 Speaker 4: We'll talk about that. But you know, China is really 71 00:03:46,760 --> 00:03:49,160 Speaker 4: leaning into humanoid robotics.
They've said they want to be 72 00:03:49,200 --> 00:03:52,760 Speaker 4: the leader by twenty twenty seven, which, I don't know 73 00:03:52,800 --> 00:03:55,600 Speaker 4: about you, but that's close to when I turn forty, 74 00:03:55,840 --> 00:03:57,560 Speaker 4: not far away. 75 00:03:58,000 --> 00:03:59,800 Speaker 1: As you mentioned, the race did show there's still quite 76 00:03:59,800 --> 00:04:02,320 Speaker 1: a lot of room for improvement when it comes to these humanoids. 77 00:04:02,560 --> 00:04:04,360 Speaker 1: They had to have a staggered start to the race, 78 00:04:04,520 --> 00:04:06,720 Speaker 1: literally so they wouldn't run into each other. There were 79 00:04:06,720 --> 00:04:09,600 Speaker 1: pauses between the robots starting. They were also in their 80 00:04:09,640 --> 00:04:14,080 Speaker 1: own lane, the HOV lane of the half marathon in Beijing. 81 00:04:14,400 --> 00:04:16,159 Speaker 4: There's so much about this that I love. But my 82 00:04:16,200 --> 00:04:19,120 Speaker 4: favorite part is that some of these robots were wearing accessories. 83 00:04:19,160 --> 00:04:22,320 Speaker 4: You know, some had like running shoes, tank tops, you know, 84 00:04:22,360 --> 00:04:24,880 Speaker 4: they had those little bicycle hats, and to me, 85 00:04:24,920 --> 00:04:26,640 Speaker 4: they all kind of looked like guys in Fort Greene 86 00:04:26,640 --> 00:04:29,080 Speaker 4: who are trying to, like, bag women on Hinge. 87 00:04:29,480 --> 00:04:32,200 Speaker 1: Those weren't the only upgrades to the robots. They went 88 00:04:32,240 --> 00:04:34,760 Speaker 1: beyond the outfits. In some cases, the developers had to 89 00:04:34,760 --> 00:04:37,920 Speaker 1: modify the robots for the race so that plastic components 90 00:04:37,920 --> 00:04:40,640 Speaker 1: wouldn't break off while they were running, and actually recast 91 00:04:40,680 --> 00:04:44,440 Speaker 1: and replaced those parts with metal. Some were also given longer legs, 92 00:04:44,440 --> 00:04:45,480 Speaker 1: something I would love to. 93 00:04:45,400 --> 00:04:48,719 Speaker 4: Have. Which, you know, to me, this is a sign 94 00:04:48,800 --> 00:04:52,239 Speaker 4: that humans are still superior. These robots trained for months 95 00:04:52,240 --> 00:04:54,360 Speaker 4: so they could navigate both the flat and hilly parts 96 00:04:54,400 --> 00:04:57,640 Speaker 4: of the course and also take six left turns and 97 00:04:57,760 --> 00:05:00,599 Speaker 4: eight right turns, something fortunately I still don't have to 98 00:05:00,640 --> 00:05:01,159 Speaker 4: practice for. 99 00:05:01,680 --> 00:05:03,799 Speaker 1: And I think the majority of these robots were actually 100 00:05:03,920 --> 00:05:06,840 Speaker 1: remote controlled. But a big issue was stamina. That doesn't 101 00:05:06,880 --> 00:05:08,560 Speaker 1: mean they need to take a water break or catch their 102 00:05:08,600 --> 00:05:10,839 Speaker 1: breath like we do, but these robots run on 103 00:05:10,880 --> 00:05:13,960 Speaker 1: batteries which need to be recharged every two hours, and 104 00:05:14,000 --> 00:05:16,200 Speaker 1: the race was, what, over three hours long. So at 105 00:05:16,200 --> 00:05:18,240 Speaker 1: different points in the race, the robots were stopped because 106 00:05:18,240 --> 00:05:20,160 Speaker 1: they ran out of power and had to swap batteries 107 00:05:20,160 --> 00:05:21,440 Speaker 1: in order to keep going, which. 108 00:05:21,279 --> 00:05:23,240 Speaker 4: Is just like a pit stop at a NASCAR race.
109 00:05:23,680 --> 00:05:26,000 Speaker 4: Let me just clarify when you say a group of 110 00:05:26,120 --> 00:05:29,880 Speaker 4: humanoid robots. Listeners might think this is a robot army, 111 00:05:29,960 --> 00:05:32,080 Speaker 4: you know, it's like very Star Wars, where they all 112 00:05:32,120 --> 00:05:34,279 Speaker 4: look the same and run at an identical pace. But 113 00:05:34,320 --> 00:05:37,480 Speaker 4: that was not what happened here. It was a melee. 114 00:05:37,760 --> 00:05:40,760 Speaker 1: Have you seen when dog walkers are 115 00:05:40,760 --> 00:05:43,000 Speaker 1: in Central Park and you have every type of dog 116 00:05:43,080 --> 00:05:45,159 Speaker 1: from big to small, from crazy to calm? 117 00:05:45,200 --> 00:05:46,520 Speaker 3: That's how we lose people. 118 00:05:46,720 --> 00:05:48,760 Speaker 1: Well, this was that, but with robots. It was a melee 119 00:05:49,040 --> 00:05:52,880 Speaker 1: of different shapes, different sizes, different herky-jerky robots. I 120 00:05:52,920 --> 00:05:55,040 Speaker 1: think in some sense, you know, there was a competition 121 00:05:55,120 --> 00:05:59,720 Speaker 1: amongst the robot manufacturers. Some fared better than others. On 122 00:05:59,720 --> 00:06:03,040 Speaker 1: one end, you have the front-runner, Tiangong Ultra, 123 00:06:03,200 --> 00:06:05,760 Speaker 1: which stood at five foot nine and only fell over 124 00:06:06,120 --> 00:06:08,880 Speaker 1: once, when its battery failed. It did change its battery 125 00:06:08,920 --> 00:06:11,680 Speaker 1: three times, but Tiangong ran at about six miles an 126 00:06:11,720 --> 00:06:15,080 Speaker 1: hour across different terrains including hills, stairs, grass and sand. 127 00:06:15,480 --> 00:06:20,159 Speaker 1: Others weren't so lucky. One humanoid with propellers went off course, 128 00:06:20,520 --> 00:06:24,320 Speaker 1: slammed into a fence and broke into pieces. It sounds comedic; 129 00:06:24,440 --> 00:06:27,120 Speaker 1: it was actually kind of scary to think of a two-hundred-pound 130 00:06:27,400 --> 00:06:30,719 Speaker 1: metallic humanoid charging its way into a crowd of people. 131 00:06:30,800 --> 00:06:33,640 Speaker 1: I have to tell you, some were barely able to walk, 132 00:06:33,760 --> 00:06:38,039 Speaker 1: much less run. One humanoid robot named Juan Juan actually 133 00:06:38,080 --> 00:06:40,000 Speaker 1: went in the wrong direction for a little bit before 134 00:06:40,040 --> 00:06:43,120 Speaker 1: sitting down and refusing to go further, which was like 135 00:06:43,200 --> 00:06:45,320 Speaker 1: me at sports day as a child. Yes, and you too. 136 00:06:45,320 --> 00:06:46,920 Speaker 3: I think I was going to say we are all. 137 00:06:46,880 --> 00:06:49,200 Speaker 1: One on this podcast, where we are one. 138 00:06:49,440 --> 00:06:51,280 Speaker 3: Sure, it's very relatable.
139 00:06:51,320 --> 00:06:54,800 Speaker 4: And while Tiangong Ultra was the first humanoid to 140 00:06:54,839 --> 00:06:57,559 Speaker 4: cross the finish line after about two hours and forty minutes, 141 00:06:57,839 --> 00:07:00,680 Speaker 4: a human runner finished the race over an hour and 142 00:07:00,720 --> 00:07:03,640 Speaker 4: a half before that. So only two humanoids crossed the 143 00:07:03,640 --> 00:07:06,640 Speaker 4: finish line within the race's original cutoff time, which got 144 00:07:06,680 --> 00:07:09,039 Speaker 4: extended to four hours and ten minutes because most of 145 00:07:09,080 --> 00:07:11,600 Speaker 4: them just simply weren't going to make it. If I 146 00:07:11,640 --> 00:07:13,679 Speaker 4: was running a marathon, this would also happen. 147 00:07:13,800 --> 00:07:16,760 Speaker 1: By the way, it's funny to laugh at the missteps 148 00:07:16,800 --> 00:07:20,239 Speaker 1: of these robots. But remember how, when we were first 149 00:07:20,280 --> 00:07:22,720 Speaker 1: playing around with generative AI, you know, it was 150 00:07:22,760 --> 00:07:24,600 Speaker 1: impressive that it could write anything at all, but the 151 00:07:24,640 --> 00:07:29,040 Speaker 1: writing was completely unserviceable. So, you know, I think that 152 00:07:29,080 --> 00:07:30,840 Speaker 1: what we may be seeing here is a 153 00:07:30,920 --> 00:07:33,840 Speaker 1: kind of leading edge of, in fact, an explosion in 154 00:07:33,920 --> 00:07:39,480 Speaker 1: humanoid robots, because these robots are intertwined deeply with the 155 00:07:39,480 --> 00:07:43,040 Speaker 1: development of AI. Robots have different parts, but you could 156 00:07:43,120 --> 00:07:47,120 Speaker 1: say that the brain is the AI, which allows the 157 00:07:47,200 --> 00:07:51,320 Speaker 1: robots to learn from patterns and mimic human behavior in 158 00:07:51,360 --> 00:07:55,480 Speaker 1: real time. And according to analysts at Goldman Sachs, DeepSeek's 159 00:07:55,560 --> 00:07:58,360 Speaker 1: R1 model may have changed the game for 160 00:07:58,440 --> 00:08:02,320 Speaker 1: Chinese robotics companies, because now top-level AI performance is 161 00:08:02,360 --> 00:08:05,800 Speaker 1: possible with fewer advanced chips and less computing power, which 162 00:08:05,800 --> 00:08:08,360 Speaker 1: makes it easier to distribute across a society. 163 00:08:08,760 --> 00:08:10,480 Speaker 4: So is it safe to say that this race was 164 00:08:10,560 --> 00:08:13,600 Speaker 4: less between humans and robots, which it clearly was not, 165 00:08:14,080 --> 00:08:16,360 Speaker 4: and more between the US and China? 166 00:08:16,960 --> 00:08:21,160 Speaker 1: Well, Karah, that's why they pay the big bucks. I agree. 167 00:08:21,280 --> 00:08:23,720 Speaker 1: I agree. There is a strong national pride element to 168 00:08:23,760 --> 00:08:27,000 Speaker 1: the Chinese robotics sector and seemingly a desire to integrate 169 00:08:27,080 --> 00:08:30,200 Speaker 1: robots into more and more human activities, such as this marathon. 170 00:08:30,440 --> 00:08:33,599 Speaker 1: There are also some interesting robot-human dance-off videos that 171 00:08:33,640 --> 00:08:34,520 Speaker 1: you can find online. 172 00:08:34,559 --> 00:08:35,080 Speaker 3: I've seen them. 173 00:08:35,280 --> 00:08:38,400 Speaker 1: They're well worth checking out.
Moving on to another story 174 00:08:38,520 --> 00:08:40,800 Speaker 1: that could be straight out of a sci-fi novel, 175 00:08:41,000 --> 00:08:45,240 Speaker 1: but unfortunately is not. In partnership with Wired magazine, 176 00:08:45,320 --> 00:08:47,840 Speaker 1: 404 Media reported on a tool that some police 177 00:08:47,840 --> 00:08:51,520 Speaker 1: departments near the southern border are paying quite a bit 178 00:08:51,559 --> 00:08:55,559 Speaker 1: of money to use. The product is called Overwatch, as 179 00:08:55,600 --> 00:08:58,440 Speaker 1: in the shoot-'em-up video game, naturally, and it 180 00:08:58,480 --> 00:09:01,240 Speaker 1: comes from a company called Massive Blue. 181 00:09:01,640 --> 00:09:02,720 Speaker 3: Only boys can name this. 182 00:09:02,880 --> 00:09:04,920 Speaker 4: I'm sorry, I'm not saying women can't, but it sounds 183 00:09:04,920 --> 00:09:06,360 Speaker 4: like something that only boys can name. 184 00:09:06,559 --> 00:09:09,439 Speaker 1: I'm with you on this one, unfortunately. Overwatch is marketed 185 00:09:09,480 --> 00:09:13,760 Speaker 1: as a product that quote deploys lifelike virtual agents which 186 00:09:13,840 --> 00:09:19,040 Speaker 1: infiltrate and engage criminal networks across various channels. Basically, Massive 187 00:09:19,080 --> 00:09:22,400 Speaker 1: Blue is offering cops virtual personas that can be unleashed 188 00:09:22,440 --> 00:09:27,120 Speaker 1: on the Internet and used to interact with quote college protesters, quote 189 00:09:27,280 --> 00:09:31,960 Speaker 1: radicalized political activists, and suspected drug and human traffickers over 190 00:09:32,000 --> 00:09:33,600 Speaker 1: text messages and social media. 191 00:09:34,160 --> 00:09:39,320 Speaker 4: And one such example of an Overwatch AI persona is Heidi, 192 00:09:39,880 --> 00:09:42,400 Speaker 4: and Massive Blue described her in a presentation to the 193 00:09:42,440 --> 00:09:47,359 Speaker 4: Texas Department of Public Safety as quote a radicalized AI persona. 194 00:09:47,720 --> 00:09:50,480 Speaker 4: Here's her backstory. You ready? Yeah, ready. Oh my god, 195 00:09:50,480 --> 00:09:54,280 Speaker 4: it's me. She's a thirty-six-year-old, oh my god, childless, 196 00:09:54,320 --> 00:09:58,200 Speaker 4: divorced woman who lives in Texas. But this is when 197 00:09:58,200 --> 00:10:01,120 Speaker 4: she got radicalized: she was raised in San Francisco. 198 00:10:01,440 --> 00:10:02,080 Speaker 1: Oh my god. 199 00:10:02,320 --> 00:10:06,559 Speaker 4: Her hobbies are activism and baking, and her personality is 200 00:10:06,640 --> 00:10:11,520 Speaker 4: quote outspoken, lonely, and body positive. This is the greatest 201 00:10:11,520 --> 00:10:12,880 Speaker 4: thing I've ever heard in my entire life. 202 00:10:13,200 --> 00:10:16,959 Speaker 1: This is summoned from the nightmares of JD Vance. 203 00:10:17,040 --> 00:10:19,920 Speaker 4: Literally. Like, these are the threats to American society: 204 00:10:20,120 --> 00:10:22,280 Speaker 4: a body-positive, childless woman. 205 00:10:22,120 --> 00:10:24,120 Speaker 1: Body positive, lonely, outspoken. 206 00:10:24,160 --> 00:10:26,400 Speaker 4: By the way, she's going to get five thousand: if 207 00:10:26,440 --> 00:10:30,720 Speaker 4: Heidi keeps up with this body-positive, childless behavior, the 208 00:10:30,760 --> 00:10:33,120 Speaker 4: Trump administration might give her five thousand dollars to have 209 00:10:33,120 --> 00:10:33,560 Speaker 4: a child.
210 00:10:34,840 --> 00:10:38,280 Speaker 1: So maybe the unintended consequence of a legion of Heidis 211 00:10:38,400 --> 00:10:42,440 Speaker 1: roaming the internet will be bankrupting the American government 212 00:10:42,480 --> 00:10:46,560 Speaker 1: by taking advantage of the credits to have children. Exactly. This 213 00:10:46,760 --> 00:10:50,680 Speaker 1: is obviously a truly weird story, but the purpose of 214 00:10:50,720 --> 00:10:54,360 Speaker 1: personas like Heidi is to interact with, and of course, 215 00:10:54,400 --> 00:10:58,560 Speaker 1: perhaps we might say entrap, real suspects on social media. 216 00:10:59,120 --> 00:11:02,040 Speaker 1: The idea is, by talking to these potential suspects, Heidi 217 00:11:02,120 --> 00:11:05,200 Speaker 1: and other AI personas can gather evidence on them to 218 00:11:05,240 --> 00:11:08,440 Speaker 1: be used by police departments. Sorry, I shouldn't laugh. 219 00:11:08,720 --> 00:11:09,960 Speaker 1: I mean, it's bizarre. 220 00:11:10,080 --> 00:11:12,400 Speaker 4: It's the most bizarre thing. Also that, like, these are 221 00:11:12,440 --> 00:11:16,040 Speaker 4: the threats to American civilization. Like, to your point, like, 222 00:11:16,080 --> 00:11:17,920 Speaker 4: an AI cat lady, we don't even know if she has 223 00:11:17,920 --> 00:11:20,199 Speaker 4: a cat, because her cat could be a bigger threat. 224 00:11:20,400 --> 00:11:23,680 Speaker 4: But that she's outspoken and body positive is what's so 225 00:11:23,760 --> 00:11:26,440 Speaker 4: crazy to me. I do need to point out that 226 00:11:26,520 --> 00:11:29,920 Speaker 4: it is legal to protest in the United States, so 227 00:11:29,960 --> 00:11:32,000 Speaker 4: it's a bit odd that one of the types of 228 00:11:32,000 --> 00:11:36,480 Speaker 4: AI personas is a college protester. But the idea of 229 00:11:36,559 --> 00:11:39,800 Speaker 4: law enforcement using fake personas to catch suspects isn't new. 230 00:11:40,120 --> 00:11:43,840 Speaker 4: Officers have made profiles claiming to be teenagers to gather 231 00:11:43,920 --> 00:11:45,880 Speaker 4: information on child predators. 232 00:11:46,320 --> 00:11:49,040 Speaker 1: Now, this is like if you turned To Catch a 233 00:11:49,040 --> 00:11:52,720 Speaker 1: Predator into an AI and then turned that AI into 234 00:11:52,720 --> 00:11:54,720 Speaker 1: a product and then sold it to police departments. 235 00:11:54,760 --> 00:11:56,839 Speaker 3: That's exactly, that's exactly right. 236 00:11:57,000 --> 00:11:59,719 Speaker 1: Using this software, instead of having to log on and 237 00:11:59,720 --> 00:12:02,680 Speaker 1: make a fake profile, message a suspect back and forth, 238 00:12:02,840 --> 00:12:04,480 Speaker 1: and hope they don't catch on to what you're doing, 239 00:12:04,920 --> 00:12:07,360 Speaker 1: Overwatch does a lot of that work for you. So 240 00:12:07,520 --> 00:12:09,960 Speaker 1: another Overwatch profile that police departments can use is a 241 00:12:10,160 --> 00:12:15,240 Speaker 1: child trafficking AI persona, also known as Jason. Jason is 242 00:12:15,240 --> 00:12:17,880 Speaker 1: fourteen years old, likes video games, and has a hard 243 00:12:17,880 --> 00:12:19,120 Speaker 1: time interacting with girls. 244 00:12:19,160 --> 00:12:20,720 Speaker 3: He's an incel, or just a normal. 245 00:12:20,559 --> 00:12:22,480 Speaker 1: Fourteen-year-old boy.
To be fair to him, I 246 00:12:22,520 --> 00:12:24,400 Speaker 1: wasn't such a big video gamer, but I certainly had 247 00:12:24,400 --> 00:12:28,520 Speaker 1: those other two characteristics. In the presentation obtained by 248 00:12:28,640 --> 00:12:30,960 Speaker 1: 404 Media, there is a screenshot of a sample 249 00:12:31,000 --> 00:12:34,679 Speaker 1: text exchange between Jason and someone who is presumably a 250 00:12:34,679 --> 00:12:39,240 Speaker 1: predatory adult, asking if he's alone. Jason replies to the message, quote, 251 00:12:39,760 --> 00:12:43,280 Speaker 1: just chillin by myself man. My mom's @ 252 00:12:43,600 --> 00:12:46,880 Speaker 1: work and my dad's out of town, so it's just 253 00:12:47,000 --> 00:12:50,199 Speaker 1: me and my vid games, video game controller emoji. 254 00:12:50,480 --> 00:12:52,360 Speaker 4: I would have already been under a dictionary. I wouldn't 255 00:12:52,360 --> 00:12:54,680 Speaker 4: even be able to talk to this. You know, we're 256 00:12:54,720 --> 00:12:57,319 Speaker 4: not clear if this is a real exchange. We do 257 00:12:57,440 --> 00:12:59,920 Speaker 4: know that Massive Blue signed a three hundred and sixty 258 00:13:00,160 --> 00:13:04,160 Speaker 4: thousand dollar contract with Pinal County in Arizona, and 259 00:13:04,200 --> 00:13:07,320 Speaker 4: the county used an anti-human-trafficking grant to pay 260 00:13:07,360 --> 00:13:10,600 Speaker 4: for this exchange. The Pinal County Sheriff's Office did tell 261 00:13:10,640 --> 00:13:13,360 Speaker 4: 404 Media that Overwatch has so far not 262 00:13:13,400 --> 00:13:14,680 Speaker 4: been used for any arrests. 263 00:13:14,760 --> 00:13:16,400 Speaker 3: It is used for a lot of laughs. 264 00:13:17,080 --> 00:13:19,200 Speaker 1: Yeah, I'm curious if the fact it hasn't been used 265 00:13:19,200 --> 00:13:21,280 Speaker 1: for any arrests is considered to be a good thing 266 00:13:21,400 --> 00:13:22,080 Speaker 1: or a bad thing. 267 00:13:22,200 --> 00:13:24,120 Speaker 3: That's what I was going to say. Yeah, exactly. 268 00:13:24,320 --> 00:13:26,719 Speaker 1: So, Karah, while we're on the topic of what kind 269 00:13:26,760 --> 00:13:30,280 Speaker 1: of information can be gathered about people online, you've got 270 00:13:30,320 --> 00:13:32,920 Speaker 1: a headline for us about removing yourself from the Internet. 271 00:13:33,240 --> 00:13:33,440 Speaker 3: Yeah. 272 00:13:33,440 --> 00:13:35,720 Speaker 4: So the Wall Street Journal ran an article with the headline 273 00:13:36,000 --> 00:13:38,920 Speaker 4: go delete yourself from the Internet, and it talks about 274 00:13:38,920 --> 00:13:42,480 Speaker 4: how Google recently updated its Results About You feature, and 275 00:13:42,520 --> 00:13:45,800 Speaker 4: you can plug in details like your name, various email addresses, 276 00:13:45,840 --> 00:13:49,080 Speaker 4: phone numbers, or street addresses, and Google will literally show 277 00:13:49,080 --> 00:13:51,120 Speaker 4: you what kind of things about you are out there 278 00:13:51,160 --> 00:13:51,760 Speaker 4: on the internet. 279 00:13:52,120 --> 00:13:53,760 Speaker 1: Yeah. I looked at this and I was like, hmm, it 280 00:13:53,880 --> 00:13:55,880 Speaker 1: is basically asking me for all of my personal data 281 00:13:55,920 --> 00:13:58,080 Speaker 1: to run a search on what personal data exists on 282 00:13:58,080 --> 00:14:00,720 Speaker 1: the Internet.
So, of course, I did it anyway. 283 00:14:00,920 --> 00:14:03,360 Speaker 1: And indeed, I think this is less about 284 00:14:03,400 --> 00:14:05,840 Speaker 1: a kind of cursory Google search and more of them 285 00:14:05,880 --> 00:14:09,280 Speaker 1: doing, like, a deep web search to reveal what exists 286 00:14:09,320 --> 00:14:10,600 Speaker 1: about you online. Is that true? 287 00:14:10,800 --> 00:14:12,920 Speaker 4: Yes, it's things like where you live, if you've ever 288 00:14:12,920 --> 00:14:16,560 Speaker 4: gotten a speeding ticket, magazine subscriptions, which is very risky 289 00:14:16,600 --> 00:14:18,920 Speaker 4: for me. And these little bits of data can add 290 00:14:19,000 --> 00:14:21,560 Speaker 4: up to a pretty substantial amount of information about you 291 00:14:21,800 --> 00:14:23,960 Speaker 4: that's just floating around. You know, it can expose you 292 00:14:23,960 --> 00:14:28,120 Speaker 4: to annoyances like getting more junk mail, or scarier situations 293 00:14:28,160 --> 00:14:32,040 Speaker 4: like identity theft or doxing or receiving unwanted flowers. 294 00:14:32,520 --> 00:14:34,480 Speaker 1: Has that ever happened to you? I can't talk about 295 00:14:34,520 --> 00:14:38,440 Speaker 1: it. And not to mention, data brokers scrape this stuff, 296 00:14:38,520 --> 00:14:40,920 Speaker 1: package it, and then sell it on as a dossier. 297 00:14:41,240 --> 00:14:44,040 Speaker 1: We're talking about very personal stuff here, like license plate 298 00:14:44,120 --> 00:14:47,280 Speaker 1: numbers of your vehicles, lists of your family members, and 299 00:14:47,320 --> 00:14:49,360 Speaker 1: the dossiers are getting more and more detailed as time 300 00:14:49,400 --> 00:14:51,760 Speaker 1: goes on and more data becomes available. 301 00:14:51,800 --> 00:14:53,920 Speaker 4: Yes. So just to be clear, Google's Results About You 302 00:14:54,040 --> 00:14:58,160 Speaker 4: only provides information. There's a separate request process for removing 303 00:14:58,200 --> 00:15:00,840 Speaker 4: things from search results, and there are also other 304 00:15:00,920 --> 00:15:04,520 Speaker 4: services you can pay for to proactively remove personal data 305 00:15:04,520 --> 00:15:05,400 Speaker 4: from the Internet. 306 00:15:05,680 --> 00:15:08,040 Speaker 1: One thing I can't stop thinking about is that deleting 307 00:15:08,080 --> 00:15:10,120 Speaker 1: yourself from the Internet is not something you can do 308 00:15:10,200 --> 00:15:12,040 Speaker 1: once and then it's done forever. I mean, it's like 309 00:15:12,080 --> 00:15:13,520 Speaker 1: a never-ending task. 310 00:15:13,640 --> 00:15:16,000 Speaker 4: It is a never-ending story, and it's ongoing for 311 00:15:16,040 --> 00:15:18,400 Speaker 4: a couple of reasons. First, you know, if you don't 312 00:15:18,400 --> 00:15:21,120 Speaker 4: live in a state with data privacy laws, like California, 313 00:15:21,160 --> 00:15:24,360 Speaker 4: that require, for example, people-search sites to take down 314 00:15:24,400 --> 00:15:28,480 Speaker 4: your data upon request, that information might just stay up there. 315 00:15:28,880 --> 00:15:32,560 Speaker 4: And second, data that's been taken down can reappear. And 316 00:15:33,120 --> 00:15:35,840 Speaker 4: I think this is an unfortunate part of being in 317 00:15:35,880 --> 00:15:38,240 Speaker 4: the modern world, is that your data can just 318 00:15:38,280 --> 00:15:38,920 Speaker 4: be out there.
319 00:15:39,120 --> 00:15:40,680 Speaker 1: I mean, I feel bad for people who live in 320 00:15:40,720 --> 00:15:45,080 Speaker 1: states where mugshot databases are automatically uploaded to the Internet. I mean, 321 00:15:45,120 --> 00:15:48,200 Speaker 1: I find that so deeply unfair, that you're kind of 322 00:15:48,240 --> 00:15:51,760 Speaker 1: haunted by your mugshot forever. Even if, you know, the sheriff's 323 00:15:51,760 --> 00:15:54,200 Speaker 1: department takes it down subsequently, it will never leave 324 00:15:54,240 --> 00:15:57,280 Speaker 1: the Internet. Also, it's one of those things where, you know, 325 00:15:57,440 --> 00:15:59,240 Speaker 1: I know we have to be much more 326 00:15:59,280 --> 00:16:02,760 Speaker 1: careful with data privacy, but it's just so hard 327 00:16:02,800 --> 00:16:05,000 Speaker 1: to actually do it. I mean, it's one of those 328 00:16:05,040 --> 00:16:07,760 Speaker 1: things where it seems, until something bad happens to you, 329 00:16:08,400 --> 00:16:11,120 Speaker 1: it's almost impossible to motivate yourself to take more care. 330 00:16:11,240 --> 00:16:13,120 Speaker 4: And also, you and I always talk about this, like, 331 00:16:13,200 --> 00:16:15,520 Speaker 4: what is to be gained by giving away my data? 332 00:16:15,680 --> 00:16:18,280 Speaker 4: Quite a bit. What is lost by giving away my 333 00:16:18,360 --> 00:16:21,600 Speaker 4: data? Quite a bit. But I don't care as much. 334 00:16:21,800 --> 00:16:23,160 Speaker 1: Yeah, I think there are things you can do at 335 00:16:23,200 --> 00:16:25,560 Speaker 1: the margins, like make a fake email address or a 336 00:16:25,560 --> 00:16:27,920 Speaker 1: burner email address for every time you sign up for 337 00:16:27,960 --> 00:16:31,560 Speaker 1: a site online, have a different password every time, you know. 338 00:16:31,760 --> 00:16:35,000 Speaker 1: Those are steps you can take that will be substantially protective. 339 00:16:38,200 --> 00:16:40,360 Speaker 1: It's time for a quick break. Now, when we come back, 340 00:16:40,360 --> 00:16:43,280 Speaker 1: we'll run through the short headlines and speak with Jeff Rosenthal, 341 00:16:43,800 --> 00:16:47,560 Speaker 1: co-founder of the VC firm CIV, about investing in 342 00:16:47,720 --> 00:17:03,720 Speaker 1: energy infrastructure. Karah, we're back, and it's time to get 343 00:17:03,720 --> 00:17:04,600 Speaker 1: into the shorties. 344 00:17:06,720 --> 00:17:08,399 Speaker 3: I like that. 345 00:17:07,560 --> 00:17:12,720 Speaker 1: I'll start with one of my favorite publications, The Aviationist. 346 00:17:13,080 --> 00:17:13,879 Speaker 3: Do you really read that? 347 00:17:14,920 --> 00:17:17,520 Speaker 1: You read it online? I read it online. They have 348 00:17:17,600 --> 00:17:21,479 Speaker 1: a story with the headline British Army radio wave weapon 349 00:17:21,840 --> 00:17:26,720 Speaker 1: cooks multiple drone swarms simultaneously. If you can write headlines 350 00:17:26,760 --> 00:17:28,920 Speaker 1: like that, you deserve to be anybody's favorite publication. 351 00:17:29,119 --> 00:17:30,960 Speaker 3: Peabody, babe. Peabody. 352 00:17:31,280 --> 00:17:34,400 Speaker 1: So the British Army is testing a new air defense 353 00:17:34,480 --> 00:17:38,280 Speaker 1: weapon that can simultaneously destroy more than one hundred drones 354 00:17:38,440 --> 00:17:42,320 Speaker 1: in one deployment.
The so-called RapidDestroyer, we Brits 355 00:17:42,359 --> 00:17:46,679 Speaker 1: are so good with names, damages or disrupts critical electronic components 356 00:17:46,800 --> 00:17:51,480 Speaker 1: inside drones using just radio waves at high frequencies. This 357 00:17:51,600 --> 00:17:54,600 Speaker 1: causes the drones to crash or malfunction during flight, and 358 00:17:54,640 --> 00:17:58,080 Speaker 1: it could actually be a transformative defensive weapon for conflicts 359 00:17:58,080 --> 00:18:01,800 Speaker 1: like the war in Ukraine. According to the UK's Defence Intelligence, 360 00:18:02,240 --> 00:18:05,000 Speaker 1: Ukraine had to defend against attacks from more than eighteen 361 00:18:05,119 --> 00:18:06,920 Speaker 1: thousand drones in twenty twenty four. 362 00:18:07,520 --> 00:18:09,520 Speaker 4: So, bad news for members of Gen Alpha trying to 363 00:18:09,560 --> 00:18:12,840 Speaker 4: pose as adults on Instagram. The Verge reports that Meta 364 00:18:12,960 --> 00:18:16,919 Speaker 4: is taking its AI-driven age detection to the next level. 365 00:18:17,359 --> 00:18:19,480 Speaker 4: If a user claims to be an adult, meaning they 366 00:18:19,520 --> 00:18:22,159 Speaker 4: lie about their birth year, but then an AI tool 367 00:18:22,640 --> 00:18:25,680 Speaker 4: finds evidence that they are in fact a teenager, evidence 368 00:18:25,760 --> 00:18:29,919 Speaker 4: like messages from friends saying happy sixteenth birthday, that's 369 00:18:29,920 --> 00:18:33,200 Speaker 4: a big smoking gun, Meta will automatically switch that user 370 00:18:33,240 --> 00:18:35,480 Speaker 4: to a teen account. So watch out who your friends are. 371 00:18:35,640 --> 00:18:37,080 Speaker 3: These accounts are more restrictive. 372 00:18:37,119 --> 00:18:40,440 Speaker 4: For example, they're private by default, and Instagram limits the 373 00:18:40,520 --> 00:18:41,800 Speaker 4: kind of content teens see. 374 00:18:42,320 --> 00:18:45,960 Speaker 1: Here's one for all the people still ignoring cryptocurrency. The 375 00:18:46,000 --> 00:18:48,840 Speaker 1: Wall Street Journal has a story under the headline Crypto 376 00:18:48,920 --> 00:18:51,640 Speaker 1: knocks on the door of a banking world that shut it out, 377 00:18:52,160 --> 00:18:54,800 Speaker 1: and it reports that some crypto firms plan to apply 378 00:18:54,960 --> 00:18:58,120 Speaker 1: for bank charters or licenses, allowing them to operate more 379 00:18:58,160 --> 00:19:00,880 Speaker 1: like traditional lenders that can take deposits and make loans. 380 00:19:01,480 --> 00:19:04,240 Speaker 1: This comes just a few years after the FTX collapse, 381 00:19:04,320 --> 00:19:07,679 Speaker 1: when many major banks cut ties with crypto firms. But 382 00:19:07,880 --> 00:19:11,560 Speaker 1: with a bank license comes stricter regulation. At least for now, 383 00:19:12,040 --> 00:19:15,280 Speaker 1: all eyes are on the crypto-friendly Trump administration to 384 00:19:15,320 --> 00:19:17,560 Speaker 1: see whether the crypto companies will have to play by 385 00:19:17,560 --> 00:19:19,720 Speaker 1: the same rules in order to get these licenses. 386 00:19:20,240 --> 00:19:23,200 Speaker 4: And lastly, I wanted to highlight a new startup with 387 00:19:23,280 --> 00:19:28,479 Speaker 4: an unfathomable mission: to achieve quote the full automation of 388 00:19:28,520 --> 00:19:31,879 Speaker 4: all work, so to basically achieve humans having to do 389 00:19:31,960 --> 00:19:34,639 Speaker 4: no work.
This story is from TechCrunch, and the 390 00:19:34,680 --> 00:19:39,000 Speaker 4: company, called Mechanize, intends to research the limitations of AI 391 00:19:39,160 --> 00:19:43,280 Speaker 4: because they believe quote the explosive economic growth likely to 392 00:19:43,320 --> 00:19:48,239 Speaker 4: result from completely automating labor could generate vast abundance, much 393 00:19:48,320 --> 00:19:51,240 Speaker 4: higher standards of living, and new goods and services that 394 00:19:51,280 --> 00:19:55,680 Speaker 4: we can't even imagine today. Unsurprisingly, the announcement has received 395 00:19:55,720 --> 00:19:59,360 Speaker 4: considerable disdain and backlash on social media, but the company 396 00:19:59,359 --> 00:20:03,280 Speaker 4: and its affiliated nonprofit AI research organization actually have significant 397 00:20:03,320 --> 00:20:06,640 Speaker 4: backing from industry insiders, and they want you to know 398 00:20:07,280 --> 00:20:09,960 Speaker 4: that they are hiring humans. 399 00:20:09,960 --> 00:20:13,760 Speaker 1: Not fully mechanized yet. Not yet. It is interesting, though, 400 00:20:13,960 --> 00:20:17,040 Speaker 1: how quickly things which seem like pipe dreams of the 401 00:20:17,040 --> 00:20:21,840 Speaker 1: tech industry become the bread and butter of our daily lives. 402 00:20:21,920 --> 00:20:25,640 Speaker 1: I mean, think no further than generative AI, ChatGPT. 403 00:20:25,800 --> 00:20:26,840 Speaker 3: It's all I can hear about. 404 00:20:26,920 --> 00:20:30,520 Speaker 4: It's all people focus on, and even people who don't 405 00:20:30,560 --> 00:20:32,560 Speaker 4: know I host a tech podcast are talking to me 406 00:20:32,600 --> 00:20:36,200 Speaker 4: about it. I'm hearing stories at book club, at my Passover seder, 407 00:20:36,440 --> 00:20:39,080 Speaker 4: in all of my group chats. Everyone wants to tell 408 00:20:39,119 --> 00:20:41,320 Speaker 4: me ways in which they are using generative AI. 409 00:20:41,960 --> 00:20:45,119 Speaker 1: Have there been any particularly striking usages that kind of 410 00:20:45,240 --> 00:20:45,760 Speaker 1: stuck with you? 411 00:20:46,000 --> 00:20:48,800 Speaker 4: I actually was able to do a palm reading on 412 00:20:48,840 --> 00:20:52,000 Speaker 4: my mother. I'm sorry, Mom, for giving ChatGPT your palm. 413 00:20:52,119 --> 00:20:54,960 Speaker 4: We're gonna have to look into that on Google. 414 00:20:54,760 --> 00:20:56,720 Speaker 1: ChatGPT will say yes to a request to do a 415 00:20:56,760 --> 00:20:57,200 Speaker 1: palm reading? 416 00:20:57,280 --> 00:20:59,400 Speaker 4: Oh, absolutely. And then it gives you all of your 417 00:20:59,400 --> 00:21:02,199 Speaker 4: different life lines and your bumps on your hand and 418 00:21:02,240 --> 00:21:04,680 Speaker 4: all of that. And then there's the lowest-hanging fruit, 419 00:21:04,720 --> 00:21:06,600 Speaker 4: which is, what do I text back to the guy 420 00:21:06,600 --> 00:21:09,440 Speaker 4: from Hinge, which seems to be the most popular use. 421 00:21:09,880 --> 00:21:12,680 Speaker 1: Yeah. I love the idea of AI palmistry. I mean, 422 00:21:12,800 --> 00:21:15,600 Speaker 1: all of these industries that would have thought, when the 423 00:21:15,600 --> 00:21:17,840 Speaker 1: writers were striking in Hollywood two years ago, this will 424 00:21:17,840 --> 00:21:18,600 Speaker 1: never happen to me. 425 00:21:19,080 --> 00:21:19,800 Speaker 3: Of all of it, I. 426 00:21:19,800 --> 00:21:22,440 Speaker 1: Should have laughed.
But literally, really, a palm reading with AI 427 00:21:22,680 --> 00:21:23,240 Speaker 1: is insane. 428 00:21:23,320 --> 00:21:25,960 Speaker 4: We're going to see all those storefronts in the West 429 00:21:26,040 --> 00:21:29,119 Speaker 4: Village and Greenwich Village just shutting down because ChatGPT has 430 00:21:29,160 --> 00:21:30,600 Speaker 4: taken their jobs. 431 00:21:30,240 --> 00:21:33,560 Speaker 1: Just computer screens and palm readers. Yeah. Something I think 432 00:21:33,560 --> 00:21:36,880 Speaker 1: back to often is my conversation with Azeem Azhar back 433 00:21:36,880 --> 00:21:40,000 Speaker 1: in March. He writes the Exponential View newsletter, and he 434 00:21:40,040 --> 00:21:42,640 Speaker 1: talks about the adoption of AI. Back in March, he'd 435 00:21:42,640 --> 00:21:45,800 Speaker 1: just given a presentation at South by Southwest about the 436 00:21:45,800 --> 00:21:50,840 Speaker 1: intersection between AI and energy demand, specifically how much energy 437 00:21:50,960 --> 00:21:53,639 Speaker 1: is needed to power all this new usage, and he 438 00:21:53,680 --> 00:21:55,480 Speaker 1: said something that really stuck with me, which is that 439 00:21:56,000 --> 00:22:00,000 Speaker 1: US AI dominance could be challenged by how quickly this country 440 00:22:00,080 --> 00:22:01,960 Speaker 1: can bring new power onto the grid. 441 00:22:02,280 --> 00:22:05,000 Speaker 4: Yeah, and the tech companies are addressing this head on. 442 00:22:05,600 --> 00:22:07,760 Speaker 4: Microsoft made a deal with the owners of the Three 443 00:22:07,880 --> 00:22:10,800 Speaker 4: Mile Island nuclear plant, which suffered a meltdown in the 444 00:22:10,880 --> 00:22:15,280 Speaker 4: nineteen seventies, to reopen it, and in return, Microsoft committed 445 00:22:15,320 --> 00:22:18,200 Speaker 4: to paying almost double market energy rates over a twenty 446 00:22:18,280 --> 00:22:19,040 Speaker 4: year contract. 447 00:22:19,440 --> 00:22:21,920 Speaker 1: So obviously energy is absolutely critical to the future of 448 00:22:21,960 --> 00:22:26,000 Speaker 1: the tech industry, and as we know, US private money 449 00:22:26,119 --> 00:22:29,639 Speaker 1: increasingly plays a huge role in tech buildout. And just 450 00:22:29,680 --> 00:22:31,760 Speaker 1: look at Stargate, which we're both fascinated by. 451 00:22:31,880 --> 00:22:33,720 Speaker 4: I don't know if I'd say fascinated, but I'm definitely 452 00:22:33,720 --> 00:22:34,720 Speaker 4: interested in Stargate. 453 00:22:35,040 --> 00:22:38,840 Speaker 1: This week, per Bloomberg, quote, a new venture capital firm 454 00:22:38,960 --> 00:22:42,000 Speaker 1: called CIV has raised an inaugural fund of two hundred 455 00:22:42,040 --> 00:22:45,840 Speaker 1: million dollars to invest in startups tackling projects like nuclear 456 00:22:45,960 --> 00:22:50,240 Speaker 1: energy and manufacturing, joining a broader movement in Silicon Valley 457 00:22:50,440 --> 00:22:54,920 Speaker 1: to back physical-world companies with national implications. Here to tell 458 00:22:54,960 --> 00:22:58,200 Speaker 1: us more is Jeff Rosenthal, the co-founder and managing 459 00:22:58,280 --> 00:23:01,200 Speaker 1: partner of CIV. Jeff, thanks so much for being here today. 460 00:23:01,280 --> 00:23:02,920 Speaker 2: Thanks so much for having me, Oz. It's great to 461 00:23:02,960 --> 00:23:03,240 Speaker 2: be here. 462 00:23:03,280 --> 00:23:04,520 Speaker 1: So tell us a little bit about CIV. 463 00:23:04,760 --> 00:23:04,840 Speaker 4: So.
464 00:23:04,880 --> 00:23:07,919 Speaker 2: CIV is a firm built with a very singular purpose 465 00:23:07,920 --> 00:23:11,080 Speaker 2: in mind. It's to back and build businesses that are 466 00:23:11,119 --> 00:23:15,280 Speaker 2: focusing on critical infrastructure and industry domestically, here in the US. 467 00:23:15,720 --> 00:23:18,960 Speaker 2: There are these three multi-decade megatrends that are really 468 00:23:19,000 --> 00:23:22,280 Speaker 2: meeting us quickly in this moment, and those are AI 469 00:23:22,359 --> 00:23:24,639 Speaker 2: and compute and the power needs behind them and the 470 00:23:24,680 --> 00:23:28,359 Speaker 2: digital infrastructure required. As you mentioned, the rev limiter to 471 00:23:28,400 --> 00:23:32,120 Speaker 2: AI really isn't GPUs, it is electrons. It's the reshoring 472 00:23:32,160 --> 00:23:36,159 Speaker 2: and reindustrialization of global industry after decades of offshoring and 473 00:23:36,200 --> 00:23:40,439 Speaker 2: globalization, really following COVID and Ukraine, and now accelerated with 474 00:23:40,480 --> 00:23:43,600 Speaker 2: the new administration. And then the third is the electrification 475 00:23:43,640 --> 00:23:47,560 Speaker 2: of everything and really just the need for energy abundance domestically, 476 00:23:47,680 --> 00:23:51,640 Speaker 2: not only because of economic prosperity, but also as a 477 00:23:51,720 --> 00:23:55,800 Speaker 2: pertinent national security requirement. So we see that as a 478 00:23:55,840 --> 00:23:59,720 Speaker 2: potential seventy-trillion-dollar market opportunity over the next fifteen years. 479 00:24:00,240 --> 00:24:02,200 Speaker 2: To put that in perspective, that would be like building 480 00:24:02,320 --> 00:24:05,600 Speaker 2: the rail system and the highway system every six weeks 481 00:24:05,640 --> 00:24:07,360 Speaker 2: for fifteen years here in the United States. 482 00:24:07,560 --> 00:24:10,080 Speaker 1: And what does the name CIV mean? CIV is short for civilization. 483 00:24:10,800 --> 00:24:11,880 Speaker 1: So big, big goals. 484 00:24:12,320 --> 00:24:14,320 Speaker 2: I mean, you know, shoot for the stars, land on 485 00:24:14,359 --> 00:24:14,640 Speaker 2: the moon. 486 00:24:15,080 --> 00:24:18,400 Speaker 1: I'm especially interested in energy, and we've spoken about Microsoft's 487 00:24:18,400 --> 00:24:21,480 Speaker 1: investments in nuclear power. Do you at CIV have a 488 00:24:21,600 --> 00:24:26,320 Speaker 1: kind of ten-year view on how America's energy needs 489 00:24:26,320 --> 00:24:27,160 Speaker 1: are going to be addressed? 490 00:24:27,280 --> 00:24:30,920 Speaker 2: Absolutely, yeah. So we're saying we want to triple energy consumption, 491 00:24:31,680 --> 00:24:34,760 Speaker 2: you know, in a decade, right, in just the 492 00:24:34,800 --> 00:24:36,080 Speaker 2: United States. It's like a lot of these things that 493 00:24:36,080 --> 00:24:37,520 Speaker 2: people say. It's like, hey, if we're going to meet 494 00:24:37,520 --> 00:24:39,000 Speaker 2: the needs of the time, if we're going to meet 495 00:24:39,000 --> 00:24:42,240 Speaker 2: the demand curve of AI and reshoring and electrification, we 496 00:24:42,240 --> 00:24:44,879 Speaker 2: need to triple energy. It just doesn't really work that 497 00:24:44,920 --> 00:24:48,600 Speaker 2: way, because we're talking about infrastructure.
This is steel, it's rebar, 498 00:24:48,760 --> 00:24:51,840 Speaker 2: it's concrete, it's tens of thousands of people on work 499 00:24:51,880 --> 00:24:54,280 Speaker 2: sites every single day. And so it's not AI, it's 500 00:24:54,280 --> 00:24:55,879 Speaker 2: not software. You can't just say, like, we have a 501 00:24:55,920 --> 00:24:58,120 Speaker 2: million more users, let's just scale it. You know, when 502 00:24:58,119 --> 00:25:01,000 Speaker 2: we looked at this and we saw inflecting demand with 503 00:25:01,160 --> 00:25:04,200 Speaker 2: relatively fixed supply, we thought that there was really nothing 504 00:25:04,240 --> 00:25:07,240 Speaker 2: that could completely cross that chasm except maybe the thing 505 00:25:07,280 --> 00:25:10,440 Speaker 2: that we invented seventy years ago, which is scaled fission. 506 00:25:10,880 --> 00:25:13,080 Speaker 2: So you see plenty of money in venture going into 507 00:25:13,720 --> 00:25:16,600 Speaker 2: new reactors. And so when we look at the United States, 508 00:25:16,600 --> 00:25:19,640 Speaker 2: we have ninety-four operational reactors today, we're the leading 509 00:25:19,720 --> 00:25:23,000 Speaker 2: nuclear superpower in the world. However, there are about fifty different 510 00:25:23,040 --> 00:25:25,760 Speaker 2: designs active, which means that they're all special snowflakes. And 511 00:25:25,760 --> 00:25:28,679 Speaker 2: when you build anything at an N of one, as a singular project, 512 00:25:28,720 --> 00:25:30,879 Speaker 2: that's the most expensive way you could ever do things. 513 00:25:31,280 --> 00:25:33,639 Speaker 2: And so what we innovated on at the Nuclear Company, 514 00:25:33,680 --> 00:25:36,200 Speaker 2: which we co-founded on our platform actually before raising 515 00:25:36,240 --> 00:25:40,000 Speaker 2: outside capital, is really a fleet-scale approach to developing 516 00:25:40,040 --> 00:25:42,240 Speaker 2: and deploying approved nuclear 517 00:25:41,840 --> 00:25:43,720 Speaker 1: reactors. A fleet of reactors. 518 00:25:43,840 --> 00:25:47,880 Speaker 2: Yes, so instead of building one approved gigawatt-scale reactor, 519 00:25:48,119 --> 00:25:50,679 Speaker 2: we're actually building an N of six. And the idea is 520 00:25:50,680 --> 00:25:55,159 Speaker 2: that you have huge cost declines and huge time declines 521 00:25:55,400 --> 00:25:57,600 Speaker 2: from reactor one to two to three to five. You 522 00:25:57,600 --> 00:25:59,719 Speaker 2: can build a real supply chain, you can build real 523 00:25:59,760 --> 00:26:02,879 Speaker 2: trades that roll from reactor to reactor, and this is 524 00:26:02,920 --> 00:26:03,880 Speaker 2: all through observed data. 525 00:26:04,280 --> 00:26:09,119 Speaker 1: Power generation is one challenge, power storage is another, onboarding 526 00:26:09,160 --> 00:26:14,040 Speaker 1: to the grid is another, dealing with regulations is another. 527 00:26:14,680 --> 00:26:17,560 Speaker 1: What's the kind of total picture of the ways in 528 00:26:17,600 --> 00:26:22,320 Speaker 1: which you are addressing the opportunity of energy demand in 529 00:26:22,359 --> 00:26:22,800 Speaker 1: the US? 530 00:26:23,520 --> 00:26:25,959 Speaker 2: Yeah, I think it's a very broad set of needs. 531 00:26:26,040 --> 00:26:29,600 Speaker 2: To your point, it's energy generation, it's storage, it's dealing 532 00:26:29,600 --> 00:26:33,399 Speaker 2: with more and more volatility on the grid, which increases cost.
533 00:26:33,800 --> 00:26:37,119 Speaker 2: It's dealing with, as you mentioned in the opening, the 534 00:26:37,280 --> 00:26:41,000 Speaker 2: veritable arms race that the hyperscalers are going through for power. 535 00:26:41,080 --> 00:26:44,720 Speaker 2: This is an existential crisis for them to secure their 536 00:26:44,760 --> 00:26:50,520 Speaker 2: future growth. There is industrial efficiency technology. There's everything in 537 00:26:50,560 --> 00:26:54,960 Speaker 2: the vertical of digital infrastructure, cooling, high-capacity lines, you know, 538 00:26:55,040 --> 00:26:59,919 Speaker 2: next-generation racks and servers, innovation and acceleration in GPUs, 539 00:27:00,040 --> 00:27:02,120 Speaker 2: and this is all chapter one. That's the most exciting 540 00:27:02,119 --> 00:27:04,280 Speaker 2: part, is, like, this is the worst that this technology 541 00:27:04,320 --> 00:27:06,720 Speaker 2: will ever be. And this is the very beginning of 542 00:27:06,760 --> 00:27:08,840 Speaker 2: the innovation cycle for all of these things. When you 543 00:27:08,920 --> 00:27:11,679 Speaker 2: use ChatGPT, it's the worst it will ever be. The 544 00:27:11,720 --> 00:27:15,560 Speaker 2: way we really model what we personally back or build 545 00:27:15,800 --> 00:27:18,680 Speaker 2: is we say, hey, is this scale tech versus deep tech? 546 00:27:19,000 --> 00:27:21,480 Speaker 2: We think ninety-five percent of the solutions are here today. 547 00:27:21,520 --> 00:27:24,080 Speaker 2: The chasm to cross is more one of execution than 548 00:27:24,080 --> 00:27:24,920 Speaker 2: one of pure invention. 549 00:27:25,080 --> 00:27:28,160 Speaker 1: Just to clarify, scale tech means something that already 550 00:27:28,160 --> 00:27:31,960 Speaker 1: exists that you can accelerate by investing in, whereas deep 551 00:27:32,000 --> 00:27:37,440 Speaker 1: tech means essentially replicating the role of a university, or funding 552 00:27:37,600 --> 00:27:40,520 Speaker 1: for research that may only bear commercial fruit a decade 553 00:27:40,520 --> 00:27:40,880 Speaker 1: from now. 554 00:27:40,960 --> 00:27:43,639 Speaker 2: That's right. I think it's too long duration, too capital 555 00:27:43,680 --> 00:27:46,760 Speaker 2: intensive for a vehicle like ours, and we're also not PhDs, 556 00:27:46,800 --> 00:27:48,840 Speaker 2: so we really don't have much to offer you if 557 00:27:48,840 --> 00:27:51,320 Speaker 2: you're still in a lab, so we won't really take 558 00:27:51,359 --> 00:27:54,800 Speaker 2: binary science or technology risk. We then look for proven 559 00:27:54,880 --> 00:27:58,520 Speaker 2: customer demand, hence our LP base of true operators in 560 00:27:58,520 --> 00:28:00,679 Speaker 2: the spaces in which we invest. We want to understand 561 00:28:00,920 --> 00:28:04,640 Speaker 2: the appetite for adoption, the imminence of that adoption, when 562 00:28:04,680 --> 00:28:06,600 Speaker 2: would you adopt it, and then we want to back 563 00:28:06,640 --> 00:28:11,800 Speaker 2: into competitive unit economics. Businesses with those features really scale exponentially. 564 00:28:11,920 --> 00:28:13,600 Speaker 2: They can really redefine markets. 565 00:28:15,920 --> 00:28:18,520 Speaker 4: After the break, Oz has more from Jeff Rosenthal of the 566 00:28:18,640 --> 00:28:21,920 Speaker 4: VC firm CIV. That's on Tech Stuff's Tech Support. 567 00:28:21,960 --> 00:28:22,800 Speaker 3: Stay with us. 568 00:28:35,119 --> 00:28:37,520 Speaker 1: On the nuclear side.
This is kind of one of 569 00:28:37,520 --> 00:28:39,840 Speaker 1: the great debates, I think, going on in the tech 570 00:28:39,920 --> 00:28:43,240 Speaker 1: world: how much of tech's demand for energy 571 00:28:43,320 --> 00:28:45,840 Speaker 1: will be met by nuclear. Of course, one of the factors 572 00:28:45,880 --> 00:28:48,640 Speaker 1: in this is that people are scared of nuclear power. 573 00:28:48,680 --> 00:28:50,120 Speaker 1: I mean, they're scared of how you dispose of the 574 00:28:50,120 --> 00:28:53,400 Speaker 1: waste, they're scared of meltdowns. Did you have any concerns 575 00:28:53,440 --> 00:28:56,440 Speaker 1: about the first project in this space, you know, that you're 576 00:28:56,480 --> 00:28:58,560 Speaker 1: putting your name to, being nuclear? 577 00:28:58,800 --> 00:29:01,040 Speaker 2: I think it gives us solace that zero people have 578 00:29:01,120 --> 00:29:03,080 Speaker 2: died from nuclear in the United States. You know, like, 579 00:29:03,160 --> 00:29:05,360 Speaker 2: hundreds have fallen off of windmills. If you think about 580 00:29:05,480 --> 00:29:08,040 Speaker 2: coal plants or all the other ways in which we 581 00:29:08,080 --> 00:29:11,720 Speaker 2: consume power, there are trade-offs to everything. I personally feel 582 00:29:11,720 --> 00:29:15,920 Speaker 2: that splitting atoms to create chain reactions to power giant 583 00:29:15,960 --> 00:29:18,959 Speaker 2: turbines that give us the energy and prosperity for our 584 00:29:19,040 --> 00:29:22,280 Speaker 2: cities and our society is almost the most amazing thing 585 00:29:22,320 --> 00:29:25,320 Speaker 2: we've ever done as a species. And we're not handling 586 00:29:25,400 --> 00:29:27,479 Speaker 2: nuclear waste ourselves. It's not like I'm taking it out 587 00:29:27,520 --> 00:29:28,959 Speaker 2: in a bag in the back of my house. We 588 00:29:29,000 --> 00:29:32,120 Speaker 2: work with nuclear operating utilities, and so this is Constellation 589 00:29:32,240 --> 00:29:34,560 Speaker 2: or Florida Power and Light. They handle the nuclear 590 00:29:34,560 --> 00:29:38,200 Speaker 2: waste today without issue. We do believe that this is 591 00:29:38,240 --> 00:29:40,640 Speaker 2: an essential part of the power mix of the future. 592 00:29:40,960 --> 00:29:42,760 Speaker 2: It won't be all of it, you know. Like today, 593 00:29:43,080 --> 00:29:45,200 Speaker 2: the most efficient way to really run a data center 594 00:29:45,440 --> 00:29:48,840 Speaker 2: on power is two gas turbines, because that achieves a 595 00:29:49,000 --> 00:29:51,800 Speaker 2: capacity factor meaning they can stay on close to one 596 00:29:51,880 --> 00:29:54,800 Speaker 2: hundred percent of the time. The problem with intermittent energy 597 00:29:54,960 --> 00:29:57,920 Speaker 2: powering data centers is you really can't do training or 598 00:29:57,880 --> 00:30:01,000 Speaker 1: inference. Intermittent being basically solar, 599 00:30:01,080 --> 00:30:04,360 Speaker 2: wind, et cetera. Yeah, so the value of twenty-four 600 00:30:04,480 --> 00:30:09,080 Speaker 2: by seven clean base power is exponential for the entire system. 601 00:30:09,120 --> 00:30:12,040 Speaker 2: It's really, really important. So it's not all just apples 602 00:30:12,040 --> 00:30:12,520 Speaker 2: to apples.
603 00:30:12,760 --> 00:30:15,400 Speaker 1: There is, I mean, there are cool storage solutions around, 604 00:30:15,520 --> 00:30:18,200 Speaker 1: you know, capturing wind and solar energy to make it 605 00:30:18,320 --> 00:30:20,360 Speaker 1: online twenty four seven, or investments in. 606 00:30:20,280 --> 00:30:26,120 Speaker 2: That space? Explicitly true. There are emergent technologies for large, 607 00:30:26,160 --> 00:30:29,440 Speaker 2: long duration battery storage for wind and solar. There's things 608 00:30:29,480 --> 00:30:33,040 Speaker 2: like Form Energy and Antora Energy, there's interesting startups 609 00:30:33,080 --> 00:30:35,320 Speaker 2: that are looking to store things, you know, store power 610 00:30:35,360 --> 00:30:38,680 Speaker 2: as heat. So these things are amazing, they're so inspiring, 611 00:30:39,200 --> 00:30:41,320 Speaker 2: but they are not really here today. 612 00:30:41,600 --> 00:30:43,080 Speaker 1: We've talked a lot on this show about how some 613 00:30:43,120 --> 00:30:45,960 Speaker 1: of these things that seemed like pipe dreams or seemed 614 00:30:45,960 --> 00:30:48,920 Speaker 1: like science fiction are, uh, you know, becoming science fact in 615 00:30:48,920 --> 00:30:49,720 Speaker 1: our lifetimes. 616 00:30:49,960 --> 00:30:52,920 Speaker 2: These stories have been all around us for a long time. 617 00:30:53,000 --> 00:30:55,280 Speaker 2: It's more an awareness issue than it is like an 618 00:30:55,280 --> 00:30:58,200 Speaker 2: invention issue. So people are like, where's my iPhone? You know, 619 00:30:58,280 --> 00:31:01,080 Speaker 2: I got my first iPhone, it changed my life. Gmail 620 00:31:01,120 --> 00:31:03,080 Speaker 2: changed my life. Where are these things? And so 621 00:31:03,240 --> 00:31:05,720 Speaker 2: I would argue that this is one of the spaces 622 00:31:05,720 --> 00:31:09,080 Speaker 2: that you just see the future manifest regularly, like the 623 00:31:09,120 --> 00:31:12,760 Speaker 2: impossible become possible. It's such a fun innovation cycle. 624 00:31:13,200 --> 00:31:13,400 Speaker 1: You know. 625 00:31:13,400 --> 00:31:17,840 Speaker 2: There's things like Cigar Lake. Cigar Lake is in eastern Saskatchewan, Canada. 626 00:31:17,880 --> 00:31:20,840 Speaker 2: It's owned by Cameco. They discovered it in the early eighties. 627 00:31:21,360 --> 00:31:24,520 Speaker 2: It's underneath a snowmelt lake, fourteen hundred feet deep. 628 00:31:24,520 --> 00:31:28,040 Speaker 2: It's a 1.3-billion-year-old deposit of uranium. Most uranium, 629 00:31:28,080 --> 00:31:31,480 Speaker 2: say in Africa, is point one to point two percent uranium 630 00:31:31,720 --> 00:31:35,080 Speaker 2: per ounce of ore, right, and so these massive sulphuric 631 00:31:35,160 --> 00:31:38,719 Speaker 2: acid leaching fields. This is like twenty four percent uranium 632 00:31:38,800 --> 00:31:42,400 Speaker 2: ore. It's the second most concentrated uranium deposit in the world. 633 00:31:42,640 --> 00:31:45,640 Speaker 2: So they spent decades figuring out how to freeze the lake. 634 00:31:45,720 --> 00:31:48,959 Speaker 2: They froze this entire lake. Okay, then they rebarred it, 635 00:31:49,360 --> 00:31:51,840 Speaker 2: and then they drilled it fourteen hundred feet down and 636 00:31:51,880 --> 00:31:54,280 Speaker 2: they've been mining it with robots for three decades.
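A rough back-of-the-envelope sketch of why those ore grades matter: at a typical grade of a few tenths of a percent you move hundreds of tonnes of rock per tonne of uranium, while at a Cigar-Lake-like grade in the twenties it is only a handful. The specific grades and the perfect-recovery simplification below are assumptions for illustration, not figures from the episode.

def tonnes_of_ore_per_tonne_uranium(grade_fraction: float) -> float:
    # Tonnes of ore that must be mined and processed per tonne of uranium,
    # assuming perfect recovery (a simplifying assumption).
    return 1.0 / grade_fraction

for label, grade in [("typical low-grade ore, ~0.15%", 0.0015),
                     ("high-grade ore, ~20%", 0.20)]:
    print(f"{label}: ~{tonnes_of_ore_per_tonne_uranium(grade):,.0f} "
          f"tonnes of ore per tonne of uranium")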
And 637 00:31:54,320 --> 00:31:56,720 Speaker 2: it's where something like a quarter of our uranium comes 638 00:31:56,720 --> 00:31:58,480 Speaker 2: from in the United States. And so all I'm saying, 639 00:31:58,480 --> 00:32:00,480 Speaker 2: the reason I offer you this: this is like a 640 00:32:00,520 --> 00:32:02,600 Speaker 2: wonder of the world. There are a hundred of these. 641 00:32:02,640 --> 00:32:05,600 Speaker 2: We just don't really know about them because we haven't 642 00:32:05,600 --> 00:32:07,240 Speaker 2: been excited about this space for a long time. 643 00:32:07,800 --> 00:32:12,760 Speaker 1: Bloomberg used the phrase "physical world companies with national implications," 644 00:32:12,840 --> 00:32:16,160 Speaker 1: which I thought was a good phrase, because the world you're 645 00:32:16,200 --> 00:32:19,920 Speaker 1: investing in and building, of course, intersects very heavily with politics. 646 00:32:20,600 --> 00:32:23,360 Speaker 1: But on clean energy, it's kind of interesting, because President 647 00:32:23,400 --> 00:32:28,360 Speaker 1: Trump has talked about beautiful coal, beautiful, clean coal, beautiful beautiful, 648 00:32:28,560 --> 00:32:31,840 Speaker 1: and has certainly stepped away from the Biden administration's commitment 649 00:32:31,880 --> 00:32:34,600 Speaker 1: to, like, clean energy and green energy. At the same time, 650 00:32:34,680 --> 00:32:37,520 Speaker 1: I think Q one of twenty twenty five was the 651 00:32:37,560 --> 00:32:41,800 Speaker 1: biggest bonanza quarter for investment in climate tech the US 652 00:32:41,840 --> 00:32:44,440 Speaker 1: has ever seen. Five million dollars invested in one quarter, 653 00:32:44,960 --> 00:32:48,160 Speaker 1: up nearly sixty five percent year on year. What's going 654 00:32:48,200 --> 00:32:51,200 Speaker 1: on there? Is private capital seeing an opportunity in public 655 00:32:51,360 --> 00:32:52,760 Speaker 1: drawback, or how do you see it? 656 00:32:52,880 --> 00:32:56,280 Speaker 2: I mean, this is the golden age. So first and foremost, 657 00:32:56,400 --> 00:32:59,400 Speaker 2: there's this thing called Jevons paradox, and with Jevons paradox, 658 00:32:59,600 --> 00:33:01,720 Speaker 2: the axiom is that for, you know, one hundred and 659 00:33:01,800 --> 00:33:05,040 Speaker 2: fifty years, humans have essentially gotten one percent more efficient 660 00:33:05,080 --> 00:33:08,080 Speaker 2: every single year. We invent LEDs, or we figure out 661 00:33:08,120 --> 00:33:10,800 Speaker 2: a more efficient electric motor, and then we use that 662 00:33:11,160 --> 00:33:15,280 Speaker 2: gain in efficiency, which drops pricing, to increase consumption by 663 00:33:15,280 --> 00:33:18,840 Speaker 2: one point three percent. So every time we increase efficiency gains, 664 00:33:18,840 --> 00:33:21,000 Speaker 2: we actually use more of what we have. 665 00:33:21,240 --> 00:33:23,680 Speaker 1: And this was why, when DeepSeek came out and the 666 00:33:23,760 --> 00:33:26,640 Speaker 1: Nvidia stock price went down, a lot of smart people said, well, 667 00:33:26,840 --> 00:33:29,160 Speaker 1: actually this might be a good thing ultimately, because although 668 00:33:29,200 --> 00:33:32,120 Speaker 1: they've done it without the very, very advanced chips, this 669 00:33:32,160 --> 00:33:34,160 Speaker 1: will ultimately drive more demand for chips. 670 00:33:34,360 --> 00:33:37,120 Speaker 2: One percent, and more demand for AI.
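A minimal sketch of the Jevons paradox arithmetic described above: if we get roughly one percent more efficient each year but respond by consuming roughly 1.3 percent more, total energy use still grows. The one percent and 1.3 percent figures come from the conversation; the 150-year horizon and starting values of 1.0 are illustrative assumptions.

# Compound the two opposing trends year over year.
years = 150
efficiency_gain = 0.010      # energy needed per unit of service falls 1% per year
consumption_growth = 0.013   # units of service consumed grow 1.3% per year

energy_per_unit = 1.0
units_consumed = 1.0
for _ in range(years):
    energy_per_unit *= (1 - efficiency_gain)
    units_consumed *= (1 + consumption_growth)

total_energy = energy_per_unit * units_consumed
print(f"After {years} years: energy per unit of service is {energy_per_unit:.2f}x,")
print(f"consumption is {units_consumed:.2f}x, total energy use is {total_energy:.2f}x,")
print("so efficiency gains alone never shrank overall demand.")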
I believe we're pretty 671 00:33:37,240 --> 00:33:39,920 Speaker 2: energy blind in our society, to a degree. In the 672 00:33:39,920 --> 00:33:43,600 Speaker 2: case of these clean energy companies and these efficiency technologies, 673 00:33:44,040 --> 00:33:45,800 Speaker 2: for the twenty years leading up to this, we were 674 00:33:45,800 --> 00:33:48,560 Speaker 2: in an abundance market. You just did not need these things. 675 00:33:48,600 --> 00:33:51,440 Speaker 2: There was no market driver, there was no demand driver 676 00:33:51,560 --> 00:33:53,520 Speaker 2: for innovation in these spaces, and so you can 677 00:33:53,600 --> 00:33:56,480 Speaker 2: see the returns being pretty abysmal. 678 00:33:56,600 --> 00:33:59,640 Speaker 1: Historical energy demand was flat until recently. 679 00:33:59,560 --> 00:34:01,840 Speaker 2: We were solving a problem that didn't exist yet. And so it's 680 00:34:01,880 --> 00:34:05,000 Speaker 2: not just AI that's a major contributor. It's AI, it's 681 00:34:05,000 --> 00:34:08,040 Speaker 2: the reshoring and reindustrialization of global industry. If you want 682 00:34:08,080 --> 00:34:10,080 Speaker 2: to smelt steel in America, it takes a lot of 683 00:34:10,120 --> 00:34:12,840 Speaker 2: power. Building flush rocks in America, if you want to, 684 00:34:13,080 --> 00:34:16,880 Speaker 2: I mean, let alone refinement of critical minerals and then 685 00:34:16,920 --> 00:34:19,319 Speaker 2: the mining of critical minerals, like, these are all things 686 00:34:19,320 --> 00:34:21,200 Speaker 2: that are nonpartisan, in my opinion. Like, this is a 687 00:34:21,239 --> 00:34:24,280 Speaker 2: movement that's been happening for quite some time. And we see, 688 00:34:24,360 --> 00:34:26,000 Speaker 2: you know, the tariffs more as a canary in the 689 00:34:26,000 --> 00:34:26,600 Speaker 2: coal mine. 690 00:34:26,760 --> 00:34:30,160 Speaker 1: So the energy sector has been particularly punished by the 691 00:34:30,200 --> 00:34:32,840 Speaker 1: swings and roundabouts on the tariffs, I think down about sixteen 692 00:34:32,880 --> 00:34:36,239 Speaker 1: percent this year. That's not something that concerns you as 693 00:34:36,280 --> 00:34:38,680 Speaker 1: somebody investing in this space? 694 00:34:38,640 --> 00:34:41,520 Speaker 2: I think it certainly concerns me. I think that, you know, 695 00:34:41,560 --> 00:34:45,880 Speaker 2: you talk to anybody in retail, markets are softening, consumers 696 00:34:45,920 --> 00:34:48,600 Speaker 2: are spending less money. But I think that these trends 697 00:34:48,600 --> 00:34:51,839 Speaker 2: that we're talking about are multi-decade megatrends, that, 698 00:34:52,080 --> 00:34:54,160 Speaker 2: you know, you're going to need these things if you 699 00:34:54,239 --> 00:34:57,440 Speaker 2: want to have a prosperous society in the future. 700 00:34:58,000 --> 00:35:00,680 Speaker 1: When people think about VC, they tend to think 701 00:35:00,680 --> 00:35:05,480 Speaker 1: about consumer, like, you know, Warby Parker or Goop, or, 702 00:35:05,520 --> 00:35:07,840 Speaker 1: if I think about SaaS, I think about, you know, Slack 703 00:35:08,040 --> 00:35:11,720 Speaker 1: or Shopify or whatever else. But you're not the only 704 00:35:11,880 --> 00:35:15,080 Speaker 1: person to be addressing this kind of infrastructure opportunity 705 00:35:15,160 --> 00:35:18,480 Speaker 1: through VC.
Andreessen, I think, announced a six hundred 706 00:35:18,480 --> 00:35:22,040 Speaker 1: million dollar fund last year to address American dynamism. 707 00:35:22,640 --> 00:35:26,279 Speaker 2: What's driving this? Necessity is the mother of innovation, first 708 00:35:26,320 --> 00:35:29,320 Speaker 2: and foremost. You know, you have these megatrends we've identified, 709 00:35:29,320 --> 00:35:31,560 Speaker 2: I think many other people have. I think that there's 710 00:35:31,600 --> 00:35:35,400 Speaker 2: also a more durable moat in physical products, in a 711 00:35:35,480 --> 00:35:38,560 Speaker 2: sense, now. So while, you know, hard are the first four letters 712 00:35:38,560 --> 00:35:41,080 Speaker 2: in hardware, and hardware is hard, and so venture is always like, yeah, 713 00:35:41,120 --> 00:35:44,480 Speaker 2: you know, I'll stick to enterprise SaaS. That is way 714 00:35:44,560 --> 00:35:46,600 Speaker 2: less defensible than it used to be. You know, now, 715 00:35:46,680 --> 00:35:50,560 Speaker 2: with Windsurf, on your own time, with no staff, you can 716 00:35:50,640 --> 00:35:53,640 Speaker 2: code and develop, you know, your own apps or platforms 717 00:35:53,719 --> 00:35:56,880 Speaker 2: or technologies. Suites that used to take millions of dollars 718 00:35:56,880 --> 00:35:59,640 Speaker 2: and years and dozens of people on front end and back 719 00:35:59,680 --> 00:36:02,920 Speaker 2: end engineering development. None of that is necessary anymore. 720 00:36:03,000 --> 00:36:03,080 Speaker 1: So. 721 00:36:03,760 --> 00:36:07,960 Speaker 2: The defensibility of what was really like the center of 722 00:36:07,960 --> 00:36:11,759 Speaker 2: the Venn diagram for venture deals a decade ago has 723 00:36:11,880 --> 00:36:14,200 Speaker 2: kind of disappeared, to a degree. I think people are 724 00:36:14,320 --> 00:36:17,000 Speaker 2: conscious of that, and so we'd rather work with companies 725 00:36:17,000 --> 00:36:19,240 Speaker 2: that are utilizing the tools of the day to really 726 00:36:19,560 --> 00:36:23,200 Speaker 2: reimagine and reinvent critical industries versus just being, like, yet 727 00:36:23,239 --> 00:36:24,880 Speaker 2: another digital service provider. 728 00:36:25,760 --> 00:36:27,640 Speaker 1: Jeff, thank you for joining us on Tech Stuff. 729 00:36:27,480 --> 00:36:28,799 Speaker 2: Of course, thank you for having me. 730 00:36:32,440 --> 00:36:34,040 Speaker 3: That's it for this week for Tech Stuff. 731 00:36:34,080 --> 00:36:36,759 Speaker 1: I'm Cara Price and I'm Oz Valoshin. This episode was 732 00:36:36,760 --> 00:36:41,040 Speaker 1: produced by Eliza Dennis, Victoria Dominguez and Adriana Tapia. It's 733 00:36:41,080 --> 00:36:43,960 Speaker 1: executive produced by me, Cara Price, and Kate Osborne for 734 00:36:44,000 --> 00:36:49,040 Speaker 1: Kaleidoscope and Katrina Norvel for iHeart Podcasts. The engineers are 735 00:36:49,080 --> 00:36:53,440 Speaker 1: Bahiit Fraser and Graham Gibson. Kyle Murdock mixed this episode 736 00:36:53,480 --> 00:36:54,880 Speaker 1: and he also wrote our theme
737 00:36:54,719 --> 00:36:57,680 Speaker 4: song. Join us next Wednesday for Tech Stuff: The Story, 738 00:36:58,000 --> 00:37:00,560 Speaker 4: when we will share an in-depth conversation with journalist 739 00:37:00,600 --> 00:37:03,759 Speaker 4: and podcast host Evan Ratliff about a radical group of 740 00:37:03,800 --> 00:37:07,200 Speaker 4: young tech people, concerned by the existential threat of AI, 741 00:37:07,600 --> 00:37:09,360 Speaker 4: who are on trial for murder. 742 00:37:09,840 --> 00:37:12,399 Speaker 1: Please rate, review, and reach out to us at tech 743 00:37:12,440 --> 00:37:15,200 Speaker 1: Stuff podcast at gmail dot com. We really want to 744 00:37:15,200 --> 00:37:15,680 Speaker 1: hear from you.