Robert: Ah, welcome back to It Could Happen Me. Oh, that's horrible. You didn't like that, Garrison? Well, they can't all be winners. Uh, this is part, I guess, three of our coverage of the Consumer Electronics Show and what the tech industry has in store for all of us in the future. Um, last episode we talked about the stuff we saw at CES that was both cool and optimistic and spoke to some potentially positive trends in tech, and today we're going to get back to what we do best, which is making you feel bad. But first I want to open this up a little bit with Garrison. You're a Canadian, a very young Canadian, twenty years old, grew up in a cult, and now you have just seen Las Vegas, Nevada for the first time. Did it change your life?

Garrison: Um, I mean, I guess so. I guess it did change my life, in my perception of what Las Vegas is and my desire to never return. But yeah, I mean, we've been able to spend probably around half our time at CES, the other half just soaking in the impeccable vibes of Las Vegas, Nevada.

Robert: Yeah, I've been...
Robert: I've been tour guiding you around, uh, soberly and safely. We went to the Venetian and the Palazzo. We took a very expensive gondola.

Garrison: Right, that was an expensive gondola.

Robert: Right. Got to see the beautiful blue skies of Venice in all their four corners. Your reaction to seeing inside the Venetian... if you've never been to the Venetian, the interior of it is this massive casino, as they all are. They're all like small towns inside buildings, massive. And the Venetian is like a replica of the city of Venice with a fake sky, and it's one giant mall. I believe it's the second largest hotel in the world. It is unbelievably large, uh, incredibly expensive. And the fidelity of, like, the fakeness of all of these things that are based on real stuff is quite high too. It's a whole thing.
Garrison: Yeah, it's really interesting, because some of the most impactful stuff is all of, like, the fake storefronts inside, because in many ways they're kind of just all glorified malls, um, and glorified arcades, all slot machines. And it's funny, because, like, you know, they make all of these facades on the inside, they have the ceiling painted to look like the sky, but it's just so dark in there. Like, you see blue skies above you, but there's no light anywhere.

Robert: No light anywhere. There's no clocks in the rooms.

Garrison: No. You never know what time it is. You never see the outdoors. You're all isolated in these little corridors leading from one shop to another with slot machines all along the way.

Robert: You're flying back soon. Are you looking forward to not being in a maze of lights designed to bewilder and slowly damage you enough that you sit down at a craps table?

Garrison: Very excited to see a real tree that's not a palm tree. Very excited to, like, touch grass, because there's no grass in Las Vegas.
Robert: No. It's actually, I think, illegal in a lot of parts of the city to have, like, a grass lawn. Which is... so, obviously Vegas is, in an objective sense, incredibly wasteful. A huge amount of resources get poured into what is effectively just for gaming. But, um, another thing that you have to hold in your mind when you recognize that is that, of all of the states in the Southwest utilizing the very limited water resources there, if I'm not mistaken, because I was just reading an article about this, Nevada is the one state that has reduced its water usage while it's grown by, like, three quarters of a million people. Um, so it contains multitudes. And also Nevada, like Vegas, is where the... I'm spacing on the name right now.
But basically you have all of these different states in the Southwest that are all kind of coming together to try to figure out how to deal with the fact that, uh, Lake Mead water levels are getting lower and the Colorado River is disappearing in some areas, and it is the only thing that makes life out here possible on the scale that it currently exists on. Um, and a couple of months before CES, they had their big meeting in Las Vegas in order to talk about how to try and deal with the calamitous water situation. So it is very much this city that is, like, filled with simulacra of the past, um, which it uses to try to hack your brain to get you to stay up for four days in a row, gambling and spending tens of thousands of dollars. And also, because it's the best place to hold a convention, in a very technical sense, like, it is the most prepared for a large convention. This city can handle a hundred and fifty to two hundred thousand people coming in overnight and needing places to stay and needing infrastructure in order.
So it's also where a lot of things about the future get decided, which is, when you spend enough time walking around, kind of horrifying.

Garrison: It's kind of horrifying, the fact that important decisions get made in this, in this realm, in this place that's designed to be mind-altering.

Robert: Yeah, it is crafted. We're not, like, joking about this. There are no clocks in the hotel rooms. Like, the casinos are crafted to damage your perception of time. Um, so I don't know, somebody should maybe look into that. And when you're talking about Lake Mead, it's a great example of the overall vibes of Las Vegas: as Lake Mead is drying up, we keep finding bodies inside the lake, like bodies that have been there a long time, bodies of people who had alternate ideas about how Vegas should look.

Garrison: I mean, a lot of them were probably, yeah...

Robert: Yeah. Well, we walked to the Venetian, walked through Caesars Palace. Uh, they had some nice vaporwave LED displays outside.
Garrison: Briefly went into the Paris one, which, honestly, I think Paris handled the fake sky the worst, because not only was the sky-painted ceiling so low, the bottom part of the Eiffel Tower just stops where the ceiling stops. They didn't even try to continue the illusion. It's just a hard stop. Um, we rode a roller coaster. We went to New York. We went to the...

Robert: It's a little blurry for me.

Garrison: Because you were so drunk.

Robert: But I just... I dumped the attempt at, like, buying drinks from places and just got a handle of Woodford Reserve, which allegedly you can mix into one of the THC piña coladas that they have, and allegedly it's a pretty good time. We went to Rainforest Cafe. You got sicker than I did eating Rainforest Cafe.

Garrison: I bought this volcano cake, and it was quite regrettable. Um, and then we walked over to the New York-themed casino inside Las Vegas. So if you want a city-themed casino inside the city that you're in, you can go there.
Robert: Just a pretty different city, creating microcosms within microcosms.

Garrison: It's just, like, nesting, nesting all the way down. And, in an effort to make both me and Robert vomit, we went on a roller coaster. We barely survived.

Robert: That did feel like a very dangerous roller coaster.

Garrison: We were so close to vomiting everywhere. But yeah, it was a good time. That was pretty fun.

Robert: I felt great, so... I just felt people would enjoy your first Vegas experience. And of course you stayed at Circus Circus, which we just walked through earlier today one last time. One final, one final adieu, to see a family of four with thirty-eight thousand dollars. Imagine losing that. Circus Circus, unbelievably, about the worst casino in the world, I think. In order to segue into our next topic: I think Las Vegas is probably one of the most heavily surveilled cities in the United States. It would be hard to find one with more, especially when you're on the Strip. Obviously there's lots of it.
I have family who live here, and they can go years without visiting the fucking Strip, because it's terrible. Um...

Garrison: And so, kind of in a similar sense, at CES there was a lot of stuff about surveillance, a lot of stuff about, uh, you know, different new, innovative ways to collect data on you and your appliances and what's in your home. Um, do we want to start by talking about the, uh, the surveillance tech?

Robert: Yeah. There was actually just an article in the Washington Post about this, about how unsafe quite a bit of it is. And one of the things that you may have caught in some of your news, because this was probably one of the more viral stories, is that there was a lot of piss-based technology, a lot of pee analyzation. Vivoo had a thing; there were at least three different pee test kits on the show floor. I think some of them won some of the CES Innovation Awards, where basically you can analyze what's in your urine.
Yeah, and these are always framed as, like, it can give you confirmation if you have a UTI, it can help people who have all these different illnesses, it can help diabetics. Um, and I'm sure there's a degree to which that's true. But I asked the Vivoo lady, and... I didn't speak with the... there was another called, um, U-Scan, by Withings, and U-Scan's urine sensor analyzes hormone levels in urine.

Garrison: That's interesting.

Robert: Yeah, which is why it won some awards, and also why a bunch of folks, including Consumer Reports, um, put out, like, a warning about it, saying, like, we shouldn't be celebrating this, this is an incredibly dangerous product. Because it all is going to your phone, the data is being collected digitally, and if, for example, you are in a state that heavily restricts women's access to reproductive healthcare, uh, there is literally nothing stopping the law enforcement or the government of those states from demanding all of that data be handed over, potentially even in real time. There's absolutely nothing stopping that.
And the company has already said they'll comply with law enforcement, with government requests. Um, and they don't have any kind of plan for the fact that they are creating a way for the government to surveil people's bodies. Um, and when I talked to one of the representatives of Vivoo, which is another one of these urine companies, one that I don't believe detects your hormone levels but does generate a lot of data about your body, a lot of biometric data, the most she would give me is that the data is encrypted. Which, great, that's a fancy word for saying, yeah, we have it. We are sitting here right after one of the most damaging data hacks of all time, which was LastPass. It was one of the massive password-collecting apps, where you basically, like, centralize all your passwords behind one to remember, and, like, a lot of people were exposed as a result of that.
And, um, I just think that, like, such a massive part of this show was: we are debuting devices that will allow you to monitor different parts of your body at all times and get real-time biometric data on your body and your house, and centralizing all this data about you in one place.

Garrison: Talk about Ring. Because that's the same thing with, like, smart homes. Smart appliances were very popular, smart cars were a very big thing, um, smart cities were another big thing. Just other ways to centralize all of the data about what you own, where it is, um, and how to effectively provide advertising to get you to buy more.

Robert: There's an attempt being made by Republicans in Oklahoma right now to make it criminal to do gender transition if you are under twenty-six years of age. There's no reason why a product like this couldn't be used to determine whether or not somebody is illegally taking hormones in a state where they are attempting to restrict trans people. Like, this is all... we're not just being, like, fuddy-duddies.
These are all very serious implications, and there's zero thought, zero evidence of thought, being given to it with any of the biometric companies. Now, one of the reasons we talked about those smart glasses, um, the ones for people who are hearing impaired that caption conversations live around them... one of the reasons I was impressed by that is that it's all a closed loop. None of it goes to your smartphone, none of it's broadcast wirelessly, um, it is all on-device, and none of it is stored anywhere. And when they said that, that was part of what convinced me these people understand the responsibility that they have delivering a healthcare product. We should move on to the other part of the Panopticon that we saw and talk about Ring.

Garrison: Yeah, the Ring booth was one of the more terrifyingly dystopian bits. And it's, you know...

Robert: Describe it for our listeners.

Garrison: Well, I mean, they basically made, like, a house with a white picket fence. Um...
And, you know, again, CES, these are massive, massive buildings, so people can construct a full house in there, and they did. So, like, you know, there's fake green grass, a nice little fence, this perfect little ideal home. And the massive sign above was, like, uh, you know, "Ring: keeping your neighborhood safe," all of that type of messaging. Um, in the model home they had, there was, like, a dozen cameras: cameras all around the sides and the approach, multiple cameras on the doors, a doorbell camera, a peephole camera, a camera on the fence, and one door with three cameras on the door itself. And, I mean, Ring's owned by Amazon. There was, you know, Alexa-assisted Ring cameras, um, all of the data that gets used by law enforcement. Ring partners, like, directly with law enforcement to make data immediately available and make feeds immediately available. And probably the silliest thing we saw at the Ring booth was this tiny little home security drone.
Robert: Yeah. So basically they've built... and it's weird, because the box it comes in looks like a fucking, um, dehumidifier, or humidifier, that I used to have in my house. It's almost identical. Um, but it's, like, this little plastic box, and a drone can take off and fly out of it. Uh, and the drone trains itself on your house, so it knows how to get around. And if it thinks somebody's breaking in, a person who effectively, like, works for Ring, like an actual human being sitting in a call center somewhere, takes control of the drone and can confront someone in your house. Which, I guess there's a potential security benefit there. But also, you are signing up to allow Amazon to have a random person travel around your home at any hour of the night, in a thing they control, in a little flying machine that they control. And I cannot put myself in that... I get, obviously, I get wanting to have cameras. I don't think it's unreasonable to have security cameras on your home.
I even understand how some people who are not as privacy-conscious as I am could be like, yeah, I don't care if it's connected to the internet, um, even though that's not a thing I like. I can't put myself in the head of somebody who would want that thing in their house.

Garrison: Yeah, it's bizarre, because obviously there's needs, again, that are, like, health-related. Maybe if you've got, like, an illness or something, you might want something like that. Like, I can understand very specific, purpose-driven needs. But, like, as a normal person, wanting an Amazon employee to be able to wander around your home... it seems weird to me. I mean, there's obviously also all that data getting used. Amazon can scan your entire house's interior: what products you buy, you know, what non-Amazon things are inside your home, what types of brands you're using. And all that kind of gets used to help get you to buy more things.
One of the more insidious parts of, like, all of the marketing, and some of, like, the video commercials for Ring that we saw, you know, playing on these giant, giant screens inside, is they're really trying to also push... they're trying to push and normalize using Ring as a part of your everyday life, but for non-security means. Like, you know, when you're leaving your grandma's house, you say goodbye to her in her little Ring camera. You know, when you're getting to your friend's house, you do little funny pranks in front of their Ring camera. It's all these different ways to make Ring seem like this fun and normal thing to, like, play with, with your friends and your family, social, when in reality...

Robert: Look, again, security cameras are inherently antisocial. It doesn't mean that there aren't good reasons to have one, and as someone who's been burglarized, I do understand that. Um, it's not bad, but it's antisocial, because you are surveilling people because you're worried about what they might do.
That is a fundamentally antisocial thing. And so the attempt to, like, turn that... the attempt to kind of, like, merge that into normal family life and to make it, like, friendly, is really bad.

Garrison: Yeah. We briefly stopped by the ADT booth, and this is kind of similar to the little drone that we just talked about, but a little bit more ridiculous. Um, they have at the ADT booth this home security robot, like a six-foot-tall robot with, uh, with, like, an LCD little face with this big smile on it. And it's powered... or, not powered, it is controlled by you, the owner, by wearing an Oculus headset. And it has rolling feet, so it can move around by rolling, but it's, like, six feet tall, it has two arms, massive smiling face. And if you have, you know, your headset with you, and you think someone's breaking into your home, you can put this on and control this robot to, like, chase them out.
And I was overhearing the 331 00:18:39,280 --> 00:18:40,960 Speaker 1: ADT guys talking about it, and they're like, yeah, 332 00:18:41,000 --> 00:18:43,680 Speaker 1: this is even just, like, a great deterrent. 333 00:18:43,720 --> 00:18:46,119 Speaker 1: Like, imagine if someone's breaking into your home and 334 00:18:46,119 --> 00:18:48,560 Speaker 1: then they see a massive, smiling robot rolling towards them. 335 00:18:48,920 --> 00:18:54,720 Speaker 1: I would run away very quickly. But, like, 336 00:18:54,960 --> 00:18:57,639 Speaker 1: what, this thing has to cost, like, tens of 337 00:18:57,680 --> 00:18:59,840 Speaker 1: thousands of dollars, and this is what 338 00:19:00,000 --> 00:19:03,600 Speaker 1: you're doing to feel safe? You're really just spending that 339 00:19:03,680 --> 00:19:08,159 Speaker 1: much money to create this sense of safety? Really, 340 00:19:08,440 --> 00:19:10,720 Speaker 1: this is what you're doing. You're 341 00:19:10,720 --> 00:19:14,160 Speaker 1: getting a robot that gets piloted by a Facebook headset 342 00:19:15,200 --> 00:19:17,320 Speaker 1: so you can roll around your house in a 343 00:19:17,400 --> 00:19:20,280 Speaker 1: robot to make sure no one's gonna come, you know, 344 00:19:20,359 --> 00:19:24,919 Speaker 1: take random shit from your house. Yeah. Number one, 345 00:19:25,680 --> 00:19:28,160 Speaker 1: anyone who would do that is the kind of person 346 00:19:28,200 --> 00:19:32,200 Speaker 1: that needs to have things taken from them. 347 00:19:32,240 --> 00:19:35,200 Speaker 1: But number two, if you're actually concerned for your 348 00:19:35,200 --> 00:19:40,200 Speaker 1: actual safety, and again, I think that's perfectly valid, 349 00:19:40,359 --> 00:19:43,520 Speaker 1: none of these drones, this robot, it's security theater. It's 350 00:19:43,520 --> 00:19:48,880 Speaker 1: not security, it's theater.
It's easy to, like, damage. 351 00:19:48,920 --> 00:19:51,680 Speaker 1: You can knock it over. It's on three wheels. You knock 352 00:19:51,720 --> 00:19:54,320 Speaker 1: it over, it can't get up. Bloc up so 353 00:19:54,359 --> 00:19:57,400 Speaker 1: that you're completely covered, knock it over, and then proceed 354 00:19:57,440 --> 00:19:59,600 Speaker 1: to rob the house. It's not useful. It's 355 00:19:59,680 --> 00:20:02,440 Speaker 1: a pricey alarm at that point. It's wild. 356 00:20:02,640 --> 00:20:04,560 Speaker 1: And, like, people will find ways to hack them 357 00:20:04,560 --> 00:20:07,879 Speaker 1: and stuff. You know, you can't hack a well-trained 358 00:20:07,920 --> 00:20:10,520 Speaker 1: guard dog, which also will cost you tens of thousands 359 00:20:10,560 --> 00:20:14,000 Speaker 1: of dollars less, and will love you. Like, a Doberman 360 00:20:14,080 --> 00:20:17,200 Speaker 1: Pinscher will kill your enemies if they break into your home 361 00:20:17,520 --> 00:20:20,040 Speaker 1: and loves you at the same time. You know, there 362 00:20:20,080 --> 00:20:22,600 Speaker 1: were people getting into Alexa machines a few years ago. 363 00:20:22,680 --> 00:20:26,199 Speaker 1: There were Alexa machines listening and sending info when 364 00:20:26,240 --> 00:20:28,280 Speaker 1: they weren't supposed to. There 365 00:20:28,320 --> 00:20:30,800 Speaker 1: was a pretty big incident, actually, in Portland a few 366 00:20:30,840 --> 00:20:33,879 Speaker 1: years ago, of an Alexa listening in when it 367 00:20:33,920 --> 00:20:35,880 Speaker 1: wasn't supposed to, listening 368 00:20:35,920 --> 00:20:38,520 Speaker 1: to different conversations and trying to finish conversational cues.
369 00:20:39,200 --> 00:20:40,920 Speaker 1: You know, it's only a matter of time before 370 00:20:40,920 --> 00:20:43,520 Speaker 1: someone figures out how to remotely control 371 00:20:43,520 --> 00:20:45,320 Speaker 1: one of these ADT robots, and you have 372 00:20:45,400 --> 00:20:47,480 Speaker 1: something rolling around in your house that you 373 00:20:47,520 --> 00:20:52,560 Speaker 1: don't control anymore. Like, yeah, 374 00:20:52,680 --> 00:20:55,360 Speaker 1: there are always vulnerabilities in these things, and they always 375 00:20:55,400 --> 00:20:59,320 Speaker 1: get hacked. And more to the point, 376 00:20:59,359 --> 00:21:01,719 Speaker 1: if you have some sort of security drone, like your 377 00:21:01,760 --> 00:21:05,480 Speaker 1: Ring drone, there's no way, like, again, Amazon would comply 378 00:21:05,560 --> 00:21:09,159 Speaker 1: with law enforcement requests. There's nothing that says law enforcement, 379 00:21:09,200 --> 00:21:11,720 Speaker 1: if it was part of an investigation, could not use 380 00:21:11,760 --> 00:21:16,400 Speaker 1: this technology to surveil you in real time. So 381 00:21:16,880 --> 00:21:20,320 Speaker 1: I don't like that. Not my favorite. And while 382 00:21:20,320 --> 00:21:25,160 Speaker 1: we're talking about surveillance, we can't ignore 383 00:21:25,760 --> 00:21:28,960 Speaker 1: our good friends at Palantir. Now, if you haven't 384 00:21:29,000 --> 00:21:32,720 Speaker 1: been paying attention to the surveillance industry, Palantir is 385 00:21:32,760 --> 00:21:37,720 Speaker 1: a company that exists to collect data and build machine 386 00:21:37,800 --> 00:21:42,000 Speaker 1: learning solutions to surveil people and 387 00:21:42,080 --> 00:21:46,080 Speaker 1: to help equipment like drones and targeting systems work better.
388 00:21:46,119 --> 00:21:49,479 Speaker 1: They're an intelligence company. They do systems; 389 00:21:49,520 --> 00:21:51,960 Speaker 1: it's not like they make a single product. 390 00:21:52,240 --> 00:21:57,080 Speaker 1: They help build systems to collect data and enable governments 391 00:21:57,160 --> 00:21:59,840 Speaker 1: and militaries to make decisions off of that data. That 392 00:21:59,920 --> 00:22:03,399 Speaker 1: is, like, the thing that they primarily do: systems, analysis, tracking. 393 00:22:03,440 --> 00:22:05,399 Speaker 1: I mean, one of the 394 00:22:05,440 --> 00:22:07,399 Speaker 1: things we saw was them, you know, analyzing home 395 00:22:07,520 --> 00:22:10,480 Speaker 1: data around, like, water conservation. They're trying to 396 00:22:10,640 --> 00:22:14,639 Speaker 1: put forward a variety of uses, not just killing brown people. 397 00:22:15,000 --> 00:22:17,000 Speaker 1: But the 398 00:22:17,080 --> 00:22:20,959 Speaker 1: center of their booth was this massive military truck with 399 00:22:21,000 --> 00:22:23,280 Speaker 1: a huge armored box on the back that was filled 400 00:22:23,320 --> 00:22:28,360 Speaker 1: with computers, specifically to collect data and to, like, 401 00:22:28,480 --> 00:22:32,840 Speaker 1: do command and control for drone fleets in theater.
402 00:22:32,960 --> 00:22:35,080 Speaker 1: And one of the things, you know, when you see 403 00:22:35,080 --> 00:22:37,520 Speaker 1: a vehicle of that size, and it was very massive, 404 00:22:37,640 --> 00:22:40,680 Speaker 1: is that this is intended either 405 00:22:40,760 --> 00:22:43,800 Speaker 1: to be very far back from the front, which 406 00:22:44,400 --> 00:22:46,560 Speaker 1: mitigates some of the uses of it, or it is 407 00:22:46,600 --> 00:22:49,320 Speaker 1: intended to be used in an area in which the 408 00:22:49,440 --> 00:22:53,280 Speaker 1: enemy does not have air power. So again, 409 00:22:53,400 --> 00:22:56,240 Speaker 1: the kind of places where you're just bombing them, right, 410 00:22:56,320 --> 00:23:00,680 Speaker 1: like theaters like Yemen, where the rebels have minimal ability 411 00:23:00,760 --> 00:23:05,080 Speaker 1: to do something like bomb a gigantic truck, which is a target, 412 00:23:05,119 --> 00:23:08,720 Speaker 1: but you have kind of unrestricted ability to do stuff 413 00:23:08,720 --> 00:23:12,399 Speaker 1: like drone strike school buses, which has happened repeatedly there. 414 00:23:12,560 --> 00:23:15,320 Speaker 1: We had a couple of conversations with the good people 415 00:23:15,359 --> 00:23:19,320 Speaker 1: at Palantir. I think we 416 00:23:19,480 --> 00:23:22,840 Speaker 1: kind of figured out they were primarily looking for 417 00:23:22,880 --> 00:23:26,000 Speaker 1: talent, because they were looking for people to recruit, looking 418 00:23:26,040 --> 00:23:28,439 Speaker 1: for different things to integrate into their systems. Yeah, they 419 00:23:28,480 --> 00:23:31,960 Speaker 1: would not show much of what they had. Everything inside 420 00:23:31,960 --> 00:23:36,600 Speaker 1: the van itself was classified. Here, would you hand 421 00:23:36,680 --> 00:23:40,720 Speaker 1: me my phone? I'll find that person's name.
But everything in 422 00:23:40,760 --> 00:23:44,840 Speaker 1: there was classified. Whenever we started talking, especially the 423 00:23:44,880 --> 00:23:46,639 Speaker 1: first time we were there, because I started asking some 424 00:23:46,680 --> 00:23:50,439 Speaker 1: pretty specific questions about what was actually in it and 425 00:23:50,480 --> 00:23:52,320 Speaker 1: how it worked and how it was different from current 426 00:23:52,400 --> 00:23:55,240 Speaker 1: drone command and control solutions, there was a very 427 00:23:55,280 --> 00:23:58,399 Speaker 1: specific woman with Palantir who, no matter who I was 428 00:23:58,440 --> 00:24:01,320 Speaker 1: talking to, would come up behind me and kind of 429 00:24:01,640 --> 00:24:04,760 Speaker 1: direct the conversation. And I think she was also there to listen 430 00:24:04,800 --> 00:24:07,000 Speaker 1: to the answers that were being provided to me and 431 00:24:07,080 --> 00:24:09,679 Speaker 1: stop people on her team from saying things if they 432 00:24:09,680 --> 00:24:13,520 Speaker 1: weren't supposed to say them. There were a couple 433 00:24:13,520 --> 00:24:17,120 Speaker 1: of occasions in which I asked, hey, can we check 434 00:24:17,200 --> 00:24:19,960 Speaker 1: this thing out on the inside, and we were told no, 435 00:24:20,040 --> 00:24:22,520 Speaker 1: it was classified. No one else could get in. You 436 00:24:22,520 --> 00:24:25,080 Speaker 1: have to gain permission from the Army. Yeah. 437 00:24:25,600 --> 00:24:31,800 Speaker 1: I definitely saw some individuals exit it, but they were Palantir people. 438 00:24:31,840 --> 00:24:35,160 Speaker 1: But then the next day we came back, and 439 00:24:35,600 --> 00:24:40,600 Speaker 1: I watched a woman exit the vehicle, and a 440 00:24:40,640 --> 00:24:42,720 Speaker 1: man from Palantir with her, but the woman was not 441 00:24:42,760 --> 00:24:45,159 Speaker 1: from Palantir.
Now, people wear badges at CES, 442 00:24:45,200 --> 00:24:47,320 Speaker 1: so their names are on display and what they do 443 00:24:47,440 --> 00:24:49,879 Speaker 1: is on display, and it's easy to look a person up. 444 00:24:49,880 --> 00:24:51,480 Speaker 1: I saw she had a badge as a speaker. 445 00:24:51,880 --> 00:24:55,320 Speaker 1: Her name was Mary, or, sorry, her name was Melody 446 00:24:55,359 --> 00:25:00,280 Speaker 1: Hildebrandt. So I googled Melody Hildebrandt, because 447 00:25:00,280 --> 00:25:02,639 Speaker 1: I wanted to know: she does not work for Palantir, so 448 00:25:02,680 --> 00:25:06,119 Speaker 1: what is she doing inside Palantir's giant classified robot 449 00:25:06,240 --> 00:25:10,320 Speaker 1: murder box? Melody is the president of Blockchain Creative 450 00:25:10,400 --> 00:25:13,919 Speaker 1: Labs and the Chief Information Security Officer for Fox, 451 00:25:13,960 --> 00:25:18,080 Speaker 1: you know, the Fox Corporation. And, 452 00:25:18,200 --> 00:25:23,879 Speaker 1: by the way, her Twitter bio says 453 00:25:24,040 --> 00:25:27,560 Speaker 1: CISO Fox, Web3 engineering, cybersecurity, former war gamer, 454 00:25:27,680 --> 00:25:33,320 Speaker 1: lover of farm animals. So that's cool. And yeah, 455 00:25:33,359 --> 00:25:36,800 Speaker 1: over here we've got her retweeting a post about Anduril, 456 00:25:36,920 --> 00:25:40,919 Speaker 1: which is one of the Peter Thiel companies, like 457 00:25:41,000 --> 00:25:44,240 Speaker 1: Palantir, raising one point four eight billion in their 458 00:25:44,280 --> 00:25:47,840 Speaker 1: Series E funding: this new funding will enable us 459 00:25:47,840 --> 00:25:49,760 Speaker 1: to accelerate R&D and bring new cutting-edge 460 00:25:49,800 --> 00:25:53,879 Speaker 1: autonomous defense capabilities to market.
Now, I don't know, I 461 00:25:53,880 --> 00:25:58,600 Speaker 1: wonder what they mean by the word defense. Yeah. 462 00:25:58,640 --> 00:26:00,359 Speaker 1: She's also pro-NFT, so that's good. I'm 463 00:26:00,359 --> 00:26:04,160 Speaker 1: gonna tweet at her in a little bit. 464 00:26:04,320 --> 00:26:06,800 Speaker 1: But no, it was very clear that there 465 00:26:06,840 --> 00:26:08,440 Speaker 1: were, you know, PR people on the ground 466 00:26:08,440 --> 00:26:10,919 Speaker 1: to make sure that if the line of questioning 467 00:26:10,960 --> 00:26:14,720 Speaker 1: got too specific, if people were asking questions about their surveillance tech, 468 00:26:14,760 --> 00:26:18,000 Speaker 1: about this big Titan truck, which is what it's called, Titan, 469 00:26:18,680 --> 00:26:21,960 Speaker 1: there were only very, very specific answers. And, like, 470 00:26:22,320 --> 00:26:24,239 Speaker 1: they were not there to talk to journalists. They were 471 00:26:24,240 --> 00:26:26,359 Speaker 1: not there to talk to media. They were there to 472 00:26:26,400 --> 00:26:30,240 Speaker 1: recruit people to, you know, make their 473 00:26:30,240 --> 00:26:35,000 Speaker 1: surveillance tech more capable. That was very clear. They were 474 00:26:35,000 --> 00:26:39,160 Speaker 1: also right across the hall from 475 00:26:39,240 --> 00:26:43,600 Speaker 1: the fantastic Robosen Transformers robots. So on 476 00:26:43,600 --> 00:26:46,359 Speaker 1: one side you have a fun Optimus Prime robot that transforms; 477 00:26:46,680 --> 00:26:50,600 Speaker 1: on the other side you have the rolling metal death cage. So 478 00:26:51,080 --> 00:26:52,840 Speaker 1: that was most of Palantir.
479 00:26:52,960 --> 00:26:57,119 Speaker 1: They had this skybox, which was this box 480 00:26:57,160 --> 00:27:02,080 Speaker 1: that had, like, encrypted communications technology, drones and drone 481 00:27:02,080 --> 00:27:06,560 Speaker 1: piloting technology, and, like, a military computer, 482 00:27:07,119 --> 00:27:09,040 Speaker 1: all in this little tiny box that they can 483 00:27:09,119 --> 00:27:13,760 Speaker 1: drop to people who are, you know, 484 00:27:13,760 --> 00:27:16,240 Speaker 1: basically drop to people who are in trouble. Yeah, 485 00:27:16,280 --> 00:27:18,680 Speaker 1: they were billing it as, basically, number one, it 486 00:27:18,680 --> 00:27:20,639 Speaker 1: could be for special forces teams. It 487 00:27:20,680 --> 00:27:24,200 Speaker 1: has, like, a laptop in there. It has potentially several 488 00:27:24,240 --> 00:27:26,960 Speaker 1: drones in there, and it has, like, a bunch 489 00:27:26,960 --> 00:27:29,679 Speaker 1: of specially modified field cameras so you could set up 490 00:27:29,720 --> 00:27:33,200 Speaker 1: surveillance on an area, and those cameras kind 491 00:27:33,200 --> 00:27:36,119 Speaker 1: of work with a machine learning algorithm to do stuff 492 00:27:36,160 --> 00:27:39,399 Speaker 1: like try and identify where landmines are.
And again, like, 493 00:27:39,440 --> 00:27:43,800 Speaker 1: the stuff that's problematic about Palantir is primarily 494 00:27:43,840 --> 00:27:46,760 Speaker 1: its data collecting, its surveillance, and the fact that we 495 00:27:46,800 --> 00:27:51,000 Speaker 1: know that drone warfare is generally pretty fucked up and 496 00:27:51,080 --> 00:27:53,639 Speaker 1: has an extremely high civilian casualty rate and is used 497 00:27:53,680 --> 00:27:56,280 Speaker 1: in a lot of theaters, obviously, in a lot 498 00:27:56,280 --> 00:27:59,560 Speaker 1: of theaters where they are primarily just massacring people either 499 00:27:59,640 --> 00:28:01,800 Speaker 1: fighting for their freedom or trying to survive. This is 500 00:28:01,840 --> 00:28:04,440 Speaker 1: the problem with it. Obviously, all of this tech will 501 00:28:04,480 --> 00:28:07,880 Speaker 1: also be used in generally positive things. Like, for example, 502 00:28:07,960 --> 00:28:11,200 Speaker 1: dropping a box like this into the hands of some 503 00:28:11,320 --> 00:28:15,760 Speaker 1: Ukrainian special forces guys to integrate them into a 504 00:28:15,840 --> 00:28:19,840 Speaker 1: more advanced command and control network so they have better 505 00:28:19,880 --> 00:28:22,960 Speaker 1: access to tactical data, that is not a thing I 506 00:28:23,000 --> 00:28:25,359 Speaker 1: specifically have a problem with. The problem 507 00:28:25,440 --> 00:28:30,200 Speaker 1: is, more broadly, Palantir. Do you 508 00:28:30,400 --> 00:28:33,000 Speaker 1: want to briefly explain, in case people are 509 00:28:33,040 --> 00:28:36,800 Speaker 1: not Lord of the Rings fans? So again, these are 510 00:28:36,840 --> 00:28:39,120 Speaker 1: all companies owned by Peter Thiel, who is a self- 511 00:28:39,200 --> 00:28:44,160 Speaker 1: described fascist, believes in ending democracy.
Believes that democracy and 512 00:28:44,200 --> 00:28:48,240 Speaker 1: freedom are not compatible, because freedom he defines specifically as 513 00:28:48,280 --> 00:28:50,440 Speaker 1: the ability of people with lots of money to not 514 00:28:50,520 --> 00:28:52,880 Speaker 1: have any kind of restrictions on their behavior or what 515 00:28:52,920 --> 00:28:55,920 Speaker 1: they can compel other people to do. Peter Thiel owns 516 00:28:56,040 --> 00:29:01,040 Speaker 1: Palantir and Anduril. Both of 517 00:29:01,040 --> 00:29:02,760 Speaker 1: those are names from Lord of the Rings, and in 518 00:29:02,840 --> 00:29:05,960 Speaker 1: Lord of the Rings, the palantír was an orb given 519 00:29:06,000 --> 00:29:10,440 Speaker 1: by the big bad guy, Sauron, to one of his lackeys, 520 00:29:10,440 --> 00:29:13,680 Speaker 1: a wizard named Saruman, so that he could surveil any 521 00:29:13,720 --> 00:29:16,320 Speaker 1: part of Middle-earth he wanted in order to send 522 00:29:16,840 --> 00:29:19,880 Speaker 1: his armies to crush the free peoples of the world. 523 00:29:20,280 --> 00:29:22,560 Speaker 1: Like, that is literally what this company is 524 00:29:22,640 --> 00:29:25,959 Speaker 1: named after. It is the bad guy's surveillance tech, used to 525 00:29:26,160 --> 00:29:28,880 Speaker 1: send the Uruk-hai against the free people of Middle-earth. 526 00:29:29,200 --> 00:29:35,760 Speaker 1: It is specifically something that only evil people use. 527 00:29:35,800 --> 00:29:38,280 Speaker 1: It's pretty cool that the whole company is named after that. 528 00:29:38,280 --> 00:29:41,240 Speaker 1: And there were all these very nice, polite people in 529 00:29:41,760 --> 00:29:47,719 Speaker 1: Patagonia-style vests with Palantir logos stitched on them, 530 00:29:47,800 --> 00:29:51,600 Speaker 1: standing around, happy to answer any of your questions.
531 00:29:52,240 --> 00:29:56,360 Speaker 1: Anyway, I'm curious as to why Melody Hildebrandt 532 00:29:56,360 --> 00:29:58,920 Speaker 1: was inside there, what the Chief Information Security Officer of 533 00:29:59,000 --> 00:30:01,240 Speaker 1: Fox would want to do with one of those vans. 534 00:30:01,400 --> 00:30:05,560 Speaker 1: That is curious. She's on Twitter. I 535 00:30:05,560 --> 00:30:08,680 Speaker 1: did reach out to her. But we also 536 00:30:08,720 --> 00:30:10,440 Speaker 1: saw a few of the robot dogs. We saw the 537 00:30:10,440 --> 00:30:13,120 Speaker 1: Boston Dynamics one, which was very impressive in 538 00:30:13,320 --> 00:30:16,880 Speaker 1: how it moves. Then we saw one much 539 00:30:17,160 --> 00:30:21,640 Speaker 1: cheaper model of a robot dog that 540 00:30:21,720 --> 00:30:24,400 Speaker 1: had not as great mobility, but it seemed to be 541 00:30:24,440 --> 00:30:28,280 Speaker 1: more suited towards the 542 00:30:28,360 --> 00:30:31,000 Speaker 1: style of dogs that we've seen law enforcement 543 00:30:31,040 --> 00:30:35,000 Speaker 1: start to buy: the cheaper ones with less flexibility, 544 00:30:35,640 --> 00:30:38,680 Speaker 1: more mounts to attach, you know, things to the top 545 00:30:38,720 --> 00:30:40,440 Speaker 1: of the robot, which you don't really see with 546 00:30:40,800 --> 00:30:43,920 Speaker 1: the Boston Dynamics ones. They do not like mounting extra 547 00:30:43,920 --> 00:30:46,840 Speaker 1: things on. But the other robot dog we saw 548 00:30:46,880 --> 00:30:48,680 Speaker 1: had this little arm 549 00:30:49,000 --> 00:30:51,320 Speaker 1: attached to the top. That was in 550 00:30:51,360 --> 00:30:54,320 Speaker 1: the robotics section, pretty close to Palantir.
That one was 551 00:30:54,400 --> 00:30:57,160 Speaker 1: much less impressive. Because we saw both robot dogs, 552 00:30:57,160 --> 00:30:59,640 Speaker 1: and these are, if you see video of a robot 553 00:30:59,680 --> 00:31:03,440 Speaker 1: dog that people are freaking out about online, these 554 00:31:03,440 --> 00:31:07,320 Speaker 1: are those robot dogs. The one we saw with 555 00:31:07,360 --> 00:31:09,800 Speaker 1: the arm on it did not move much. It was, number 556 00:31:09,800 --> 00:31:12,520 Speaker 1: one, controlled directly by a guy with a controller, it 557 00:31:12,560 --> 00:31:16,720 Speaker 1: was not autonomous, and it didn't move very smoothly. 558 00:31:17,280 --> 00:31:21,120 Speaker 1: Sitting in front of the Boston Dynamics Spot and 559 00:31:21,200 --> 00:31:24,760 Speaker 1: watching it move was really surreal. Number one, 560 00:31:24,760 --> 00:31:27,720 Speaker 1: we both talked about this, Garrison, it's like watching 561 00:31:27,880 --> 00:31:31,240 Speaker 1: CGI in real life, because it's so fine-tuned. Yeah, 562 00:31:31,320 --> 00:31:34,000 Speaker 1: it moves like a living thing, but clearly is not, 563 00:31:34,880 --> 00:31:36,680 Speaker 1: and it moves like a living thing enough that 564 00:31:36,760 --> 00:31:39,000 Speaker 1: it's not uncanny valley. That's not 565 00:31:39,040 --> 00:31:41,760 Speaker 1: the right way to describe it, because the movements 566 00:31:41,760 --> 00:31:46,200 Speaker 1: are kind of perfect. It's just not alive. It's 567 00:31:46,240 --> 00:31:49,600 Speaker 1: not uncanny valley. It's almost like, instead, 568 00:31:49,640 --> 00:31:53,640 Speaker 1: it's too perfect. Yeah, it's just so fine-tuned. 569 00:31:53,760 --> 00:31:56,120 Speaker 1: It was pretty impressive to watch.
570 00:31:56,240 --> 00:31:58,720 Speaker 1: It was very impressive, and it's become obvious to 571 00:31:58,720 --> 00:32:00,840 Speaker 1: me that one of the things that absolutely is 572 00:32:00,880 --> 00:32:03,880 Speaker 1: going on at Boston Dynamics is that they feel it 573 00:32:03,960 --> 00:32:06,640 Speaker 1: is important to them as 574 00:32:06,680 --> 00:32:09,680 Speaker 1: a business. Some of this may just be that 575 00:32:09,680 --> 00:32:11,480 Speaker 1: this is a personal challenge for a lot of these 576 00:32:11,520 --> 00:32:13,880 Speaker 1: engineering guys, but I suspect they also see it as 577 00:32:13,920 --> 00:32:18,600 Speaker 1: valuable to their business to replicate physical emotionality. And when 578 00:32:18,640 --> 00:32:20,920 Speaker 1: I talk about that: when you, like, watch a dog, right, 579 00:32:21,200 --> 00:32:23,320 Speaker 1: you can tell a dog's emotions from the way that 580 00:32:23,360 --> 00:32:26,560 Speaker 1: the dog moves, because that's how dogs work. The 581 00:32:26,720 --> 00:32:30,920 Speaker 1: robot dog expresses physical emotion. Obviously it doesn't feel emotion, 582 00:32:30,960 --> 00:32:34,600 Speaker 1: but it physically expresses emotion in a similar way to 583 00:32:34,760 --> 00:32:38,920 Speaker 1: a dog, like curiosity. They're very good at mimicking a 584 00:32:39,000 --> 00:32:41,760 Speaker 1: curious dog in the way its body language works, which 585 00:32:41,800 --> 00:32:55,200 Speaker 1: is really wild. Yeah, that would be one of the 586 00:32:55,280 --> 00:32:59,000 Speaker 1: things I did not like. I mean, it's impressive. 587 00:32:59,120 --> 00:33:01,480 Speaker 1: A lot of this stuff is objectively impressive. Most 588 00:33:01,520 --> 00:33:05,440 Speaker 1: of the other robotics we saw there was not that impressive.
589 00:33:05,560 --> 00:33:08,960 Speaker 1: Like, I saw this robot bartender that was making boba, 590 00:33:09,160 --> 00:33:12,120 Speaker 1: but it didn't know how, 591 00:33:12,320 --> 00:33:15,680 Speaker 1: or it wasn't able, to actually deliver the boba onto 592 00:33:15,720 --> 00:33:19,320 Speaker 1: the secondary robot that delivers the boba. So this 593 00:33:19,440 --> 00:33:21,920 Speaker 1: one robot with arms made the drink, a 594 00:33:22,000 --> 00:33:24,120 Speaker 1: human picked it up, inspected it, then put it on 595 00:33:24,160 --> 00:33:26,760 Speaker 1: a secondary robot, which then delivered the drink. And 596 00:33:27,080 --> 00:33:30,320 Speaker 1: this technology, I mean, I was eating at 597 00:33:30,360 --> 00:33:33,240 Speaker 1: a Burmese place in Portland a 598 00:33:33,240 --> 00:33:36,360 Speaker 1: few months ago where they were using this same food 599 00:33:36,440 --> 00:33:39,760 Speaker 1: delivery robot system. It's not brand new; it's 600 00:33:39,800 --> 00:33:43,080 Speaker 1: just becoming cheaper, and more people are trying to, like, 601 00:33:43,200 --> 00:33:45,560 Speaker 1: make it a thing. And there were a lot of 602 00:33:45,560 --> 00:33:47,120 Speaker 1: those types of things, a lot of, 603 00:33:47,160 --> 00:33:50,640 Speaker 1: like, R2-D2 on Jabba's sail barge, 604 00:33:51,040 --> 00:33:55,360 Speaker 1: delivering-drinks-style robots that are autonomous, like, they 605 00:33:55,400 --> 00:33:58,520 Speaker 1: do move themselves around, they don't need a remote controller, 606 00:33:59,160 --> 00:34:01,920 Speaker 1: but they're not that impressive.
But that was 607 00:34:01,960 --> 00:34:04,320 Speaker 1: like the majority of stuff in the robotics section. 608 00:34:04,360 --> 00:34:07,200 Speaker 1: There were a few other kinds of smaller rolling 609 00:34:07,280 --> 00:34:11,080 Speaker 1: robots that were there for, like, elderly people, like, 610 00:34:11,160 --> 00:34:13,440 Speaker 1: if someone falls down, this robot kind of goes 611 00:34:13,440 --> 00:34:16,120 Speaker 1: around and will help you. And yeah, I don't feel 612 00:34:16,160 --> 00:34:19,880 Speaker 1: like I'm well suited 613 00:34:19,920 --> 00:34:22,759 Speaker 1: to guess as to how well that 614 00:34:22,800 --> 00:34:26,520 Speaker 1: specific stuff would work. But I think, more broadly, talking about 615 00:34:26,520 --> 00:34:28,760 Speaker 1: autonomous tech, because that was one of the biggest product 616 00:34:28,840 --> 00:34:31,680 Speaker 1: categories at CES, it was all over the place. 617 00:34:31,840 --> 00:34:33,279 Speaker 1: There were a lot of cars, and a lot of 618 00:34:33,280 --> 00:34:36,920 Speaker 1: companies doing autonomous software and lidar solutions for cars. 619 00:34:37,640 --> 00:34:40,359 Speaker 1: I consider that all to be vaporware. There's a great 620 00:34:40,360 --> 00:34:44,680 Speaker 1: deal of evidence here: fully autonomous vehicles, in 621 00:34:44,719 --> 00:34:46,800 Speaker 1: the way that some of these companies are advertising, 622 00:34:46,840 --> 00:34:49,279 Speaker 1: simply do not 623 00:34:49,400 --> 00:34:51,200 Speaker 1: exist and will not exist.
And we did talk to 624 00:34:51,239 --> 00:34:53,960 Speaker 1: a couple of people. So again, the stuff that's 625 00:34:54,040 --> 00:34:57,680 Speaker 1: very real about autonomous tech: there's things like driver assistance, 626 00:34:57,719 --> 00:35:00,760 Speaker 1: so for, like, truck drivers, to allow them to strain 627 00:35:00,800 --> 00:35:04,560 Speaker 1: and stress themselves less while driving, and to help 628 00:35:04,560 --> 00:35:06,719 Speaker 1: make certain things, like backing up and parking, that can 629 00:35:06,760 --> 00:35:09,839 Speaker 1: be very difficult in certain environments, safer by having more 630 00:35:09,920 --> 00:35:12,520 Speaker 1: cameras and machine assistance. That makes sense. And one of 631 00:35:12,560 --> 00:35:14,200 Speaker 1: the people who worked at one of those companies said 632 00:35:14,239 --> 00:35:17,360 Speaker 1: to us, yeah, there's no such thing as autonomous 633 00:35:18,000 --> 00:35:20,760 Speaker 1: trucks or cars, like, they don't exist outside of very 634 00:35:20,760 --> 00:35:23,680 Speaker 1: tightly controlled conditions. All we are trying to do is 635 00:35:23,760 --> 00:35:26,400 Speaker 1: make truck driving safer and less stressful on the driver. 636 00:35:26,800 --> 00:35:32,240 Speaker 1: Which sounds great. I mean, obviously there's problems 637 00:35:32,320 --> 00:35:35,319 Speaker 1: with the way the trucking industry exists outside of that, 638 00:35:35,360 --> 00:35:38,479 Speaker 1: but that sounds, again, like one of those products meant 639 00:35:38,520 --> 00:35:44,319 Speaker 1: to actually mitigate worker fatigue and discomfort and potentially make 640 00:35:44,360 --> 00:35:46,640 Speaker 1: it safer. So I'm on board with that kind of stuff.
641 00:35:47,200 --> 00:35:50,960 Speaker 1: But other autonomous and smart tech that 642 00:35:51,000 --> 00:35:55,279 Speaker 1: we saw, like smart cars, 643 00:35:55,640 --> 00:35:58,120 Speaker 1: EVs, like electric vehicles, and autonomous stuff: there was some stuff at 644 00:35:58,160 --> 00:36:00,480 Speaker 1: the John Deere booth which was pushing towards automation, 645 00:36:00,520 --> 00:36:02,440 Speaker 1: like we talked about in the last episode. And then 646 00:36:02,480 --> 00:36:07,520 Speaker 1: also, their EV tractor just launched. So John 647 00:36:07,560 --> 00:36:10,560 Speaker 1: Deere, if you're not aware, has had a series of 648 00:36:10,640 --> 00:36:13,960 Speaker 1: long-running legal battles, particularly with farmers in Ukraine, over 649 00:36:14,000 --> 00:36:16,560 Speaker 1: the fact that they do not want it to be 650 00:36:17,239 --> 00:36:20,680 Speaker 1: possible or legal for you to repair your tractor if 651 00:36:20,680 --> 00:36:24,040 Speaker 1: you're a farmer. Farmers have, previously in history, often repaired 652 00:36:24,080 --> 00:36:27,520 Speaker 1: and fixed and modified their vehicles. This is 653 00:36:28,000 --> 00:36:31,440 Speaker 1: necessary: if a thing breaks, you can't always get it 654 00:36:31,480 --> 00:36:34,080 Speaker 1: back to a manufacturing facility in time. And a lot 655 00:36:34,080 --> 00:36:35,440 Speaker 1: of farms are 656 00:36:35,440 --> 00:36:36,919 Speaker 1: in the middle of nowhere, which is where 657 00:36:36,920 --> 00:36:40,520 Speaker 1: food comes from. And you also can't wait; 658 00:36:40,640 --> 00:36:42,799 Speaker 1: you can't just be like, well, let's just put harvesting 659 00:36:42,840 --> 00:36:46,480 Speaker 1: off for a week or two. That is a problem.
660 00:36:46,560 --> 00:36:49,560 Speaker 1: John Deere sees that as a severe threat to their profits, 661 00:36:49,560 --> 00:36:53,319 Speaker 1: and they have fought viciously in courts 662 00:36:53,360 --> 00:36:57,280 Speaker 1: to try to make it illegal to repair your own devices. 663 00:36:57,320 --> 00:36:58,840 Speaker 1: They have lost a lot of those fights in the 664 00:36:58,880 --> 00:37:01,960 Speaker 1: United States, and to its credit, the Biden administration has 665 00:37:02,000 --> 00:37:04,520 Speaker 1: taken a strong stance in favor of the right to repair. 666 00:37:05,080 --> 00:37:08,480 Speaker 1: And what we saw from John Deere at this CE 667 00:37:08,719 --> 00:37:11,839 Speaker 1: S was a bunch of very impressive autonomous products that 668 00:37:12,160 --> 00:37:15,640 Speaker 1: just coincidentally will also make it completely impossible to repair 669 00:37:15,640 --> 00:37:19,160 Speaker 1: your tractors. Like, specifically with the new EV tractor 670 00:37:19,239 --> 00:37:22,719 Speaker 1: that launched, so much of it is a computer that 671 00:37:22,920 --> 00:37:26,080 Speaker 1: it is impossible to repair unless you work for John Deere. 672 00:37:26,160 --> 00:37:28,920 Speaker 1: Like, when we asked them, hey, you know, 673 00:37:29,280 --> 00:37:31,400 Speaker 1: if this thing breaks down, how would a 674 00:37:31,440 --> 00:37:33,400 Speaker 1: farmer go about trying to fix this? Since 675 00:37:34,080 --> 00:37:36,399 Speaker 1: a lot of it is 676 00:37:36,400 --> 00:37:38,520 Speaker 1: not like motors and stuff from like a classic car. 677 00:37:39,040 --> 00:37:43,400 Speaker 1: It is computer driven. 678 00:37:43,440 --> 00:37:46,239 Speaker 1: And they're like, they just can't.
It's just 679 00:37:46,280 --> 00:37:50,400 Speaker 1: so complicated that an average person cannot repair this, like, 680 00:37:50,520 --> 00:37:53,959 Speaker 1: at all. It's just impossible. So that's 681 00:37:54,000 --> 00:37:58,800 Speaker 1: the way they are gonna try to get around 682 00:37:58,880 --> 00:38:01,200 Speaker 1: this right to repair issue. Yeah. 683 00:38:01,320 --> 00:38:03,080 Speaker 1: And it's being done under the guise of, 684 00:38:03,360 --> 00:38:06,000 Speaker 1: well, you know, by having this be much more advanced, 685 00:38:06,040 --> 00:38:07,920 Speaker 1: we can use a lot less pesticide, which is better 686 00:38:08,000 --> 00:38:11,680 Speaker 1: for the soil, better for everything, 687 00:38:11,760 --> 00:38:15,120 Speaker 1: less carbon. The farmer will have more time because the 688 00:38:15,200 --> 00:38:17,759 Speaker 1: vehicle can handle this autonomously, so that's eight hours the 689 00:38:17,800 --> 00:38:20,480 Speaker 1: farmer, you know, gets to spend doing something else. 690 00:38:20,560 --> 00:38:23,319 Speaker 1: All of this stuff is kind of meant 691 00:38:23,360 --> 00:38:25,279 Speaker 1: to distract from, like, well, I guess, yeah, maybe he'll 692 00:38:25,280 --> 00:38:28,880 Speaker 1: have more time, but also substantially less autonomy, and he'll be 693 00:38:28,920 --> 00:38:32,080 Speaker 1: completely dependent upon the John Deere Corporation in order to 694 00:38:32,280 --> 00:38:35,720 Speaker 1: produce the food that human beings need to survive.
695 00:38:35,760 --> 00:38:38,000 Speaker 1: I'm also gonna put it out there and say I 696 00:38:38,040 --> 00:38:40,200 Speaker 1: started this by saying that, like, one of the major 697 00:38:40,239 --> 00:38:42,040 Speaker 1: lawsuits was between John Deere and a 698 00:38:42,040 --> 00:38:45,920 Speaker 1: group of Ukrainian farmers, the same farmers presumably who 699 00:38:45,920 --> 00:38:47,960 Speaker 1: were towing a lot of Russian ordnance away with their 700 00:38:48,080 --> 00:38:53,080 Speaker 1: John Deere tractors. I don't know, it's that 701 00:38:53,160 --> 00:38:54,960 Speaker 1: kind of stuff. And one of the things that I 702 00:38:55,000 --> 00:38:57,480 Speaker 1: think, looking at a lot of this autonomous tech: some 703 00:38:57,600 --> 00:39:01,920 Speaker 1: of it's great, some of it could save lives. Some 704 00:39:02,080 --> 00:39:05,080 Speaker 1: of it, rather than, like, reducing the need for humans 705 00:39:05,120 --> 00:39:06,880 Speaker 1: to do work that it would be good if they 706 00:39:06,880 --> 00:39:09,320 Speaker 1: didn't have to do, will do just what you recognized: 707 00:39:09,400 --> 00:39:13,520 Speaker 1: create an even less human job for a human, like 708 00:39:13,640 --> 00:39:17,080 Speaker 1: taking drinks from a robot that makes drinks to a 709 00:39:17,200 --> 00:39:19,400 Speaker 1: robot that carries them to people, because we just 710 00:39:19,440 --> 00:39:23,120 Speaker 1: couldn't figure out that interstitial step. So your job as 711 00:39:23,160 --> 00:39:26,840 Speaker 1: a human being, as a member of 712 00:39:26,840 --> 00:39:31,400 Speaker 1: a species that spent millions of years evolving to 713 00:39:31,520 --> 00:39:34,840 Speaker 1: be capable of creating nearly anything, your job will be 714 00:39:34,880 --> 00:39:36,799 Speaker 1: to take a drink from one robot and set it 715 00:39:36,800 --> 00:39:38,640 Speaker 1: down at another. I mean,
the thing 716 00:39:38,719 --> 00:39:42,839 Speaker 1: is, like, we already had that same idea in factories. 717 00:39:42,880 --> 00:39:46,279 Speaker 1: Like, as factories have gone towards being more made 718 00:39:46,280 --> 00:39:48,600 Speaker 1: by machines, there are still factory workers who need to 719 00:39:48,600 --> 00:39:51,080 Speaker 1: do all these little in-between steps. So we're taking 720 00:39:51,080 --> 00:39:53,840 Speaker 1: this factory model and now just applying it to customer 721 00:39:53,880 --> 00:39:56,279 Speaker 1: service, doing the same thing: trying to automate it as 722 00:39:56,360 --> 00:39:58,640 Speaker 1: much as possible, and then only rely on humans for 723 00:39:58,640 --> 00:40:01,320 Speaker 1: all of these little in-between steps that, for some reason, 724 00:40:01,640 --> 00:40:04,000 Speaker 1: the robots and all of the autonomous tech isn't very 725 00:40:04,000 --> 00:40:06,680 Speaker 1: good at yet, or, you know, isn't really focused on completing. 726 00:40:07,360 --> 00:40:11,319 Speaker 1: And that's the main thing that humans are 727 00:40:11,320 --> 00:40:13,040 Speaker 1: going to be doing in the 728 00:40:13,200 --> 00:40:17,440 Speaker 1: autonomous boba store that's gonna come to your 729 00:40:17,440 --> 00:40:21,240 Speaker 1: neighborhood in like ten years. Speaking of bad things about 730 00:40:21,280 --> 00:40:23,800 Speaker 1: the future, or at least the present, let's talk about 731 00:40:23,800 --> 00:40:28,560 Speaker 1: Elon Musk's celebrity death tunnel.
So, if you're not aware, 732 00:40:28,680 --> 00:40:31,640 Speaker 1: one of Elon's companies, actually the company he started that 733 00:40:31,800 --> 00:40:34,720 Speaker 1: is based on his own legitimate ideas, is the Boring 734 00:40:34,760 --> 00:40:39,640 Speaker 1: Company, which makes big tubes underground so that 735 00:40:39,719 --> 00:40:44,080 Speaker 1: people can drive their individual cars through them and avoid traffic. Now, 736 00:40:44,080 --> 00:40:46,359 Speaker 1: Elon Musk is a man who takes his private jet 737 00:40:47,040 --> 00:40:50,160 Speaker 1: between airports in the same city in order to avoid traffic. 738 00:40:50,520 --> 00:40:52,520 Speaker 1: There is nothing he hates more than the idea of 739 00:40:52,520 --> 00:40:55,640 Speaker 1: being a normal person or being at all connected to 740 00:40:55,640 --> 00:40:57,640 Speaker 1: the lives of regular people, which is why you get 741 00:40:57,640 --> 00:41:00,680 Speaker 1: a private jet when you could just, like, fly 742 00:41:00,800 --> 00:41:02,759 Speaker 1: first class or something, because even if you're flying 743 00:41:02,880 --> 00:41:05,120 Speaker 1: first class, you're still going to an airport and through 744 00:41:05,160 --> 00:41:10,879 Speaker 1: security around, like, the poors. Elon has 745 00:41:10,920 --> 00:41:15,640 Speaker 1: been vociferous about his hatred of traffic and transit, 746 00:41:15,719 --> 00:41:18,160 Speaker 1: but also he hates public transit because you might sit 747 00:41:18,239 --> 00:41:21,440 Speaker 1: next to a serial killer. So his solution is: 748 00:41:21,520 --> 00:41:25,680 Speaker 1: dig holes underground and let people drive there. And 749 00:41:26,000 --> 00:41:28,560 Speaker 1: most of the cities that have attempted to have Boring 750 00:41:28,600 --> 00:41:31,319 Speaker 1: tunnels completed have been ghosted by the company. It is 751 00:41:31,440 --> 00:41:34,000 Speaker 1: kind of a con.
But they did build one 752 00:41:34,040 --> 00:41:37,359 Speaker 1: in Las Vegas, and Garrison and I used it, 753 00:41:37,400 --> 00:41:39,600 Speaker 1: and it took us from one side of the convention 754 00:41:39,640 --> 00:41:43,640 Speaker 1: center to the other. We potentially, if we had 755 00:41:43,760 --> 00:41:46,200 Speaker 1: made the most use of this service, we might 756 00:41:46,239 --> 00:41:49,880 Speaker 1: have gotten five to seven minutes that we didn't 757 00:41:49,920 --> 00:41:54,000 Speaker 1: have to walk. Just you and me alone inside 758 00:41:54,000 --> 00:41:56,480 Speaker 1: the Tesla, not having to be around other people in 759 00:41:57,280 --> 00:42:00,880 Speaker 1: the RGB tunnel. One of 760 00:42:00,880 --> 00:42:02,960 Speaker 1: the things Elon has literally said is, like, well, if 761 00:42:02,960 --> 00:42:04,560 Speaker 1: you take public transit, you might sit next to some 762 00:42:04,760 --> 00:42:07,319 Speaker 1: serial killer. The way this tunnel thing works is you 763 00:42:07,360 --> 00:42:09,520 Speaker 1: tell them whether you're going east or west, and they 764 00:42:09,760 --> 00:42:12,200 Speaker 1: put you in a Tesla that some dude is driving 765 00:42:12,200 --> 00:42:14,280 Speaker 1: that you don't know, and then they fill the Tesla 766 00:42:14,360 --> 00:42:17,200 Speaker 1: with other people that you also don't know. You're 767 00:42:17,200 --> 00:42:20,440 Speaker 1: still sitting next to a stranger, and you're in this 768 00:42:20,560 --> 00:42:23,200 Speaker 1: tube that is lit up the same way a pair 769 00:42:23,239 --> 00:42:26,920 Speaker 1: of, like, Razer gaming headphones is lit up, and 770 00:42:27,000 --> 00:42:30,880 Speaker 1: you're just slowly stuck in this tunnel with two random 771 00:42:30,920 --> 00:42:36,839 Speaker 1: people who you don't know.
It's horrible. Like, one thing 772 00:42:36,920 --> 00:42:39,319 Speaker 1: I feel like, obviously, if you're in, like, New York 773 00:42:39,400 --> 00:42:40,880 Speaker 1: or something, or Berlin. I've been in a lot of 774 00:42:40,880 --> 00:42:43,040 Speaker 1: cities where I've traveled on the underground, and I don't 775 00:42:43,080 --> 00:42:47,160 Speaker 1: feel scared traveling in the underground, because those have existed 776 00:42:47,200 --> 00:42:48,799 Speaker 1: for a very long time, and so we know what 777 00:42:48,920 --> 00:42:51,160 Speaker 1: happens when there's floods and when there's fires, and there's 778 00:42:51,200 --> 00:42:53,120 Speaker 1: a lot of systems built, which is why you don't 779 00:42:53,160 --> 00:42:56,800 Speaker 1: generally hear about a shitload of people dying in subways. 780 00:42:56,920 --> 00:43:00,279 Speaker 1: It's an extremely safe way to travel. This tunnel 781 00:43:00,760 --> 00:43:04,160 Speaker 1: is filled with vehicles that take, we know, about fifty 782 00:43:04,200 --> 00:43:06,920 Speaker 1: five gallons of water to put out a fire when 783 00:43:06,920 --> 00:43:10,080 Speaker 1: the battery catches fire, and the batteries on Teslas, we 784 00:43:10,160 --> 00:43:13,520 Speaker 1: also know, catch fire with some regularity, and you are 785 00:43:13,560 --> 00:43:17,120 Speaker 1: trapped in a tunnel. There is sometimes traffic. Near 786 00:43:17,200 --> 00:43:18,799 Speaker 1: the end of our ride, we wound up in a 787 00:43:18,840 --> 00:43:22,640 Speaker 1: line of like twenty Teslas, and that did not feel 788 00:43:22,640 --> 00:43:25,480 Speaker 1: good, because you can see nothing but Teslas 789 00:43:25,480 --> 00:43:29,000 Speaker 1: ahead of you and behind you, and you're surrounded entirely 790 00:43:29,239 --> 00:43:33,480 Speaker 1: by this tight, claustrophobic wall with absolutely no emergency exits visible.
791 00:43:33,680 --> 00:43:36,200 Speaker 1: Or fire suppression systems visible. I don't know what 792 00:43:36,200 --> 00:43:38,520 Speaker 1: they have installed, but you can't see anything. You cannot 793 00:43:38,560 --> 00:43:41,480 Speaker 1: see a thing. All you see is the Razer R 794 00:43:41,520 --> 00:43:45,839 Speaker 1: GB gaming mouse. And then, as soon 795 00:43:45,880 --> 00:43:47,880 Speaker 1: as we got off this thing that was supposed 796 00:43:47,880 --> 00:43:49,480 Speaker 1: to take us to, like, the central area, it just 797 00:43:49,520 --> 00:43:51,560 Speaker 1: took us to the other side of the convention center. 798 00:43:51,719 --> 00:43:53,400 Speaker 1: In order to actually get where we needed to go, 799 00:43:53,640 --> 00:43:55,680 Speaker 1: we just used the monorail, the thing that's been there 800 00:43:55,719 --> 00:43:59,160 Speaker 1: for a long time and looks fine. And monorails 801 00:43:59,160 --> 00:44:01,280 Speaker 1: are also not great ideas for a lot of reasons, 802 00:44:01,280 --> 00:44:02,759 Speaker 1: but it got us right to the other end of 803 00:44:02,800 --> 00:44:09,360 Speaker 1: the strip very quickly, conveniently, cleanly. It cost five dollars. 804 00:44:09,400 --> 00:44:14,200 Speaker 1: So good work, Elon. I love the tunnel. I hope 805 00:44:14,680 --> 00:44:17,279 Speaker 1: you're proud. Ringing, ringing endorsement. I can't wait for there 806 00:44:17,320 --> 00:44:19,200 Speaker 1: to be tunnels like that in every city. Don't worry, 807 00:44:19,239 --> 00:44:21,640 Speaker 1: they won't. The Boring Company is not a real company. 808 00:44:24,000 --> 00:44:27,560 Speaker 1: Yeah, anything else, Gare? I mean, we already talked 809 00:44:27,600 --> 00:44:30,839 Speaker 1: about the digital health stuff, which was a very big 810 00:44:30,880 --> 00:44:37,520 Speaker 1: part of CES.
Yeah, I think that's most 811 00:44:37,560 --> 00:44:41,719 Speaker 1: of what we want to touch on for now. Okay, 812 00:44:42,280 --> 00:44:45,200 Speaker 1: well, that's gonna just about do it for all of 813 00:44:45,280 --> 00:44:50,040 Speaker 1: us here at whatever show this is. We will 814 00:44:50,080 --> 00:44:52,840 Speaker 1: at some point have some stuff based on... oh yeah, actually, 815 00:44:52,920 --> 00:44:55,720 Speaker 1: let's end by, I want to end by talking about, 816 00:44:55,760 --> 00:44:58,040 Speaker 1: I guess, another good thing, but it's a good thing 817 00:44:58,080 --> 00:45:01,320 Speaker 1: that relates to the bad things. We ran across 818 00:45:01,520 --> 00:45:04,400 Speaker 1: a booth on our way out that on the first 819 00:45:04,440 --> 00:45:07,279 Speaker 1: day I had seen and I had thought was just, 820 00:45:07,560 --> 00:45:11,000 Speaker 1: like, I had assumed it was like a 821 00:45:11,520 --> 00:45:14,560 Speaker 1: GPS solution or something, because the company was called Off 822 00:45:14,640 --> 00:45:18,480 Speaker 1: Grid, and it's the OffGrid phone. We talked 823 00:45:18,480 --> 00:45:21,000 Speaker 1: to the founder of the company, Ben Wilson, who was 824 00:45:21,040 --> 00:45:24,160 Speaker 1: just a guy who, as he put it, does not 825 00:45:24,360 --> 00:45:28,920 Speaker 1: like that we consistently ceded more and more control 826 00:45:29,000 --> 00:45:34,560 Speaker 1: over our data and over our communications to large companies 827 00:45:34,560 --> 00:45:37,279 Speaker 1: and governments and whoever the fuck else gets access to 828 00:45:37,320 --> 00:45:41,000 Speaker 1: these massive, and not anonymous, data sets, and 829 00:45:41,080 --> 00:45:44,280 Speaker 1: wanted to build a thing for himself that could eventually 830 00:45:44,320 --> 00:45:48,360 Speaker 1: replace his smartphone.
So he and the company he 831 00:45:48,440 --> 00:45:51,560 Speaker 1: started have produced these, they're dumb phones at this moment, 832 00:45:51,960 --> 00:45:54,440 Speaker 1: that can text and can call and do encrypted end-to- 833 00:45:54,600 --> 00:45:57,840 Speaker 1: end communication. They also, if you are off grid, like 834 00:45:57,880 --> 00:45:59,840 Speaker 1: in the middle of nowhere, and you and your friends 835 00:45:59,880 --> 00:46:01,960 Speaker 1: had one of these, you can communicate through text, through 836 00:46:02,000 --> 00:46:04,400 Speaker 1: phone, to each other even if there is no network. Right. 837 00:46:04,440 --> 00:46:07,560 Speaker 1: The phones themselves do, like, make a network. They communicate 838 00:46:07,680 --> 00:46:09,880 Speaker 1: just to each other, just to each other. They 839 00:46:09,920 --> 00:46:12,000 Speaker 1: do not connect to the wider internet. Yeah, which 840 00:46:12,040 --> 00:46:14,839 Speaker 1: is really cool and potentially extremely useful. There's 841 00:46:14,880 --> 00:46:18,000 Speaker 1: a number of applications that this could have. Garrison, 842 00:46:18,000 --> 00:46:20,600 Speaker 1: you mentioned that the Atlanta Forest Defense people could benefit 843 00:46:20,640 --> 00:46:23,600 Speaker 1: from something like this, because effectively, they're about 844 00:46:23,600 --> 00:46:25,799 Speaker 1: two hundred bucks apiece. Anyone who can afford a few 845 00:46:25,800 --> 00:46:28,680 Speaker 1: of these, you can set up your own secure comms 846 00:46:28,760 --> 00:46:31,480 Speaker 1: network for wherever you are and whatever you're doing. And 847 00:46:31,480 --> 00:46:34,279 Speaker 1: the other...
The other feature of this is that 848 00:46:34,360 --> 00:46:37,920 Speaker 1: you can set it onto something called sheep mode, where basically, 849 00:46:38,320 --> 00:46:42,719 Speaker 1: if you suspect that someone who you 850 00:46:42,719 --> 00:46:45,120 Speaker 1: don't want to look at your phone, whether that's law enforcement, 851 00:46:45,160 --> 00:46:48,960 Speaker 1: whether that's random other people, might get it, you can set 852 00:46:49,000 --> 00:46:51,919 Speaker 1: it to this mode so that when they either 853 00:46:52,000 --> 00:46:56,040 Speaker 1: seize or gain possession of this device, all of 854 00:46:56,320 --> 00:46:59,279 Speaker 1: the data is immediately wiped before they can 855 00:46:59,280 --> 00:47:02,520 Speaker 1: actually open up the phone. And when they open 856 00:47:02,560 --> 00:47:05,800 Speaker 1: it up, they will see this fake profile, 857 00:47:05,920 --> 00:47:08,719 Speaker 1: not a fake profile, but, like, this 858 00:47:08,800 --> 00:47:12,560 Speaker 1: alternate profile called the sheep profile, which shows 859 00:47:13,320 --> 00:47:15,239 Speaker 1: not the stuff that you 860 00:47:15,239 --> 00:47:17,000 Speaker 1: were using the phone for. It can then just be 861 00:47:17,080 --> 00:47:19,919 Speaker 1: blank, or you could, like, stick other numbers in there. 862 00:47:20,000 --> 00:47:22,040 Speaker 1: You could have, like, a series of fake texts. 863 00:47:22,120 --> 00:47:24,480 Speaker 1: But if you ever regain possession of 864 00:47:24,480 --> 00:47:28,160 Speaker 1: the phone, you're able to put in 865 00:47:28,400 --> 00:47:32,280 Speaker 1: a special password that will... it will send the data...
866 00:47:32,960 --> 00:47:36,440 Speaker 1: It'll send the data, through encryption, back onto 867 00:47:36,440 --> 00:47:38,320 Speaker 1: this device, so you still have the things that you 868 00:47:38,320 --> 00:47:42,160 Speaker 1: would have lost. And obviously there's a degree of, like, 869 00:47:42,520 --> 00:47:44,879 Speaker 1: you would have to have some trust for the company. Yes, 870 00:47:44,960 --> 00:47:47,600 Speaker 1: and Ben says, like, we are 871 00:47:47,800 --> 00:47:49,680 Speaker 1: attempting to do this. He was very open about the 872 00:47:49,719 --> 00:47:51,919 Speaker 1: fact that they have the phones, we saw them, 873 00:47:51,960 --> 00:47:54,040 Speaker 1: like, some of this stuff is still getting built out. 874 00:47:54,080 --> 00:47:55,879 Speaker 1: It is still in development. They're still 875 00:47:55,920 --> 00:47:58,640 Speaker 1: figuring out different ways to keep the server secure, 876 00:47:58,880 --> 00:48:01,840 Speaker 1: to protect the servers from subpoenas from the American government 877 00:48:01,880 --> 00:48:04,359 Speaker 1: and from other governments. Like, this 878 00:48:04,440 --> 00:48:07,319 Speaker 1: is still something that is being worked on. It 879 00:48:07,400 --> 00:48:10,120 Speaker 1: was just one of the... you know, we see 880 00:48:10,120 --> 00:48:12,400 Speaker 1: a lot of, like, lofty promises 881 00:48:12,440 --> 00:48:15,400 Speaker 1: and very little to show for them.
This 882 00:48:15,480 --> 00:48:17,560 Speaker 1: is one of the things that, actually, you know, 883 00:48:17,680 --> 00:48:20,239 Speaker 1: just this one guy that had, you know, 884 00:48:20,320 --> 00:48:24,239 Speaker 1: some pretty relatable promises, and he's very open 885 00:48:24,280 --> 00:48:26,120 Speaker 1: about what they have done and what they haven't done 886 00:48:26,160 --> 00:48:28,160 Speaker 1: and what they're trying to do now. 887 00:48:28,280 --> 00:48:30,239 Speaker 1: He was not bullshitting. He wasn't trying to 888 00:48:30,680 --> 00:48:33,560 Speaker 1: overemphasize what it can do, or what it 889 00:48:33,600 --> 00:48:35,160 Speaker 1: can do at the moment, like, it's still being 890 00:48:35,200 --> 00:48:36,440 Speaker 1: worked on. But this is one of the 891 00:48:36,520 --> 00:48:38,359 Speaker 1: few, one of the future things that 892 00:48:38,400 --> 00:48:40,719 Speaker 1: we will want to follow up on, 893 00:48:41,160 --> 00:48:43,279 Speaker 1: and I think we're going to try to have Ben 894 00:48:43,320 --> 00:48:45,080 Speaker 1: on the show in the near future, because they're going 895 00:48:45,120 --> 00:48:47,360 Speaker 1: to be doing a Kickstarter to fund one of the 896 00:48:47,400 --> 00:48:49,879 Speaker 1: next phases of production of this. But you can 897 00:48:49,960 --> 00:48:52,239 Speaker 1: look them up yourself. You can buy the 898 00:48:52,320 --> 00:48:54,319 Speaker 1: version one of their product, which is on sale and 899 00:48:54,360 --> 00:48:58,759 Speaker 1: functional now, at off grid phone dot com, spelled the 900 00:48:58,800 --> 00:49:01,640 Speaker 1: way you would expect. So yeah, check out off 901 00:49:01,680 --> 00:49:04,640 Speaker 1: grid phone dot com. We found it interesting, we'll be 902 00:49:04,680 --> 00:49:08,160 Speaker 1: following up on that.
Ben gave me very strong, 903 00:49:08,480 --> 00:49:11,440 Speaker 1: the good kind of, libertarian vibes. Reminded me of a 904 00:49:11,440 --> 00:49:13,799 Speaker 1: couple of people I used to hang out with 905 00:49:13,840 --> 00:49:17,239 Speaker 1: in my youth, and it very much is that kind 906 00:49:17,280 --> 00:49:20,320 Speaker 1: of, like, product of just a cranky guy who knows 907 00:49:20,480 --> 00:49:24,360 Speaker 1: tech and is angry at all of the data being 908 00:49:24,440 --> 00:49:26,480 Speaker 1: sucked up, and all of the data that we just 909 00:49:26,560 --> 00:49:30,520 Speaker 1: kind of agree, together, we're going to give away to 910 00:49:30,680 --> 00:49:34,239 Speaker 1: unsavory characters, because life in the modern world is kind 911 00:49:34,280 --> 00:49:36,360 Speaker 1: of impossible if you don't do that. No, and like, 912 00:49:36,520 --> 00:49:39,000 Speaker 1: one of the things on his signs was 913 00:49:39,239 --> 00:49:41,680 Speaker 1: something along the lines of, don't let the po-po 914 00:49:41,960 --> 00:49:45,560 Speaker 1: look at your phone. So, like, yeah, it is 915 00:49:45,600 --> 00:49:48,560 Speaker 1: somebody who gets it. Yeah, yeah, we liked 916 00:49:48,600 --> 00:49:51,839 Speaker 1: Ben. So yeah, that is the dark 917 00:49:51,920 --> 00:49:55,560 Speaker 1: side of the future of tech, as this year's CE 918 00:49:55,800 --> 00:49:58,799 Speaker 1: S has unveiled it to us. You know, this 919 00:49:58,880 --> 00:50:01,440 Speaker 1: is also the conclusion of our reporting directly 920 00:50:01,440 --> 00:50:04,120 Speaker 1: on the convention itself. We will have some reporting in 921 00:50:04,160 --> 00:50:06,000 Speaker 1: the future that will be influenced by things we found 922 00:50:06,000 --> 00:50:07,600 Speaker 1: here that we're going to continue to look into.
And 923 00:50:07,719 --> 00:50:09,759 Speaker 1: we should have some 924 00:50:09,840 --> 00:50:11,600 Speaker 1: of the audio that we pulled from inside the convention 925 00:50:11,680 --> 00:50:15,000 Speaker 1: center that should be edited together sometime in the future. 926 00:50:15,080 --> 00:50:17,360 Speaker 1: Talk to Paler. That'll be fine. Yes, as a 927 00:50:17,360 --> 00:50:21,400 Speaker 1: little kind of documentary, little daily diary of 928 00:50:21,440 --> 00:50:23,640 Speaker 1: what we were actually doing on the ground. So that's 929 00:50:23,640 --> 00:50:26,080 Speaker 1: being worked on. But this is, as we're 930 00:50:26,080 --> 00:50:27,880 Speaker 1: recording right now, this is the final day of CES. 931 00:50:28,000 --> 00:50:32,080 Speaker 1: We are almost done. We are both very sore. CES 932 00:50:32,680 --> 00:50:35,439 Speaker 1: is surprisingly hard on your body. We have to enter 933 00:50:35,640 --> 00:50:38,000 Speaker 1: Eureka Park one more time, but then we will be finished, 934 00:50:38,640 --> 00:50:41,440 Speaker 1: and then we'll have to upload this and edit 935 00:50:41,560 --> 00:50:43,400 Speaker 1: the rest of the stuff we've made into a 936 00:50:43,440 --> 00:50:45,279 Speaker 1: little piece for you. So that is 937 00:50:45,280 --> 00:50:47,680 Speaker 1: still coming. You say we, which was very generous. You're 938 00:50:47,719 --> 00:50:50,440 Speaker 1: going to be doing that. Me and Daniel. I will 939 00:50:50,480 --> 00:50:54,160 Speaker 1: not be editing anything. I don't know how to. Anyway, 940 00:50:55,080 --> 00:51:02,440 Speaker 1: go to hell. I love you. It Could Happen Here 941 00:51:02,480 --> 00:51:05,120 Speaker 1: is a production of Cool Zone Media.
For more podcasts 942 00:51:05,120 --> 00:51:07,760 Speaker 1: from Cool Zone Media, visit our website, cool zone media 943 00:51:07,800 --> 00:51:09,600 Speaker 1: dot com, or check us out on the iHeart 944 00:51:09,680 --> 00:51:12,719 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to podcasts. 945 00:51:13,239 --> 00:51:15,360 Speaker 1: You can find sources for It Could Happen Here, updated 946 00:51:15,440 --> 00:51:18,920 Speaker 1: monthly, at cool zone media dot com slash sources. Thanks 947 00:51:18,960 --> 00:51:19,480 Speaker 1: for listening.