1 00:00:01,760 --> 00:00:02,880 Speaker 1: Cool Zone Media. 2 00:00:04,080 --> 00:00:07,080 Speaker 2: Hi everyone, it's James, coming at you with a pretty nasty 3 00:00:07,120 --> 00:00:10,520 Speaker 2: cold here. I wanted to share with you that wildfires 4 00:00:10,560 --> 00:00:13,399 Speaker 2: have swept through Los Angeles in the last couple of days. 5 00:00:13,640 --> 00:00:17,480 Speaker 2: While I'm recording this, thousands of people have been displaced. 6 00:00:17,960 --> 00:00:20,320 Speaker 2: Five people have died that we know of so far, 7 00:00:21,600 --> 00:00:24,640 Speaker 2: thousands of structures have been burned, and many, many people 8 00:00:24,640 --> 00:00:27,240 Speaker 2: in LA will be finding themselves out of their homes 9 00:00:27,240 --> 00:00:30,440 Speaker 2: with nowhere to go, with very few resources. If you'd 10 00:00:30,480 --> 00:00:32,640 Speaker 2: like to help, we've come up with some mutual aid 11 00:00:32,640 --> 00:00:35,479 Speaker 2: groups who you can donate to, and we'll be interviewing 12 00:00:35,520 --> 00:00:38,360 Speaker 2: one of them on this show next week. So if 13 00:00:38,360 --> 00:00:41,199 Speaker 2: you'd like to help, the three places where we suggest 14 00:00:41,200 --> 00:00:45,080 Speaker 2: you would donate some cash are the Sidewalk Project, that's 15 00:00:45,400 --> 00:00:52,559 Speaker 2: thesidewalkproject dot org; Ktown for All, that's letter K, 16 00:00:53,080 --> 00:00:56,520 Speaker 2: T O W N, F O R A L L, dot org; 17 00:00:57,360 --> 00:01:01,560 Speaker 2: and Aetna Street Solidarity, which you can find on Venmo or, 18 00:01:01,880 --> 00:01:05,120 Speaker 2: I think, on Instagram as well. That's A E T 19 00:01:05,360 --> 00:01:09,280 Speaker 2: N A, S T R E E T, S O 20 00:01:09,400 --> 00:01:12,280 Speaker 1: L I D A R I T Y. 21 00:01:12,800 --> 00:01:22,000 Speaker 2: All right, I'm gonna go rest my voice. 22 00:01:19,000 --> 00:01:22,559 Speaker 1: Oh man, welcome to It Could Happen Here, a podcast 23 00:01:22,640 --> 00:01:27,120 Speaker 1: that's happening here. If here is your ears; if you're 24 00:01:27,400 --> 00:01:30,560 Speaker 1: deaf and reading this, then it's happening to your eyes. 25 00:01:30,920 --> 00:01:33,240 Speaker 1: Either way, it's happening here. 26 00:01:33,800 --> 00:01:35,319 Speaker 3: Here also being Las Vegas. 27 00:01:35,520 --> 00:01:39,240 Speaker 1: Well, yes, also Las Vegas. That's Nevada, not the other one, Nevada. 28 00:01:39,240 --> 00:01:45,800 Speaker 1: Yeah, uh huh. Podcast number three. How 29 00:01:45,880 --> 00:01:49,360 Speaker 1: the time does fly. Sure does. By the time you 30 00:01:49,400 --> 00:01:51,720 Speaker 1: listen to this, Garrison and I will have just had 31 00:01:51,960 --> 00:01:54,720 Speaker 1: the best meal that we're going to have. Oh god, yeah, 32 00:01:54,760 --> 00:01:57,560 Speaker 1: it's tomorrow for us still, but we're still, we're very 33 00:01:57,600 --> 00:02:01,080 Speaker 1: excited about Morimoto, which is fantastic. Every year we 34 00:02:01,240 --> 00:02:05,080 Speaker 1: have a very special dinner, just them and me and 35 00:02:05,200 --> 00:02:08,360 Speaker 1: a couple of friends who will remain anonymous, because people 36 00:02:08,400 --> 00:02:09,400 Speaker 1: get weird on the internet. 37 00:02:09,440 --> 00:02:12,959 Speaker 3: Sometimes it is literally the highlight of my year. Sometimes 38 00:02:13,520 --> 00:02:16,000 Speaker 3: it does keep me going. Actually really gives me a 39 00:02:16,000 --> 00:02:17,720 Speaker 3: lot of power.
Some of the best tacos I've ever 40 00:02:17,800 --> 00:02:19,280 Speaker 3: had in my life, so good. 41 00:02:19,440 --> 00:02:20,760 Speaker 4: Uh huh. 42 00:02:20,800 --> 00:02:23,720 Speaker 1: Anyway, ah, we're just thinking about delicious food. Let's talk 43 00:02:23,720 --> 00:02:25,560 Speaker 1: about the dead eyed ghoul we met. Oh wait, no, 44 00:02:25,560 --> 00:02:28,440 Speaker 1: we're not doing that yet. We met a dead eyed ghoul that 45 00:02:28,480 --> 00:02:32,520 Speaker 1: I'm gonna spoil now: real monster, like real, real, real 46 00:02:32,760 --> 00:02:36,400 Speaker 1: evil vibes. Like, this guy, as soon as I 47 00:02:36,440 --> 00:02:38,359 Speaker 1: met him, shook his hand, I was like, oh, if you get, 48 00:02:38,360 --> 00:02:41,119 Speaker 1: if this guy gets power, you're going to be responsible 49 00:02:41,160 --> 00:02:42,639 Speaker 1: for a lot of death and suffering. 50 00:02:42,800 --> 00:02:44,520 Speaker 3: I mean, speaking of, he will. 51 00:02:44,560 --> 00:02:48,560 Speaker 1: He's just not that talented. He wishes. But you never 52 00:02:48,560 --> 00:02:49,760 Speaker 1: know where these guys are gonna end up. 53 00:02:49,760 --> 00:02:56,239 Speaker 3: Speaking of sad evil: uh, Twitter, X, the everything app, that's 54 00:02:56,280 --> 00:02:58,600 Speaker 3: what people are calling it. They gave a keynote, which 55 00:02:58,680 --> 00:02:59,519 Speaker 3: was very sad. 56 00:03:00,120 --> 00:03:04,080 Speaker 1: The CEO, Linda. Linda really Yaccarino'd about Twitter for a 57 00:03:04,120 --> 00:03:05,520 Speaker 1: while. Oh, so bad. 58 00:03:06,200 --> 00:03:11,639 Speaker 3: So they started by talking about how Facebook, Meta, has, 59 00:03:11,680 --> 00:03:15,520 Speaker 3: has copied Twitter's, like, fact checking policy of actually not 60 00:03:15,560 --> 00:03:19,120 Speaker 3: having real fact checks. Yes, now maybe it has actually kind 61 00:03:19,120 --> 00:03:22,080 Speaker 3: of failed as an industry, but, you know, our 62 00:03:22,120 --> 00:03:25,919 Speaker 3: problems perhaps with fact checking are very different from these people's problems, 63 00:03:26,360 --> 00:03:29,520 Speaker 3: and the fact now that, that Facebook is walking away 64 00:03:29,880 --> 00:03:32,640 Speaker 3: from actual, like, genuine, like, fact checks against just 65 00:03:32,680 --> 00:03:36,880 Speaker 3: disinformation, misinformation, and parting ways with, like, using, like, legacy 66 00:03:36,920 --> 00:03:39,320 Speaker 3: media outlets to verify information, because those media outlets 67 00:03:39,360 --> 00:03:42,840 Speaker 3: are too political, quote unquote, and instead is copying the, 68 00:03:43,040 --> 00:03:46,720 Speaker 3: the current X model of free speech, and specifically saying, 69 00:03:46,880 --> 00:03:50,440 Speaker 3: like, there's been way too much censorship on gender issues. 70 00:03:50,800 --> 00:03:53,520 Speaker 1: Now you can comment that women are a piece of property. 71 00:03:53,960 --> 00:03:55,640 Speaker 3: Well, I mean, I think specifically this is, this is, 72 00:03:55,680 --> 00:03:57,440 Speaker 3: like, trans, like, no, no, no stuff too. 73 00:03:57,680 --> 00:04:00,160 Speaker 1: One of the things that is a specific exemption now is 74 00:04:00,160 --> 00:04:02,120 Speaker 1: that you can now refer to women as if they 75 00:04:02,160 --> 00:04:04,560 Speaker 1: are property on Facebook. 76 00:04:05,080 --> 00:04:07,360 Speaker 3: This is the future of communication, right?
77 00:04:07,320 --> 00:04:10,440 Speaker 1: Yeah, thank god Linda is really blazing a trail for 78 00:04:10,520 --> 00:04:11,320 Speaker 1: women everywhere. 79 00:04:11,480 --> 00:04:15,680 Speaker 3: Linda was very excited about that, and they, uh, Yaccarino'd 80 00:04:15,880 --> 00:04:18,719 Speaker 3: about that for, like, a good, a good ten minutes, 81 00:04:19,360 --> 00:04:21,160 Speaker 3: about how, you know, this is, this is where we're 82 00:04:21,200 --> 00:04:24,120 Speaker 3: really entering a new era of free speech and social media. 83 00:04:24,720 --> 00:04:28,159 Speaker 3: And then she got asked a question about how much 84 00:04:28,440 --> 00:04:31,960 Speaker 3: X, Twitter, the everything app, will, will take a part 85 00:04:32,120 --> 00:04:37,120 Speaker 3: in Elon Musk's plans for the Department of Government Efficiency, DOGE. 86 00:04:37,480 --> 00:04:40,960 Speaker 3: And this got the, the first applause of the panel. 87 00:04:41,040 --> 00:04:43,880 Speaker 3: Applause only happens two times. The DOGE section 88 00:04:44,120 --> 00:04:47,480 Speaker 3: was the first, like, you know, room, room starts clapping moment, 89 00:04:47,600 --> 00:04:48,920 Speaker 3: everyone goes crazy. 90 00:04:49,360 --> 00:04:51,040 Speaker 1: How many minutes in was that? 91 00:04:51,440 --> 00:04:53,600 Speaker 3: Oh, maybe it was, like, maybe like, maybe like twelve, 92 00:04:53,640 --> 00:04:54,240 Speaker 3: thirteen minutes? 93 00:04:54,320 --> 00:04:54,599 Speaker 5: People? 94 00:04:54,640 --> 00:04:56,839 Speaker 1: Really? Yeah, that had to, had to be intentional here. 95 00:04:56,880 --> 00:04:59,120 Speaker 1: This is not like they were just overdue for claps. 96 00:04:59,200 --> 00:05:02,039 Speaker 3: No, no, no. Talked about Vivek, talked about, you know, 97 00:05:02,080 --> 00:05:05,120 Speaker 3: Elon turning to Twitter, X, the everything app, for, like, 98 00:05:05,200 --> 00:05:09,000 Speaker 3: suggestions on which government agencies to get rid of. 99 00:05:11,080 --> 00:05:12,720 Speaker 1: I hope we get rid of the ATF. 100 00:05:13,440 --> 00:05:15,000 Speaker 3: So, so that, that was... 101 00:05:15,040 --> 00:05:18,039 Speaker 1: Machine guns mandatory? Why not at this point, right? It 102 00:05:18,080 --> 00:05:21,359 Speaker 1: can only help, it can only help. Look, if we 103 00:05:21,440 --> 00:05:24,320 Speaker 1: learned anything from a thing, I'm not going to specify, 104 00:05:24,320 --> 00:05:28,760 Speaker 1: that happened late last year, more suppressors is always handy. 105 00:05:33,160 --> 00:05:35,039 Speaker 3: The second thing that got applause was what they talked 106 00:05:35,080 --> 00:05:39,360 Speaker 3: about next, was about, you know, everyone's, everyone's turning to 107 00:05:39,760 --> 00:05:43,640 Speaker 3: X, Twitter, everything, everything app, for information now, and, 108 00:05:43,640 --> 00:05:46,240 Speaker 3: and Twitter, X, everything app, played a crucial part in 109 00:05:46,279 --> 00:05:50,440 Speaker 3: bringing to light the Muslim rape gang story in the 110 00:05:50,560 --> 00:05:53,960 Speaker 3: UK, and how that was so important for saving children, 111 00:05:54,279 --> 00:05:56,920 Speaker 3: and we have to, we have to post more, not less. 112 00:05:57,520 --> 00:05:59,600 Speaker 3: And, like, this was the other thing that got massive 113 00:05:59,600 --> 00:06:03,240 Speaker 3: applause, was talking about the rape gangs. 114 00:06:03,400 --> 00:06:06,000 Speaker 1: People love rape gangs, people love rape gangs.
That, that 115 00:06:06,080 --> 00:06:08,280 Speaker 1: was a pretty good Star Trek episode. 116 00:06:08,360 --> 00:06:10,520 Speaker 3: That was Tasha Yar's planet with the rape gangs, one of, 117 00:06:10,720 --> 00:06:12,400 Speaker 3: one of the more black pilling things. 118 00:06:13,120 --> 00:06:14,800 Speaker 1: It wasn't a very good Star Trek episode. 119 00:06:14,920 --> 00:06:17,440 Speaker 3: It's also not a good Trek episode. I was referring 120 00:06:17,440 --> 00:06:20,640 Speaker 3: to the panel, not the Trek episode. But that was 121 00:06:20,640 --> 00:06:22,440 Speaker 3: the other thing that got massive applause, is it's, like, 122 00:06:22,520 --> 00:06:25,839 Speaker 3: save the children type rhetoric, and, you know, saying, you know, 123 00:06:25,880 --> 00:06:28,400 Speaker 3: like, as a mother, it's, it's so important that the 124 00:06:28,520 --> 00:06:32,360 Speaker 3: more people post about this problem. Those were the two 125 00:06:32,400 --> 00:06:35,359 Speaker 3: big applause moments. But I think in general, this, this 126 00:06:35,400 --> 00:06:38,760 Speaker 3: whole panel was trying to, like, you know, demonstrate how 127 00:06:38,800 --> 00:06:42,320 Speaker 3: symbiotic a new Trump presidency and Elon Musk's 128 00:06:42,040 --> 00:06:45,800 Speaker 1: Twitter are. This is a line, this is a tap from 129 00:06:45,800 --> 00:06:46,919 Speaker 1: the Trump presidency. 130 00:06:47,320 --> 00:06:49,159 Speaker 3: This is how you talk to the new government. Like, 131 00:06:49,320 --> 00:06:51,680 Speaker 3: this is how you talk to all of these new people, 132 00:06:51,760 --> 00:06:54,080 Speaker 3: all these new cabinet members. They're all on Twitter. They're 133 00:06:54,120 --> 00:06:56,080 Speaker 3: all talking on Twitter. This is, this is how you 134 00:06:56,120 --> 00:06:58,440 Speaker 3: stay connected to the new government. 135 00:06:58,600 --> 00:07:01,200 Speaker 1: It's interesting. One thing I'm curious about: so 136 00:07:01,240 --> 00:07:03,320 Speaker 1: this is a thing that happened with the last set of 137 00:07:03,440 --> 00:07:06,120 Speaker 1: Nazis that gained power in a country in a big way, 138 00:07:06,600 --> 00:07:10,160 Speaker 1: the German ones. There was this common attitude of, like, 139 00:07:10,200 --> 00:07:13,320 Speaker 1: if only Hitler knew. Because Nazi policies didn't help the 140 00:07:13,320 --> 00:07:15,120 Speaker 1: people they were supposed to help. They hurt a lot 141 00:07:15,120 --> 00:07:17,560 Speaker 1: of people. Like, they were just bad at everything, like 142 00:07:17,640 --> 00:07:20,160 Speaker 1: fascists tend to be. And there was this attitude that, like, well, 143 00:07:20,240 --> 00:07:24,040 Speaker 1: Hitler can't know. Like, the fact that, like, we, the 144 00:07:24,040 --> 00:07:26,320 Speaker 1: country's been handed over to gangsters who were continuing to 145 00:07:26,400 --> 00:07:28,440 Speaker 1: hurt the people Hitler promised to help, he must not 146 00:07:28,520 --> 00:07:30,680 Speaker 1: be aware. Like, if he knew, he would fix it, if 147 00:07:30,800 --> 00:07:33,360 Speaker 1: only he knew. So I'm wondering how that's going to 148 00:07:33,480 --> 00:07:36,760 Speaker 1: play in here.
As Trump's policies continue to hurt the 149 00:07:36,800 --> 00:07:39,160 Speaker 1: people who, a lot of the people who voted for him, 150 00:07:39,160 --> 00:07:41,040 Speaker 1: not the rich people who voted for him, but the 151 00:07:41,080 --> 00:07:44,080 Speaker 1: people who, like, flipped between him and Biden or whatever, like, 152 00:07:44,680 --> 00:07:46,840 Speaker 1: those folks are going to get fucked like the rest 153 00:07:46,920 --> 00:07:50,440 Speaker 1: of us. And I kind of wonder if they're going 154 00:07:50,520 --> 00:07:53,600 Speaker 1: to, if there's going to be, when the blowback 155 00:07:54,040 --> 00:07:57,120 Speaker 1: against X the everything app will happen, right? Like, yeah, 156 00:07:57,240 --> 00:08:01,160 Speaker 1: as people are, like, either being, or seeing people being, 157 00:08:01,320 --> 00:08:05,040 Speaker 1: called, like, a retard by Elon Musk for complaining, that, 158 00:08:05,240 --> 00:08:09,680 Speaker 1: like, Elon Musk tweets it out randomly at people when 159 00:08:09,720 --> 00:08:13,200 Speaker 1: they make very valid critiques of the shit that he's doing. 160 00:08:13,800 --> 00:08:16,560 Speaker 1: Like, that's literally what he's calling them. He's saying it, like, 161 00:08:16,600 --> 00:08:18,840 Speaker 1: every day, like, constantly. I'm not, I'm not using it 162 00:08:18,880 --> 00:08:21,280 Speaker 1: as a slur. That's just the term he's using. If 163 00:08:21,280 --> 00:08:24,680 Speaker 1: they comment that, like, their fucking Medicaid got cut because 164 00:08:24,680 --> 00:08:27,760 Speaker 1: Trump put Doctor Oz in charge of it, and Elon 165 00:08:27,840 --> 00:08:30,840 Speaker 1: Musk calls them, like, you know, a slur, what does 166 00:08:30,880 --> 00:08:32,000 Speaker 1: that do to you? 167 00:08:32,160 --> 00:08:33,120 Speaker 5: Like, they, like, I don't even know. 168 00:08:33,160 --> 00:08:35,160 Speaker 1: I don't even have anything more intelligent than, like, yeah, 169 00:08:35,200 --> 00:08:37,120 Speaker 1: I wonder what that does to Twitter's bottom line. 170 00:08:37,360 --> 00:08:39,960 Speaker 3: Yeah, I mean, yeah, I'm not sure if they care anymore. 171 00:08:39,960 --> 00:08:42,840 Speaker 3: I mean, something else Linda talked about is how Twitter's 172 00:08:42,840 --> 00:08:46,520 Speaker 3: the only place for independent news to spread. And as 173 00:08:47,000 --> 00:08:48,880 Speaker 3: both of us have, you know, worked in the independent 174 00:08:48,960 --> 00:08:52,760 Speaker 3: journalism mines: nothing, nothing spreads on Twitter anymore. 175 00:08:52,800 --> 00:08:54,839 Speaker 1: No. If it's news, it doesn't. The only 176 00:08:54,840 --> 00:08:57,440 Speaker 1: thing that spreads is, yeah, like, the shit that makes 177 00:08:57,440 --> 00:09:01,080 Speaker 1: people very angry but keeps them on the site. Like, articles, videos, 178 00:09:01,080 --> 00:09:02,520 Speaker 1: if it takes you off site, it doesn't get seen. 179 00:09:02,520 --> 00:09:04,600 Speaker 3: Yeah, the things that go viral and get spread 180 00:09:04,679 --> 00:09:08,520 Speaker 3: is, like, encouraging racial bias, yes, pogroms essentially. 181 00:09:08,240 --> 00:09:10,520 Speaker 1: Yeah, which is what happened last year in the UK, 182 00:09:10,600 --> 00:09:12,000 Speaker 1: and they're sure trying to do it again.
183 00:09:12,080 --> 00:09:13,920 Speaker 3: I mean, I, I think some of, some of 184 00:09:13,960 --> 00:09:15,400 Speaker 3: what she's referencing is, you know, there's a lot of, 185 00:09:15,440 --> 00:09:19,160 Speaker 3: like, throttling, intentionally, of, you know, people of maybe our proclivities, 186 00:09:19,400 --> 00:09:21,840 Speaker 3: and there is a degree of boosting for more, you know, 187 00:09:21,920 --> 00:09:25,199 Speaker 3: centrist or right wing journalists. And maybe that's, that's some 188 00:09:25,280 --> 00:09:27,439 Speaker 3: of, some of what they could be kind of more, 189 00:09:27,559 --> 00:09:30,320 Speaker 3: more referring to there. But, you know, it was, it 190 00:09:30,360 --> 00:09:33,560 Speaker 3: was a short keynote, only thirty minutes. Just the two 191 00:09:33,600 --> 00:09:36,000 Speaker 3: things that got applause are DOGE, 192 00:09:35,800 --> 00:09:37,839 Speaker 1: Well, Linda doesn't know that many words, so they really 193 00:09:37,880 --> 00:09:39,240 Speaker 1: need to keep it under thirty minutes. 194 00:09:39,840 --> 00:09:42,800 Speaker 3: And literally Muslim rape gangs. It's, you know, this, this 195 00:09:42,880 --> 00:09:46,200 Speaker 3: type of, like, very, very gross racial fear mongering, 196 00:09:46,559 --> 00:09:48,520 Speaker 3: and those are the things that, like, lit up the room. 197 00:09:49,000 --> 00:09:51,000 Speaker 1: You know, we all want there to be an after, 198 00:09:51,320 --> 00:09:55,680 Speaker 1: where there's even the minimal degree of accountability that happened 199 00:09:55,720 --> 00:09:58,920 Speaker 1: after the Nazis. But, like, what I try to think in 200 00:09:58,920 --> 00:10:02,160 Speaker 1: my darker moments is, like, well, that's another person who, 201 00:10:02,200 --> 00:10:04,880 Speaker 1: like, really made the argument of, like, what needs to 202 00:10:04,960 --> 00:10:10,440 Speaker 1: happen when this ends. Because it's just: I want to 203 00:10:10,600 --> 00:10:14,160 Speaker 1: hurt people. My business is enabling harm. I want to 204 00:10:14,200 --> 00:10:19,480 Speaker 1: get mobs in the street beating migrants. Like, that's Linda's business. 205 00:10:19,760 --> 00:10:23,560 Speaker 1: That's the business she has willfully attached herself to. And 206 00:10:23,600 --> 00:10:27,400 Speaker 1: we should all see that. It's very important to not 207 00:10:27,640 --> 00:10:30,600 Speaker 1: stop talking about it like what it is. These people 208 00:10:30,679 --> 00:10:34,480 Speaker 1: are trying to cause racial violence, and they are trying 209 00:10:34,520 --> 00:10:38,040 Speaker 1: to cause gendered violence, and they are trying to cause 210 00:10:38,720 --> 00:10:43,040 Speaker 1: harm at scale to communities of people that they see 211 00:10:43,280 --> 00:10:49,200 Speaker 1: financial profit in damaging. Well, in other uplifting CES news, 212 00:10:49,280 --> 00:10:52,319 Speaker 1: cool stuff. I love the Consumer Electronics Show. 213 00:10:54,320 --> 00:10:57,040 Speaker 3: Actually, I think it might be time for an ad break. 214 00:10:56,960 --> 00:11:02,320 Speaker 1: Speaking of damaging communities of people, right? There's a chance. Yeah, ads. 215 00:11:02,360 --> 00:11:16,320 Speaker 1: Oh, well, we're back. Boy, I'm so glad that those 216 00:11:16,360 --> 00:11:19,400 Speaker 1: ads told me that Fragaccio Blow is touring with Bono.
217 00:11:19,880 --> 00:11:23,040 Speaker 1: I never thought they'd do it, but boy howdy, and 218 00:11:23,080 --> 00:11:26,800 Speaker 1: they're singing each other's songs, so, you know, that's really exciting. 219 00:11:26,840 --> 00:11:28,680 Speaker 1: It's like when Barbra did Celine. 220 00:11:29,080 --> 00:11:31,079 Speaker 3: I don't know who Barbra or Celine is, but that's, 221 00:11:31,080 --> 00:11:34,240 Speaker 3: oh my god, that's cool, Robert. Luckily, I do know 222 00:11:34,280 --> 00:11:37,240 Speaker 3: what ska is. I consider myself a they of culture. 223 00:11:37,720 --> 00:11:42,040 Speaker 3: And tonight, me and Robert attended this kind of, 224 00:11:42,120 --> 00:11:46,800 Speaker 3: like, side event at CES called Showstoppers. And as you 225 00:11:46,840 --> 00:11:50,160 Speaker 3: walk around the CES floor, there's a lot of, frankly, garbage. 226 00:11:50,160 --> 00:11:53,360 Speaker 3: There's a lot of just, like, mostly garbage stuff that, 227 00:11:53,360 --> 00:11:54,520 Speaker 3: you know, or stuff 228 00:11:54,280 --> 00:11:56,520 Speaker 1: that, like, you're just not interested in, because you're literally 229 00:11:56,559 --> 00:11:59,719 Speaker 1: buying, like, screens from a manufacturer in China. Like, it's 230 00:11:59,720 --> 00:12:01,480 Speaker 1: like that, which is not the business you're in, because 231 00:12:01,640 --> 00:12:03,360 Speaker 1: some of this stuff is meant for companies. 232 00:12:03,400 --> 00:12:06,600 Speaker 3: So much floor space. Like, there's, like, I, we walked... 233 00:12:06,600 --> 00:12:09,000 Speaker 1: The tiny town that I spent the first seven years of 234 00:12:09,000 --> 00:12:11,640 Speaker 1: my life in is smaller than one of the rooms at CES. 235 00:12:12,840 --> 00:12:15,839 Speaker 3: It's across, like, three hotels and a massive convention center. 236 00:12:16,280 --> 00:12:18,520 Speaker 1: Ninety thousand people come into town for this thing. 237 00:12:18,559 --> 00:12:21,160 Speaker 3: It can be hard to, like, see everything you want to. Now, 238 00:12:21,200 --> 00:12:23,880 Speaker 3: what's cool about Showstoppers, this is the side event at 239 00:12:23,920 --> 00:12:27,560 Speaker 3: the Bellagio, is that basically it's a room full of 240 00:12:27,679 --> 00:12:29,599 Speaker 3: kind of all the coolest stuff, a whole bunch of 241 00:12:29,600 --> 00:12:33,120 Speaker 3: stuff that has won CES Innovation Awards, all packed into 242 00:12:33,160 --> 00:12:37,319 Speaker 3: one room with food and alcohol. So, oh boy, did I 243 00:12:37,400 --> 00:12:38,960 Speaker 1: order free food and free alcohol? 244 00:12:39,080 --> 00:12:41,160 Speaker 3: So many drinks that I then just left on tables, 245 00:12:43,040 --> 00:12:43,280 Speaker 3: and it... 246 00:12:43,400 --> 00:12:45,400 Speaker 1: Always pretty good food, pretty good food. 247 00:12:46,000 --> 00:12:48,679 Speaker 3: So we walked around Showstoppers, and there was a 248 00:12:48,760 --> 00:12:52,000 Speaker 3: number of pretty, pretty cool stuff that we saw. Yeah, 249 00:12:52,040 --> 00:12:54,720 Speaker 3: but I think, I think it's maybe time to talk 250 00:12:54,760 --> 00:12:57,240 Speaker 3: about the saddest, the saddest man. 251 00:12:57,280 --> 00:13:00,560 Speaker 1: The villain, the villain, the villain of the episode and 252 00:13:00,600 --> 00:13:03,800 Speaker 1: of this year's CES. I have trouble... can you 253 00:13:03,800 --> 00:13:05,400 Speaker 1: bring up their name?
Because I'm gonna want to get 254 00:13:05,440 --> 00:13:09,160 Speaker 1: this right, so we could be dangerous. But we had, 255 00:13:09,360 --> 00:13:11,000 Speaker 1: neither of us had eaten, and I had had, like, 256 00:13:11,000 --> 00:13:14,240 Speaker 1: a hot dog eight hours ago and walked literally nineteen 257 00:13:14,320 --> 00:13:17,160 Speaker 1: thousand steps, and also done forty minutes of push ups 258 00:13:17,200 --> 00:13:19,640 Speaker 1: in between, so I was starving. So we, we, like, 259 00:13:19,960 --> 00:13:22,240 Speaker 1: shovel food into our faces, and we turn, and the first 260 00:13:22,240 --> 00:13:26,599 Speaker 1: booth we see is called Open Droid. Open Droid, or 261 00:13:26,640 --> 00:13:29,440 Speaker 1: Open Droids? Droids, yes, it does. There is an S: 262 00:13:29,720 --> 00:13:32,959 Speaker 1: Open Droids. And it's, like, kind of a Star Wars-y 263 00:13:33,040 --> 00:13:36,160 Speaker 1: font, it is. And I did ask them if, you 264 00:13:36,200 --> 00:13:39,320 Speaker 1: know, they had any issues with Lucasfilm. Apparently not yet. 265 00:13:39,800 --> 00:13:42,280 Speaker 1: Sue them, Lucasfilm. By the way, sue these kids. 266 00:13:42,360 --> 00:13:44,600 Speaker 3: I know there's people who work for Lucasfilm who listen to 267 00:13:44,640 --> 00:13:49,600 Speaker 1: this. Crush them, burn them, like Los Angeles is burning 268 00:13:49,640 --> 00:13:50,600 Speaker 1: down as we speak. 269 00:13:50,720 --> 00:13:52,280 Speaker 3: They had a giant sign that said 270 00:13:52,200 --> 00:13:53,160 Speaker 5: R two D three. 271 00:13:53,440 --> 00:13:55,240 Speaker 1: Yeah, that's the name of the robot that they're selling. 272 00:13:55,280 --> 00:13:57,280 Speaker 1: And the robot, the robot that they're selling, is like 273 00:13:57,360 --> 00:14:03,560 Speaker 1: an AI enabled home helping slash, like, retail, you know, 274 00:14:04,000 --> 00:14:07,199 Speaker 1: like, you know, robot, where it basically is, like, a 275 00:14:07,280 --> 00:14:12,480 Speaker 1: human torso with articulated arms and pincer hands on it. And 276 00:14:12,520 --> 00:14:15,959 Speaker 1: then the base is, like, a little tank, basically. It's 277 00:14:16,000 --> 00:14:18,680 Speaker 1: got, like, treads or wheels, and it rolls on its wheels. Yeah, 278 00:14:18,679 --> 00:14:21,880 Speaker 1: and then the torso, there's, like, a tall, maybe six 279 00:14:21,920 --> 00:14:25,520 Speaker 1: foot tall, like, pillar built into this, like, rolling base 280 00:14:26,280 --> 00:14:28,880 Speaker 1: that the torso slides up and down on. And this 281 00:14:29,080 --> 00:14:32,240 Speaker 1: was their way of not making, like, what Musk is 282 00:14:32,240 --> 00:14:34,360 Speaker 1: trying to do, right, a humanoid robot where you have 283 00:14:34,440 --> 00:14:36,880 Speaker 1: to figure out, like, knees and balance and stuff. It's 284 00:14:36,880 --> 00:14:39,840 Speaker 1: like, no, or like Boston Dynamics. Wheels, right? Wheels are cheap. 285 00:14:39,880 --> 00:14:42,480 Speaker 1: It can roll. It works in most situations, you know. 286 00:14:42,560 --> 00:14:45,080 Speaker 1: And then, but you still have the ability for it 287 00:14:45,120 --> 00:14:48,240 Speaker 1: to articulate and go up higher or go down lower, 288 00:14:48,600 --> 00:14:51,240 Speaker 1: like something that can crouch, but it's much simpler. You 289 00:14:51,240 --> 00:14:53,760 Speaker 1: don't have to deal with nearly as much.
And so 290 00:14:53,800 --> 00:14:55,920 Speaker 1: I saw that, and I'm like, oh, well, that's at least 291 00:14:55,920 --> 00:14:57,880 Speaker 1: somebody who's thinking about, like, how do we make something 292 00:14:57,920 --> 00:15:00,280 Speaker 1: like this, like, more affordable and less complicated, less to 293 00:15:00,360 --> 00:15:03,360 Speaker 1: fuck up. And so I start talking with one of 294 00:15:03,400 --> 00:15:05,840 Speaker 1: the co founders of the company, who is an Indian 295 00:15:05,880 --> 00:15:09,600 Speaker 1: guy in his forties, something around that, he had, like, 296 00:15:09,760 --> 00:15:12,120 Speaker 1: gray hair, he'd clearly, he said he'd spent twenty years 297 00:15:12,160 --> 00:15:15,760 Speaker 1: in robotics. Very nice guy, you know. I brought up 298 00:15:15,760 --> 00:15:17,880 Speaker 1: that I thought the design was interesting, and he was 299 00:15:18,000 --> 00:15:20,120 Speaker 1: very much specifying, like, here's the things we didn't do 300 00:15:20,160 --> 00:15:24,840 Speaker 1: because they were too difficult, too inefficient. You know, this 301 00:15:24,880 --> 00:15:26,680 Speaker 1: is what we're thinking of. This is a machine that 302 00:15:26,720 --> 00:15:29,160 Speaker 1: can fold laundry. This is a machine that can do dishes. 303 00:15:29,200 --> 00:15:31,440 Speaker 1: This is a machine. And he was very much specifying, 304 00:15:32,160 --> 00:15:34,720 Speaker 1: and the way he phrases it is, like, these are undesirable tasks 305 00:15:34,760 --> 00:15:37,360 Speaker 1: people don't want to do, and this is a robot 306 00:15:37,400 --> 00:15:40,760 Speaker 1: that can handle those for, like, small businesses or for households. 307 00:15:40,760 --> 00:15:43,240 Speaker 1: And we do see this as eventually, like, a, you know, 308 00:15:43,800 --> 00:15:46,200 Speaker 1: something like this we want to have in households. But 309 00:15:46,240 --> 00:15:49,920 Speaker 1: he was more focused on small businesses, and he was 310 00:15:49,920 --> 00:15:51,960 Speaker 1: again very focused on: this is a thing that will 311 00:15:52,000 --> 00:15:58,240 Speaker 1: do undesirable tasks for people, right? And as I started 312 00:15:58,240 --> 00:16:01,040 Speaker 1: asking more questions, at a certain point I got foisted 313 00:16:01,080 --> 00:16:04,320 Speaker 1: off to the co founder of the company. 314 00:16:04,480 --> 00:16:05,720 Speaker 3: Is it the co founder, or is it just, like, 315 00:16:06,000 --> 00:16:06,960 Speaker 3: another one of their reps? 316 00:16:07,000 --> 00:16:09,280 Speaker 1: You know, I'm assuming co founder, because I think it's 317 00:16:09,360 --> 00:16:11,440 Speaker 1: just a couple of guys, but maybe I'm wrong about that. Sorry, 318 00:16:11,520 --> 00:16:13,800 Speaker 1: I got foisted over to the other of the two guys. 319 00:16:13,840 --> 00:16:16,960 Speaker 1: There were two guys there, right? I'm not sure, because 320 00:16:17,000 --> 00:16:19,440 Speaker 1: they don't have listed anywhere what their role in 321 00:16:19,480 --> 00:16:22,560 Speaker 1: the company is. I got a co founder's vibe from them. 322 00:16:22,800 --> 00:16:25,360 Speaker 1: That's how it seemed to be to me, at least 323 00:16:25,360 --> 00:16:27,160 Speaker 1: in terms of, like, the way these two were talking. 324 00:16:27,600 --> 00:16:31,320 Speaker 1: But I don't know the scope of the Open Droids company.
325 00:16:31,360 --> 00:16:34,160 Speaker 1: Maybe there's a lot more there, but these were the 326 00:16:34,200 --> 00:16:35,920 Speaker 1: two guys who were there talking to us. So one 327 00:16:35,960 --> 00:16:39,120 Speaker 1: of them is this very wonky engineer who's been at 328 00:16:39,120 --> 00:16:41,640 Speaker 1: this a long time and was really focused on the 329 00:16:41,720 --> 00:16:43,920 Speaker 1: nuts and bolts details and wanted to build a robot 330 00:16:43,920 --> 00:16:46,720 Speaker 1: that could handle unpleasant tasks for human beings, right? The 331 00:16:46,720 --> 00:16:48,560 Speaker 1: same thing we've all been wanting to see. So at 332 00:16:48,560 --> 00:16:51,040 Speaker 1: this point I'm like, this could work. Maybe this is 333 00:16:51,080 --> 00:16:57,440 Speaker 1: a viable product, right? The second guy, Jack J. Jessanowski. 334 00:16:58,000 --> 00:17:02,080 Speaker 1: So he is wearing what Garrison described as a Jordan 335 00:17:02,200 --> 00:17:08,640 Speaker 1: Peterson suit, because it is half purple. It is a 336 00:17:08,720 --> 00:17:12,120 Speaker 1: suit split down the motherfucking middle, 337 00:17:12,200 --> 00:17:16,920 Speaker 3: with, like, new age hippie, like, necklaces. Five necklaces, 338 00:17:17,680 --> 00:17:21,800 Speaker 3: five necklaces. He had pants with, like, embroidered flowers 339 00:17:21,840 --> 00:17:22,480 Speaker 3: on them, 340 00:17:22,320 --> 00:17:24,560 Speaker 1: and, like, a nose bridge, like, it looked like one 341 00:17:24,560 --> 00:17:25,600 Speaker 1: of those things you put in your nose. 342 00:17:25,840 --> 00:17:29,040 Speaker 3: That was one of the other things at Showstoppers. There 343 00:17:29,040 --> 00:17:30,159 Speaker 3: was a company that was doing that. 344 00:17:30,280 --> 00:17:33,280 Speaker 1: So, yeah, he had wannabe Steve Jobs vibes, 345 00:17:33,320 --> 00:17:37,280 Speaker 1: from his half unbuttoned shirt and, like, many, many spiritual 346 00:17:37,320 --> 00:17:42,800 Speaker 1: medallions, to his, like, Jordan Peterson suit, and very much 347 00:17:42,960 --> 00:17:46,679 Speaker 1: just that, like, I am the charismatic founder, and what 348 00:17:46,800 --> 00:17:48,679 Speaker 1: I bring to the table: my partner knows how to 349 00:17:48,680 --> 00:17:55,080 Speaker 1: build robots, I'm charismatic, I'm Jack J. Jessanowski. And Jack 350 00:17:55,119 --> 00:17:58,879 Speaker 1: and I started talking, and boy howdy, we had us 351 00:17:58,920 --> 00:18:01,840 Speaker 1: a conversation, and I think we're just gonna play that. 352 00:18:02,200 --> 00:18:03,639 Speaker 1: What do I need to do to set this up? 353 00:18:03,760 --> 00:18:06,280 Speaker 3: No, I think you've set it up. We walk up 354 00:18:06,320 --> 00:18:09,560 Speaker 3: to Jack, I start, I start recording, and we start 355 00:18:09,600 --> 00:18:12,239 Speaker 3: talking about the robot, and then things spin in some 356 00:18:12,560 --> 00:18:13,919 Speaker 3: pretty interesting directions. 357 00:18:14,000 --> 00:18:20,119 Speaker 1: Yeah. All right, so what is this thing useful for? 358 00:18:21,800 --> 00:18:25,800 Speaker 6: Well, generally capable. Just like a human can reach to 359 00:18:25,840 --> 00:18:28,560 Speaker 6: the floor and reach up high to a cupboard, go 360 00:18:28,680 --> 00:18:29,160 Speaker 6: up and down, 361 00:18:29,200 --> 00:18:30,199 Speaker 5: that's what we made this for.
362 00:18:30,320 --> 00:18:33,280 Speaker 6: Obviously in a little bit of a different fashion, because 363 00:18:33,359 --> 00:18:36,000 Speaker 6: most surfaces are level. 364 00:18:36,240 --> 00:18:37,600 Speaker 5: We don't need to reinvent the wheel. 365 00:18:38,520 --> 00:18:43,280 Speaker 6: And the biggest market that we're going after is households. 366 00:18:43,320 --> 00:18:46,879 Speaker 6: Domestic: dishes, laundry, make the bed, clean up around the house, 367 00:18:47,320 --> 00:18:50,639 Speaker 6: eventually cooking. That's more fine tuned, you know. Dishes and 368 00:18:50,720 --> 00:18:53,960 Speaker 6: laundry is really that first task that is 369 00:18:53,920 --> 00:18:55,240 Speaker 5: gonna be fully autonomous. 370 00:18:55,920 --> 00:18:59,879 Speaker 6: Obviously, from a folding standpoint and cooking standpoint, you can 371 00:19:00,040 --> 00:19:04,919 Speaker 6: do teleoperation today, so you can use cheaper labor internationally 372 00:19:05,440 --> 00:19:09,240 Speaker 6: through a robot, but full autonomy is coming very quickly, 373 00:19:09,320 --> 00:19:10,960 Speaker 6: like Jensen talked about recently. 374 00:19:11,920 --> 00:19:15,119 Speaker 7: So I see there's a lot of folks in the 375 00:19:15,200 --> 00:19:18,720 Speaker 7: robot space that are trying robots based on the human form, right? 376 00:19:18,960 --> 00:19:20,720 Speaker 7: You guys have not gone that route. 377 00:19:20,920 --> 00:19:23,520 Speaker 1: Talk to me about that droid form. 378 00:19:23,640 --> 00:19:28,360 Speaker 6: Yes. Well, as we know, robots didn't evolve from monkeys, 379 00:19:28,640 --> 00:19:31,840 Speaker 6: and so we have an ability to reimagine them. All 380 00:19:31,880 --> 00:19:34,920 Speaker 6: of the existing hardware we use in the world has 381 00:19:34,960 --> 00:19:38,560 Speaker 6: wheels for a reason. It just works better. It's easier, 382 00:19:38,600 --> 00:19:41,640 Speaker 6: there's less friction. That means there's less maintenance. That means 383 00:19:41,720 --> 00:19:46,040 Speaker 6: there's less energy output. It's efficiency. It's also easier for 384 00:19:46,160 --> 00:19:49,360 Speaker 6: us to manufacture that stuff at scale. So, I think, 385 00:19:49,400 --> 00:19:55,960 Speaker 6: long term, do robots all have legs? 386 00:19:56,080 --> 00:19:57,359 Speaker 5: Yeah, more or less. 387 00:19:57,400 --> 00:19:59,600 Speaker 6: The home robot does turn into the leg robot, because 388 00:19:59,640 --> 00:20:01,520 Speaker 6: then it can go with you in the car, everything. 389 00:20:01,880 --> 00:20:03,160 Speaker 5: But I think the early 390 00:20:02,880 --> 00:20:08,800 Speaker 6: stages, the wheels, because of their cheapness, because of their reliability, 391 00:20:09,359 --> 00:20:12,840 Speaker 6: I think that will be what wins early stage. That's 392 00:20:12,880 --> 00:20:13,879 Speaker 6: where we started here. 393 00:20:14,440 --> 00:20:16,240 Speaker 7: Because the robot can go in the car with you, 394 00:20:16,640 --> 00:20:18,560 Speaker 7: what do you see people wanting to have a robot 395 00:20:18,560 --> 00:20:19,440 Speaker 7: in the car with them for? 396 00:20:19,920 --> 00:20:22,720 Speaker 5: I think it will just become, basically, the same way, 397 00:20:22,760 --> 00:20:25,640 Speaker 6: if you have enough money, a lot of people afford, 398 00:20:25,800 --> 00:20:29,240 Speaker 6: like, an assistant to come with them places. 399 00:20:29,800 --> 00:20:35,919 Speaker 7: Because that seems like a niche market compared to household.
400 00:20:37,400 --> 00:20:39,760 Speaker 3: I think it's the barrier. 401 00:20:39,840 --> 00:20:42,960 Speaker 6: I think, is because of the, the cost, and then 402 00:20:43,119 --> 00:20:44,800 Speaker 6: the humanness. 403 00:20:44,119 --> 00:20:46,240 Speaker 5: Like, then you have to care for another human, 404 00:20:46,480 --> 00:20:51,159 Speaker 6: whereas in this case it's kind of all positive sum. 405 00:20:51,200 --> 00:20:56,000 Speaker 6: And, yeah, I guess it's wrong to try to say 406 00:20:57,119 --> 00:20:58,840 Speaker 6: majority of people, 407 00:20:59,320 --> 00:21:03,560 Speaker 5: but anyone who's, you know, in media, you 408 00:21:03,520 --> 00:21:06,760 Speaker 6: know, the videographer will be something you use a robot 409 00:21:06,880 --> 00:21:09,280 Speaker 6: for, to follow you around and take media and film 410 00:21:09,320 --> 00:21:09,560 Speaker 6: for you. 411 00:21:10,119 --> 00:21:10,800 Speaker 5: They won't get 412 00:21:10,720 --> 00:21:13,240 Speaker 6: tired, and say, go grab me a drink, or, you know, 413 00:21:13,800 --> 00:21:15,240 Speaker 6: go figure that thing out. 414 00:21:16,240 --> 00:21:19,560 Speaker 7: But it also can't decide, oh, that's actually not a 415 00:21:19,600 --> 00:21:21,160 Speaker 7: good location to film from, 416 00:21:21,160 --> 00:21:22,280 Speaker 1: it's not going to look as good, 417 00:21:22,560 --> 00:21:24,359 Speaker 7: we need to get over here, or we need another 418 00:21:24,400 --> 00:21:26,359 Speaker 7: camera on this side here, we need to get, like, 419 00:21:26,880 --> 00:21:29,040 Speaker 7: different angles, because we're going to want to edit this 420 00:21:29,119 --> 00:21:32,000 Speaker 7: together into a thing. And as a videographer, I'm not 421 00:21:32,119 --> 00:21:35,760 Speaker 7: just a machine. I'm a part of a collaborative creative enterprise. 422 00:21:37,080 --> 00:21:42,040 Speaker 6: I think we're starting to see just how artistic these 423 00:21:42,640 --> 00:21:43,520 Speaker 6: AIs can be. 424 00:21:44,160 --> 00:21:46,320 Speaker 1: What's the best example of that you've seen? 425 00:21:47,760 --> 00:21:49,960 Speaker 6: Well, I think the most used thing is this, the 426 00:21:50,040 --> 00:21:53,760 Speaker 6: Gen AI art. And then you have some of the new 427 00:21:53,840 --> 00:21:58,600 Speaker 6: video models, which are pretty cool, and they're using certain sorts 428 00:21:58,640 --> 00:22:05,000 Speaker 6: of zoom in shots, everything. I think they'll make just 429 00:22:05,320 --> 00:22:08,240 Speaker 6: as good of movies as humans. Oh, I think the 430 00:22:08,280 --> 00:22:12,640 Speaker 6: best reference in order to actually say that that's possible 431 00:22:12,760 --> 00:22:14,879 Speaker 6: is music. I don't know if you've played with the 432 00:22:14,920 --> 00:22:17,960 Speaker 6: most recent AI music. There's SongGPT dot com. 433 00:22:18,080 --> 00:22:20,840 Speaker 7: I've heard some things people call music that are produced 434 00:22:20,880 --> 00:22:21,119 Speaker 7: by that. 435 00:22:21,280 --> 00:22:22,760 Speaker 1: Yeah. 436 00:22:22,600 --> 00:22:23,960 Speaker 5: We can make one live right now. 437 00:22:24,040 --> 00:22:27,720 Speaker 6: I don't know if you've heard, like, the latest models. 438 00:22:28,040 --> 00:22:30,360 Speaker 6: Pick me a genre. 439 00:22:32,280 --> 00:22:37,760 Speaker 1: Irish spiritualist ska. If you try ska too, you
440 00:22:37,720 --> 00:22:42,680 Speaker 6: love ska. It's, like, definitely, probably, niche stuff is where 441 00:22:42,680 --> 00:22:48,080 Speaker 6: it's gonna have a harder time. But ska, S K A. 442 00:22:49,000 --> 00:22:51,840 Speaker 5: I wonder how much ska data there is out there. 443 00:22:52,400 --> 00:22:54,160 Speaker 1: There's a lot of ska music out there. 444 00:22:54,200 --> 00:22:55,480 Speaker 5: What should we make it about? Should we make it 445 00:22:55,480 --> 00:22:56,359 Speaker 5: about iHeartRadio? 446 00:22:57,040 --> 00:22:57,320 Speaker 1: Sure. 447 00:22:57,640 --> 00:23:05,800 Speaker 8: iHeartRadio and communication and Clear Channel Communications. All right, 448 00:23:05,840 --> 00:23:08,800 Speaker 8: here's a ska song. We're like, oh, it has to 449 00:23:08,840 --> 00:23:10,080 Speaker 8: load for, like, thirty seconds. 450 00:23:10,080 --> 00:23:13,480 Speaker 6: It feels weird, like, I'm upset that I have to 451 00:23:13,520 --> 00:23:15,960 Speaker 6: wait that long for something to load online. 452 00:23:16,880 --> 00:23:18,800 Speaker 9: It feels, to you, huh? Yeah, I guess they're 453 00:23:18,800 --> 00:23:21,320 Speaker 9: playing with it a lot. But it's funny to think 454 00:23:21,320 --> 00:23:23,359 Speaker 9: about how much time and effort it does take to, 455 00:23:23,440 --> 00:23:27,359 Speaker 9: like, produce a song typically. I am twenty seven. 456 00:23:27,920 --> 00:23:31,800 Speaker 1: Yeah, that's interesting. Wouldn't have guessed that. 457 00:23:33,680 --> 00:23:33,760 Speaker 6: What? 458 00:23:34,000 --> 00:23:35,199 Speaker 4: Uh, is it? 459 00:23:35,320 --> 00:23:37,679 Speaker 7: One thing that's really compelling to me is, your partner, 460 00:23:38,000 --> 00:23:40,080 Speaker 7: when I came in here, was very, very much talking 461 00:23:40,080 --> 00:23:42,800 Speaker 7: about the utility of this in terms of replacing human 462 00:23:42,840 --> 00:23:47,800 Speaker 7: beings in tasks that are generally unpleasant: laundry, doing the dishes, 463 00:23:47,920 --> 00:23:51,160 Speaker 7: cleaning up trash. You seem a lot more bullish on 464 00:23:52,040 --> 00:23:55,439 Speaker 7: robots replacing human beings in what are generally considered to 465 00:23:55,480 --> 00:23:57,879 Speaker 7: be enterprises people want to do 466 00:23:57,960 --> 00:23:58,640 Speaker 1: with their time. 467 00:23:59,359 --> 00:24:02,040 Speaker 7: Is that, like, a discrepancy that, that you guys have 468 00:24:02,200 --> 00:24:04,359 Speaker 7: kind of talked about, or do you think it's something 469 00:24:04,600 --> 00:24:06,800 Speaker 7: you guys are more on the same page with? 470 00:24:06,800 --> 00:24:10,479 Speaker 6: From a business standpoint, we're one hundred percent going after 471 00:24:11,440 --> 00:24:16,840 Speaker 6: the dishes, laundry, uh, nursing, practice of 472 00:24:16,880 --> 00:24:21,640 Speaker 5: just doing vitals, which is the very repetitive task. That's 473 00:24:21,680 --> 00:24:25,560 Speaker 5: the push. I was starting to just talk into 474 00:24:25,240 --> 00:24:29,760 Speaker 6: the aspect of the legged robots, and kind of imagining 475 00:24:29,800 --> 00:24:34,960 Speaker 6: why a legged version would have better utility, or be 476 00:24:35,040 --> 00:24:38,520 Speaker 6: something someone wants to purchase, rather than the wheeled 477 00:24:38,600 --> 00:24:41,640 Speaker 6: robot. And, you know, stairs is definitely a 478 00:24:41,520 --> 00:24:42,280 Speaker 5: big one of those.
479 00:24:42,320 --> 00:24:45,960 Speaker 6: The wheel types we're working on right now would 480 00:24:45,960 --> 00:24:49,240 Speaker 6: have the ability to climb, like, single stairs, obviously, easiest, and 481 00:24:49,280 --> 00:24:51,639 Speaker 6: that's what most people have in their home if they 482 00:24:51,640 --> 00:24:52,400 Speaker 6: do have stairs. 483 00:24:53,400 --> 00:24:55,280 Speaker 1: Oh, are we gonna listen to some robot ska? 484 00:24:55,280 --> 00:24:58,200 Speaker 5: Ska, my iHeart listeners. 485 00:25:09,720 --> 00:25:28,000 Speaker 4: [garbled AI-generated ska song plays] 486 00:25:29,400 --> 00:25:31,120 Speaker 1: It's a pretty basic melody. 487 00:25:31,640 --> 00:25:33,720 Speaker 7: I mean, there's horns in it, but I feel like 488 00:25:33,800 --> 00:25:36,520 Speaker 7: it kind of, I think what it's trying to do is pop, 489 00:25:36,600 --> 00:25:38,639 Speaker 7: and it's just thrown some 490 00:25:38,760 --> 00:25:42,080 Speaker 7: horns in on it. This is a little closer to ska, 491 00:25:43,800 --> 00:26:05,639 Speaker 7: although it's still, yeah, it's not really singing, but I 492 00:26:05,720 --> 00:26:06,960 Speaker 7: guess that's a matter of pace. 493 00:26:10,800 --> 00:26:11,600 Speaker 1: What do you listen to? 494 00:26:11,800 --> 00:26:13,040 Speaker 5: This is the worst it's going to be. 495 00:26:13,960 --> 00:26:14,800 Speaker 3: I hear that a lot. 496 00:26:15,320 --> 00:26:19,320 Speaker 7: It's interesting, because GPT four took fifty times as much 497 00:26:19,400 --> 00:26:22,239 Speaker 7: power as GPT three to train, and there's a lot 498 00:26:22,280 --> 00:26:25,800 Speaker 7: of mixed reactions on that. And we're entering into a 499 00:26:25,920 --> 00:26:30,080 Speaker 7: period where we're very likely looking at a recession in venture 500 00:26:30,160 --> 00:26:31,040 Speaker 7: capital funding. 501 00:26:31,760 --> 00:26:33,399 Speaker 1: There's a chance it's not going to be what it 502 00:26:33,520 --> 00:26:34,040 Speaker 1: has been. 503 00:26:34,720 --> 00:26:37,080 Speaker 7: Does that concern you at all, that, like, this vaunted 504 00:26:37,240 --> 00:26:40,040 Speaker 7: next level for all of this stuff, the energy cost, 505 00:26:40,720 --> 00:26:43,800 Speaker 7: the investment cost, is just not going to be borne 506 00:26:43,920 --> 00:26:46,919 Speaker 7: by a market that is not going to be as 507 00:26:47,119 --> 00:26:49,320 Speaker 7: strong tomorrow as it was today, 508 00:26:49,680 --> 00:26:50,760 Speaker 1: at least in the immediate term? 509 00:26:51,720 --> 00:26:56,720 Speaker 6: I think even if we created no more energy as 510 00:26:56,760 --> 00:26:59,840 Speaker 6: a human species today, the amount of advancements we 511 00:27:00,119 --> 00:27:06,160 Speaker 6: create would, from an architectural standpoint, continue to advance. 512 00:27:06,480 --> 00:27:09,240 Speaker 5: So you have other models, 513 00:27:11,400 --> 00:27:14,159 Speaker 6: like, I think, Llama three point three, which has matched 514 00:27:14,240 --> 00:27:19,520 Speaker 6: four oh's capabilities and is, I forget how many parameters, 515 00:27:19,600 --> 00:27:22,639 Speaker 6: but, like, super, like, much, much, much smaller, and was 516 00:27:22,720 --> 00:27:24,960 Speaker 6: much cheaper to train. And, like, we're continuing to see, 517 00:27:25,000 --> 00:27:30,680 Speaker 6: like, smaller models that are just as effective, with 518 00:27:30,800 --> 00:27:33,719 Speaker 6: much cheaper training runs.
I think DeepSeek was one 519 00:27:33,760 --> 00:27:34,560 Speaker 6: of the newest ones. 520 00:27:35,080 --> 00:27:37,199 Speaker 7: What I'm, what I'm concerned about is, I'm looking at 521 00:27:37,280 --> 00:27:40,360 Speaker 7: the P and L, right? I'm looking at OpenAI's P and L. 522 00:27:40,680 --> 00:27:43,080 Speaker 7: I'm looking at the fact that they lost five or 523 00:27:43,119 --> 00:27:46,840 Speaker 7: six billion dollars last year, and there's a very good chance 524 00:27:46,880 --> 00:27:47,320 Speaker 7: it's going 525 00:27:47,320 --> 00:27:49,080 Speaker 1: to be somewhere in the neighborhood of double that this year. 526 00:27:49,600 --> 00:27:52,440 Speaker 7: And it's not that there's nothing impressive there, it's 527 00:27:52,440 --> 00:27:54,320 Speaker 7: not that I don't see, like, oh, you can generate 528 00:27:54,560 --> 00:27:58,560 Speaker 7: a song that's got, like, guitar and, and, and trumpets 529 00:27:58,640 --> 00:28:00,920 Speaker 7: and vocals and stuff in, you know, a minute or so. 530 00:28:01,240 --> 00:28:02,000 Speaker 1: It's not that that's not 531 00:28:02,080 --> 00:28:07,000 Speaker 7: impressive, but, like, a parlor trick isn't a trillion dollar business, 532 00:28:07,160 --> 00:28:09,200 Speaker 7: and that's the kind of investment they're looking at. And 533 00:28:09,280 --> 00:28:12,120 Speaker 7: I do wonder, like, is it not much more reasonable 534 00:28:12,200 --> 00:28:13,800 Speaker 7: to focus on folding laundry? 535 00:28:14,400 --> 00:28:19,560 Speaker 6: Well, obviously, I personally am in the, the boat of 536 00:28:19,640 --> 00:28:24,880 Speaker 6: focusing on allowing this intelligence to flourish and doing these 537 00:28:24,960 --> 00:28:28,480 Speaker 6: laborious tasks and getting them in the households. I do 538 00:28:28,640 --> 00:28:32,800 Speaker 6: think, from OpenAI's standpoint, and the reason why VCs 539 00:28:32,880 --> 00:28:36,760 Speaker 6: and private investors will value them so highly, is what's 540 00:28:36,880 --> 00:28:39,080 Speaker 6: next is white collar work. 541 00:28:39,600 --> 00:28:44,960 Speaker 5: A lot of the jobs online, that's what they do. 542 00:28:45,120 --> 00:28:49,360 Speaker 6: Have an internal model which is able to control the computer, 543 00:28:50,080 --> 00:28:51,880 Speaker 6: you know, the same way you would ask an executive 544 00:28:51,880 --> 00:28:54,240 Speaker 6: assistant to do certain things online. 545 00:28:55,480 --> 00:28:56,320 Speaker 5: Now it's just... 546 00:28:56,800 --> 00:28:59,920 Speaker 7: Everybody's handling all of their emails now through AI, 547 00:29:01,760 --> 00:29:03,120 Speaker 7: which is, you know, we'll see how 548 00:29:03,080 --> 00:29:04,200 Speaker 1: well that works in the long term. 549 00:29:04,240 --> 00:29:07,520 Speaker 7: There's been some interesting polling on, like, the degree to 550 00:29:07,600 --> 00:29:13,720 Speaker 7: which customers and investors feel trust when somebody's responding to 551 00:29:13,800 --> 00:29:16,720 Speaker 7: them with an AI. But what interests me more 552 00:29:16,760 --> 00:29:19,760 Speaker 7: here is the dichotomy between what I see here, which is 553 00:29:20,080 --> 00:29:22,320 Speaker 7: a very pragmatic choice, which is, we're not going to 554 00:29:22,400 --> 00:29:26,000 Speaker 7: try and remake a human being formed robot and 555 00:29:26,040 --> 00:29:28,560 Speaker 1: deal with, like, knees and hips and all of that stuff. 556 00:29:28,800 --> 00:29:29,400 Speaker 1: We don't need that.
557 00:29:29,520 --> 00:29:32,200 Speaker 7: We can have it turn up and down on this 558 00:29:32,360 --> 00:29:36,640 Speaker 7: platform and reach things the same way. Melded to what 559 00:29:36,760 --> 00:29:38,880 Speaker 7: I consider to be kind of a little more pie 560 00:29:38,960 --> 00:29:42,240 Speaker 7: in the sky: we're doing this as eventually something that 561 00:29:42,320 --> 00:29:46,640 Speaker 7: can take creative roles and think independently and make things. 562 00:29:47,120 --> 00:29:49,600 Speaker 7: Which is, it's interesting to me to see that in 563 00:29:49,680 --> 00:29:51,760 Speaker 7: a company's DNA, what you guys are, eight months 564 00:29:51,800 --> 00:29:54,920 Speaker 7: out right now, yep. Is that what you're more interested in? 565 00:29:55,520 --> 00:29:58,480 Speaker 5: I'd say I tailor my pitch to the person I'm 566 00:29:58,520 --> 00:29:58,920 Speaker 5: talking to. 567 00:29:59,160 --> 00:29:59,400 Speaker 1: Uh huh. 568 00:30:00,960 --> 00:30:04,560 Speaker 6: So some people definitely enjoy thinking about more of the 569 00:30:04,640 --> 00:30:07,880 Speaker 6: sci fi futures that are coming, for example, the droids 570 00:30:07,920 --> 00:30:12,920 Speaker 6: building droids moment. It's when, you know, you are decreasing 571 00:30:13,000 --> 00:30:17,680 Speaker 6: your own manufacturing costs by using your own hardware to 572 00:30:17,760 --> 00:30:20,880 Speaker 6: build more of that hardware, and parts are just being 573 00:30:20,920 --> 00:30:22,080 Speaker 6: shipped into the factory. 574 00:30:22,560 --> 00:30:22,960 Speaker 5: Obviously, 575 00:30:23,200 --> 00:30:26,800 Speaker 6: I think the first fully automated phone factory just came 576 00:30:26,840 --> 00:30:29,680 Speaker 6: out in China recently, which is, like, some cool press 577 00:30:29,720 --> 00:30:33,560 Speaker 6: and news, but the phone is separate from the actual 578 00:30:33,640 --> 00:30:34,880 Speaker 6: manufacturing process. 579 00:30:35,120 --> 00:30:38,960 Speaker 5: Yeah, so there's that, like, interesting component. 580 00:30:39,600 --> 00:30:45,000 Speaker 6: The exciting part of the idea of how do we 581 00:30:45,080 --> 00:30:49,080 Speaker 6: reach true abundance as a species, of material and resources, 582 00:30:49,280 --> 00:30:55,840 Speaker 6: is, well, because GDP is a calculation of capita times productivity, 583 00:30:56,320 --> 00:31:01,240 Speaker 6: a robot really represents capita, one unit of creation. 584 00:31:02,280 --> 00:31:05,880 Speaker 6: And I'd say that's where the sci fi thinking comes 585 00:31:05,960 --> 00:31:09,920 Speaker 6: into play, and it's, it's not not worth going there 586 00:31:11,000 --> 00:31:14,440 Speaker 6: when just dreaming about the future of robotics and talking about 587 00:31:14,440 --> 00:31:19,160 Speaker 6: it and having an interesting, engaging conversation. But definitely, when 588 00:31:19,200 --> 00:31:21,440 Speaker 6: it comes to what are we doing from an engineering 589 00:31:21,520 --> 00:31:23,640 Speaker 6: standpoint on the day to day, and how are we 590 00:31:23,720 --> 00:31:27,520 Speaker 6: trying to approach the market, those conversations are not 591 00:31:27,600 --> 00:31:28,040 Speaker 6: being had. 592 00:31:29,200 --> 00:31:31,320 Speaker 1: Well, I appreciate your time, and you gave me a lot. 593 00:31:31,320 --> 00:31:33,040 Speaker 1: I'm gonna let you get to the other people. Thank you, 594 00:31:33,160 --> 00:31:34,200 Speaker 1: thank you so much. Nice to meet you.
595 00:31:34,320 --> 00:31:34,480 Speaker 6: Jack. 596 00:31:35,720 --> 00:31:37,880 Speaker 3: Oh wow. That's super interesting. 597 00:31:37,920 --> 00:31:40,440 Speaker 1: I hope you all liked Jack Jay as much as 598 00:31:40,520 --> 00:31:41,320 Speaker 1: I didn't. 599 00:31:41,200 --> 00:31:43,840 Speaker 3: Getting to twenty seven years old and not knowing what 600 00:31:44,000 --> 00:31:46,240 Speaker 1: ska is. He's that old? I thought he was much younger, 601 00:31:46,680 --> 00:31:48,160 Speaker 1: like, you thought he was like twenty two. 602 00:31:48,360 --> 00:31:50,400 Speaker 3: Yes. But the fact that he, like, he didn't know 603 00:31:50,440 --> 00:31:53,479 Speaker 3: what ska was as a genre, he was unaware 604 00:31:53,520 --> 00:31:53,600 Speaker 3: of it. 605 00:31:53,680 --> 00:31:55,200 Speaker 1: I don't think he listens to music. 606 00:31:55,040 --> 00:31:57,200 Speaker 3: Well, he listens to AI generated... 607 00:31:56,880 --> 00:31:58,840 Speaker 1: To AI generated music. It's just as good. 608 00:31:59,040 --> 00:32:01,840 Speaker 3: He has the most, he's the most "I listen to 609 00:32:01,960 --> 00:32:05,880 Speaker 3: AI generated music" vibes of anyone I've ever seen before, 610 00:32:06,240 --> 00:32:06,760 Speaker 3: just very 611 00:32:06,680 --> 00:32:09,840 Speaker 1: clearly does not have a soul. No, like, like, nothing, 612 00:32:10,080 --> 00:32:12,080 Speaker 1: nothing would leave the universe 613 00:32:11,800 --> 00:32:15,160 Speaker 3: if he did, right? Like, it's so opposite from the 614 00:32:15,200 --> 00:32:18,000 Speaker 3: first guy you talked to, who was so, like, about, no, 615 00:32:18,960 --> 00:32:22,160 Speaker 3: I want to have it do actual tasks that people don't enjoy. Yeah. 616 00:32:22,800 --> 00:32:26,800 Speaker 3: I love cinematography. I love, I love filmmaking. I don't, 617 00:32:27,000 --> 00:32:30,200 Speaker 3: first of all, I don't think a robot can, can 618 00:32:30,360 --> 00:32:30,840 Speaker 3: replace this. 619 00:32:31,200 --> 00:32:31,240 Speaker 6: No. 620 00:32:31,520 --> 00:32:35,320 Speaker 1: I watched five different AI generated movies yesterday, and they 621 00:32:35,360 --> 00:32:36,440 Speaker 1: all looked like shit. Even, like, 622 00:32:36,400 --> 00:32:38,720 Speaker 3: a robot handling a physical camera, to make, like, to 623 00:32:38,800 --> 00:32:41,520 Speaker 3: make, like, choices on, like, shot framing and composition, and... 624 00:32:41,560 --> 00:32:43,440 Speaker 1: Like, it's one thing to be like, we want, we 625 00:32:43,520 --> 00:32:45,680 Speaker 1: have a race car going, and so we've got this robot 626 00:32:45,720 --> 00:32:47,360 Speaker 1: on a track so we can go seventy miles an 627 00:32:47,400 --> 00:32:49,280 Speaker 1: hour, and we're just kind of running on a street, 628 00:32:49,680 --> 00:32:51,760 Speaker 1: follow it, because a human being can't move that fast. Sure. 629 00:32:52,160 --> 00:32:53,959 Speaker 1: One thing we've left out of this ep so far: 630 00:32:54,120 --> 00:32:57,120 Speaker 1: so this, this machine that I described earlier, this robot 631 00:32:57,160 --> 00:32:59,040 Speaker 1: that goes up and down this rolling base, has a 632 00:32:59,480 --> 00:33:02,960 Speaker 1: floppy Donald Trump mask over, over its head, 633 00:33:02,880 --> 00:33:04,320 Speaker 3: which is what first attracted us to this. 634 00:33:04,600 --> 00:33:06,440 Speaker 1: Yeah, that's why we showed up there in the first place.
635 00:33:07,000 --> 00:33:10,800 Speaker 3: A robot moving its arms around wearing a Donald Trump mask. 636 00:33:10,880 --> 00:33:14,160 Speaker 3: And as Robert was interviewing this guy, the robot was, 637 00:33:14,200 --> 00:33:16,600 Speaker 3: like, moving around and, like, trying to simulate its washing 638 00:33:16,760 --> 00:33:20,400 Speaker 3: dishes capability, and it knocked over the same water bottle 639 00:33:20,480 --> 00:33:23,880 Speaker 3: about five times. It couldn't, it couldn't pick it up consistently, 640 00:33:24,280 --> 00:33:26,680 Speaker 3: so I will not trust it with my fine china, 641 00:33:26,760 --> 00:33:27,240 Speaker 3: I'll say that. 642 00:33:27,400 --> 00:33:28,840 Speaker 1: As soon as I got up there, I asked, like, 643 00:33:28,880 --> 00:33:30,680 Speaker 1: if I took my jacket off now, could it fold it? 644 00:33:30,760 --> 00:33:32,880 Speaker 1: And he's like, well, we'd have to reprogram it. And 645 00:33:32,960 --> 00:33:34,880 Speaker 1: when I talked to the guy, I 646 00:33:35,120 --> 00:33:37,040 Speaker 1: was like... because he was like, yeah, we really see 647 00:33:37,080 --> 00:33:41,000 Speaker 1: this as being, you know, potentially good for elder care. Sure. 648 00:33:41,160 --> 00:33:42,960 Speaker 1: And, you know, we had just seen the product we 649 00:33:43,000 --> 00:33:45,400 Speaker 1: talked about in the last episode, which, for all of 650 00:33:45,480 --> 00:33:47,520 Speaker 1: its... I don't know that I think it'll work, but 651 00:33:48,000 --> 00:33:50,240 Speaker 1: a lot of thought and care went into it. I 652 00:33:50,400 --> 00:33:52,080 Speaker 1: was like, okay, so, like, what work have you done 653 00:33:52,160 --> 00:33:54,760 Speaker 1: to build a machine that can, like, communicate and be 654 00:33:54,840 --> 00:33:57,640 Speaker 1: helpful to, like, people who are dealing with health issues 655 00:33:57,720 --> 00:33:59,880 Speaker 1: in their later years? And he's like, well, that's why 656 00:34:00,160 --> 00:34:02,720 Speaker 1: it's open, right? Someone else will... Yeah, it's open source. 657 00:34:02,760 --> 00:34:03,520 Speaker 1: Someone else can do it. 658 00:34:03,920 --> 00:34:04,080 Speaker 5: Part. 659 00:34:04,120 --> 00:34:05,760 Speaker 1: See, you guys are just, you guys are just saying 660 00:34:05,800 --> 00:34:09,400 Speaker 1: it can do everything because somebody could potentially code something 661 00:34:09,520 --> 00:34:09,759 Speaker 1: for it. 662 00:34:10,000 --> 00:34:12,120 Speaker 3: Yeah, cool, there always could be code. 663 00:34:12,200 --> 00:34:15,439 Speaker 1: Yeah, there could be code. I mean, again, the other guy, 664 00:34:16,320 --> 00:34:20,600 Speaker 1: the actual engineer, seemed very interested in the nuts and 665 00:34:20,719 --> 00:34:26,080 Speaker 1: bolts of making an affordable, reproducible machine that could handle 666 00:34:26,120 --> 00:34:30,600 Speaker 1: specific tasks, and Jack Jay had absolutely no interest in 667 00:34:30,680 --> 00:34:32,920 Speaker 1: the actual machine that they were making. He could not 668 00:34:32,960 --> 00:34:35,160 Speaker 1: have been more clear: this is just a stepping stone, 669 00:34:35,400 --> 00:34:37,560 Speaker 1: and he's kind of grossed out by it because it's 670 00:34:37,640 --> 00:34:40,160 Speaker 1: not replacing all human art with a machine that he owns.
671 00:34:40,280 --> 00:34:43,840 Speaker 3: He's a man completely fueled by Lex Fridman podcasts, and 672 00:34:43,960 --> 00:34:46,520 Speaker 3: he doesn't want to actually do any real work. He 673 00:34:46,640 --> 00:34:49,480 Speaker 3: just wants to talk about how AI is going to 674 00:34:49,800 --> 00:34:52,880 Speaker 3: take over everything and we have to welcome it in, 675 00:34:53,400 --> 00:34:55,040 Speaker 3: and, hey, listen to this, this is ska. 676 00:34:55,440 --> 00:34:58,520 Speaker 1: He wants to make money by owning something that does 677 00:34:58,640 --> 00:35:02,120 Speaker 1: not provide anything and also puts people out of work. Like, 678 00:35:02,800 --> 00:35:05,640 Speaker 1: at no point did he express a desire to do 679 00:35:06,000 --> 00:35:10,080 Speaker 1: anything other than replace something people were already doing with 680 00:35:10,239 --> 00:35:15,359 Speaker 1: something worse that tech guys could profit from. That's all 681 00:35:15,480 --> 00:35:17,800 Speaker 1: there is to this man. He's not a human. 682 00:35:18,040 --> 00:35:19,160 Speaker 3: It's so anti-human. 683 00:35:19,400 --> 00:35:23,000 Speaker 1: Yeah, I cannot overemphasize the degree to which there was 684 00:35:23,080 --> 00:35:24,759 Speaker 1: nothing behind this boy's eyes. 685 00:35:25,880 --> 00:35:30,040 Speaker 3: Well, do you know what, there's also nothing super intelligent behind... 686 00:35:30,360 --> 00:35:32,440 Speaker 1: That's not true. All of our ads are sponsored by 687 00:35:32,520 --> 00:35:36,080 Speaker 1: real people. Even if they're bad people, they're at least people. 688 00:35:36,600 --> 00:35:40,000 Speaker 1: They live and they love and they hate, and, you know, 689 00:35:40,120 --> 00:35:41,279 Speaker 1: maybe they have a promo code. 690 00:35:41,800 --> 00:35:57,560 Speaker 10: Let's see. All right. So after our lovely, our lovely robotic 691 00:35:57,360 --> 00:36:03,800 Speaker 1: Jack Jay Jessanowski adventure, oh god. Also, the ska was 692 00:36:03,880 --> 00:36:07,399 Speaker 1: shit. Not good, not good. It did, it just kept 693 00:36:07,480 --> 00:36:08,840 Speaker 1: saying the word ska. 694 00:36:09,040 --> 00:36:12,320 Speaker 3: It kept saying the words "ska music" and saying the 695 00:36:12,400 --> 00:36:17,520 Speaker 3: word "Robert." Robert, ska, while just doing random noises. After 696 00:36:17,520 --> 00:36:19,200 Speaker 3: we had our fill of that, we did walk around 697 00:36:19,239 --> 00:36:20,960 Speaker 3: the rest of Showstoppers. 698 00:36:21,360 --> 00:36:24,680 Speaker 1: He was so surprised that I wasn't, I wasn't impressed 699 00:36:24,920 --> 00:36:26,239 Speaker 1: by any of it. He was like, you must not have 700 00:36:26,400 --> 00:36:28,080 Speaker 1: heard the latest. Man, I've heard them. 701 00:36:28,800 --> 00:36:30,479 Speaker 3: It's not good. 702 00:36:30,680 --> 00:36:33,479 Speaker 1: It's like, it's like... I made this comparison a few times: 703 00:36:33,840 --> 00:36:36,839 Speaker 1: if somebody, like, walked in while I'm at a house 704 00:36:36,920 --> 00:36:40,000 Speaker 1: party and was like, hey, man, I taught my dog to 705 00:36:40,200 --> 00:36:44,120 Speaker 1: masturbate to pornography with its, with its paws, I would 706 00:36:44,160 --> 00:36:47,920 Speaker 1: be like, well, I mean, that's, like, I guess, impressive? 707 00:36:47,920 --> 00:36:49,960 Speaker 1: I didn't think a dog could do that.
Like, I 708 00:36:50,080 --> 00:36:54,799 Speaker 1: am kind of impressed, I guess, but I don't want 709 00:36:54,920 --> 00:36:58,799 Speaker 1: this. Like, this, this doesn't do anything for me. It's... 710 00:37:00,360 --> 00:37:01,319 Speaker 1: you figured this out? 711 00:37:01,480 --> 00:37:03,279 Speaker 3: What, what value does this have? 712 00:37:03,719 --> 00:37:06,040 Speaker 1: How does the dog know who Farrah Fawcett is? 713 00:37:06,120 --> 00:37:09,680 Speaker 1: I have questions, sure, but it doesn't give me anything. 714 00:37:10,239 --> 00:37:14,200 Speaker 1: Like... you know who Farrah Fawcett was, Garrison? 715 00:37:14,320 --> 00:37:16,160 Speaker 3: No. Goddamn, what do you think I do? 716 00:37:16,920 --> 00:37:19,160 Speaker 1: I don't know anymore. 717 00:37:20,560 --> 00:37:23,600 Speaker 3: Well, what I did is walk around the rest of Showstoppers. 718 00:37:24,120 --> 00:37:26,279 Speaker 3: I saw this one booth that had, like, a, like, 719 00:37:26,320 --> 00:37:29,320 Speaker 3: an iPhone case with, like, a little, like, keyboard on 720 00:37:29,360 --> 00:37:32,080 Speaker 3: the bottom that, like, plugs in, and I started messing 721 00:37:32,120 --> 00:37:34,400 Speaker 3: around with it, and the guy at that booth walked up 722 00:37:34,440 --> 00:37:35,759 Speaker 3: to me and made fun of me, because he's like, 723 00:37:36,280 --> 00:37:41,360 Speaker 3: you've never, you've never held a phone with... He literally said, like, 724 00:37:41,719 --> 00:37:45,040 Speaker 3: you've never had a BlackBerry before, have you? Like, no. Like, yeah, 725 00:37:45,560 --> 00:37:46,520 Speaker 3: you're typing all wrong 726 00:37:46,600 --> 00:37:46,759 Speaker 6: on that. 727 00:37:47,280 --> 00:37:51,759 Speaker 1: There was a solid nine-day news cycle when Barack Obama, 728 00:37:51,880 --> 00:37:54,600 Speaker 1: newly the president, revealed that he had a BlackBerry. 729 00:37:54,719 --> 00:37:59,160 Speaker 3: I remember that, which sounds like a lifetime ago. 730 00:37:59,200 --> 00:38:02,680 Speaker 1: There was a company called RIM once, and they made a tablet 731 00:38:02,760 --> 00:38:05,640 Speaker 1: that was pretty good, and we only made a couple 732 00:38:05,719 --> 00:38:08,360 Speaker 1: of rim job jokes about it, but it didn't do 733 00:38:08,640 --> 00:38:10,680 Speaker 1: very well, and so I gave it to my dad, 734 00:38:10,760 --> 00:38:13,080 Speaker 1: and accidentally there was still a picture of my dick 735 00:38:13,120 --> 00:38:15,680 Speaker 1: on it. Anyway, that's a story for another day. 736 00:38:17,520 --> 00:38:17,840 Speaker 3: Cool. 737 00:38:19,960 --> 00:38:21,719 Speaker 1: These are the kind of things you get recording at 738 00:38:21,760 --> 00:38:23,520 Speaker 1: eleven fifty-six p.m. 739 00:38:23,400 --> 00:38:28,040 Speaker 3: on a wet Tuesday. Got to get to bed. But no, 740 00:38:28,200 --> 00:38:29,560 Speaker 3: he made fun of me for not knowing how to 741 00:38:29,680 --> 00:38:31,440 Speaker 3: use a smartphone keyboard. 742 00:38:31,640 --> 00:38:32,319 Speaker 1: He did the right thing. 743 00:38:32,400 --> 00:38:33,800 Speaker 3: I don't need to use that, because I have a 744 00:38:33,880 --> 00:38:36,520 Speaker 3: keyboard on my phone built in already. It's much faster.
745 00:38:36,800 --> 00:38:40,160 Speaker 3: So anyway, we stopped at this company that makes, 746 00:38:40,520 --> 00:38:44,360 Speaker 3: well, now just makes the software to use in conjunction 747 00:38:44,480 --> 00:38:48,200 Speaker 3: with augmented reality glasses and any, like, high-powered laptop, 748 00:38:48,320 --> 00:38:50,680 Speaker 3: specifically the laptops that have, like, built-in, like, you know, 749 00:38:50,800 --> 00:38:53,879 Speaker 3: like, copilots, because they require, like, higher processing power. 750 00:38:54,120 --> 00:38:56,880 Speaker 3: They have an NPU or something like that, like a, 751 00:38:57,080 --> 00:39:00,600 Speaker 3: like a neural processing unit is what they're calling it, an AI- 752 00:39:00,800 --> 00:39:05,080 Speaker 3: dedicated GPU thing. Effectively, it allows you to hook up 753 00:39:05,120 --> 00:39:09,400 Speaker 3: these glasses and run, you know, a possibly infinite number of 754 00:39:09,440 --> 00:39:13,279 Speaker 3: monitors using AR. And we talked about this company last 755 00:39:13,360 --> 00:39:15,120 Speaker 3: year because we saw them at Showstoppers. 756 00:39:15,200 --> 00:39:16,759 Speaker 1: You put on the glasses and it's like you've got 757 00:39:16,960 --> 00:39:19,560 Speaker 1: six monitors or whatever that are all full size, and 758 00:39:19,560 --> 00:39:21,040 Speaker 3: it's actually really easy to use. 759 00:39:21,160 --> 00:39:23,960 Speaker 1: It works very well, seamlessly. It's nice. 760 00:39:23,840 --> 00:39:26,359 Speaker 3: It's, it's, it's good quality, easy to use, you can 761 00:39:26,400 --> 00:39:27,320 Speaker 3: move the monitors around. 762 00:39:27,440 --> 00:39:28,919 Speaker 1: It's an excellent, excellent game. 763 00:39:29,000 --> 00:39:30,640 Speaker 3: We talked to them last year, and the main thing 764 00:39:30,719 --> 00:39:32,800 Speaker 3: that was holding this, like, holding us back on it 765 00:39:33,160 --> 00:39:35,239 Speaker 3: is that you needed to use their own proprietary laptop. 766 00:39:35,320 --> 00:39:37,200 Speaker 1: It was their own laptop, and it wasn't a great one. 767 00:39:37,320 --> 00:39:39,600 Speaker 3: It was just, like, a Linux laptop. It didn't have 768 00:39:39,680 --> 00:39:42,240 Speaker 3: everything, like, I want out of my own personal laptop. 769 00:39:42,440 --> 00:39:43,800 Speaker 1: And we were still impressed with it 770 00:39:43,880 --> 00:39:45,640 Speaker 3: then. It was still, it was so good. Yeah, and 771 00:39:45,719 --> 00:39:47,680 Speaker 3: now you can just use any high-powered laptop with 772 00:39:47,760 --> 00:39:51,160 Speaker 3: it, essentially. So it's lovely to see that improved. We 773 00:39:51,239 --> 00:39:55,400 Speaker 3: saw this lovely, like, like, very small foldable projector. 774 00:39:55,760 --> 00:39:58,319 Speaker 1: Oh yeah, that was cool. What's the company? That company name, 775 00:39:58,320 --> 00:39:59,920 Speaker 1: because we should be, we should be giving out the name. 776 00:40:00,520 --> 00:40:03,080 Speaker 3: Yes, the AR glasses and the software system is called 777 00:40:03,200 --> 00:40:07,560 Speaker 3: Spacetop, very good, by a company called Sightful. It works, 778 00:40:07,880 --> 00:40:11,280 Speaker 3: works great. But yeah, this, this little folding, folding projector 779 00:40:11,600 --> 00:40:15,640 Speaker 3: currently has a Kickstarter. The company is called Aurzen. Yeah, 780 00:40:15,920 --> 00:40:20,240 Speaker 3: Aurzen specifically, it was the ZIP trifold projector.
781 00:40:20,920 --> 00:40:22,839 Speaker 3: Right now, it's, it's a 720p very 782 00:40:22,920 --> 00:40:25,360 Speaker 3: small foldable projector. It has a whole, it has, like, 783 00:40:25,520 --> 00:40:28,520 Speaker 3: like, auto focus and auto keystone. They're working to get 784 00:40:28,560 --> 00:40:31,400 Speaker 3: it up to 1080p, but they're running a Kickstarter 785 00:40:31,560 --> 00:40:35,000 Speaker 3: right now to ship in about three months. Super good 786 00:40:35,080 --> 00:40:36,120 Speaker 3: quality stuff. 787 00:40:35,960 --> 00:40:37,920 Speaker 1: If you're a gadget person... like, it felt like a 788 00:40:38,200 --> 00:40:41,279 Speaker 1: quality piece of electronics in my hands. Like, the way 789 00:40:41,360 --> 00:40:45,279 Speaker 1: it, like, snapped when it closed just felt good. I'm, 790 00:40:45,360 --> 00:40:48,280 Speaker 1: I think I'm gonna buy one. Like, it's, it's exactly 791 00:40:48,360 --> 00:40:50,440 Speaker 1: what I want for traveling, which is the ability to, 792 00:40:50,520 --> 00:40:52,400 Speaker 1: it goes up to, like, eighty inches of screen and, 793 00:40:52,520 --> 00:40:55,040 Speaker 1: like, very good resolution. The ability to just have that 794 00:40:55,719 --> 00:40:58,440 Speaker 1: plugged in to a battery or the wall and my 795 00:40:58,600 --> 00:41:01,759 Speaker 1: laptop, and, like, wherever I happen to be, I've got 796 00:41:01,800 --> 00:41:03,719 Speaker 1: a movie screen, and I don't have to worry about 797 00:41:03,719 --> 00:41:05,680 Speaker 1: fucking hooking up a TV to my laptop or 798 00:41:05,719 --> 00:41:06,040 Speaker 1: some shit. 799 00:41:06,200 --> 00:41:08,480 Speaker 3: It doesn't need Wi-Fi to work. It can just 800 00:41:08,600 --> 00:41:10,440 Speaker 3: cast from, from your phone. 801 00:41:10,440 --> 00:41:14,920 Speaker 1: A-U-R-Z-E-N, ZIP trifold projector. Aurzen, yep, yep. 802 00:41:14,960 --> 00:41:16,759 Speaker 1: I think they're selling them for two fifty right now. 803 00:41:17,160 --> 00:41:18,840 Speaker 3: That's for the, for the Kickstarter. 804 00:41:18,920 --> 00:41:20,440 Speaker 1: For the Kickstarter; it will go up a little later on. 805 00:41:20,480 --> 00:41:23,800 Speaker 1: That's a product. But we saw it, it works. They 806 00:41:23,880 --> 00:41:26,600 Speaker 1: had a lot of, they had tracking and stuff, so 807 00:41:26,680 --> 00:41:28,720 Speaker 1: it, like, automatically would focus and shit. 808 00:41:28,560 --> 00:41:32,040 Speaker 3: It auto focuses, and, like, it scales correctly for what it's projecting; 809 00:41:32,760 --> 00:41:35,360 Speaker 3: it automatically, like, adjusts, like, the tilt of it so 810 00:41:35,560 --> 00:41:36,120 Speaker 3: that it, you know. 811 00:41:36,280 --> 00:41:38,520 Speaker 1: Yeah, obviously this isn't the full review, because we don't 812 00:41:38,520 --> 00:41:40,600 Speaker 1: own one, but from everything we could tell by looking 813 00:41:40,640 --> 00:41:42,160 Speaker 1: at it in the moment we tried it out... I 814 00:41:42,239 --> 00:41:43,200 Speaker 1: hooked up my phone to it. 815 00:41:43,280 --> 00:41:45,320 Speaker 3: As it went to my phone screen, I realized I 816 00:41:45,480 --> 00:41:50,640 Speaker 3: have a slightly, I would say, artful nude image of 817 00:41:50,760 --> 00:41:53,680 Speaker 3: an angel I quickly swiped away from. 818 00:41:53,640 --> 00:41:55,760 Speaker 1: You shouldn't show your dick to your... 819 00:41:55,719 --> 00:41:59,480 Speaker 3: It's the damn home screen of my phone. You know, there's gonna be worse.
820 00:41:59,520 --> 00:42:03,719 Speaker 1: Things can always be worse. But I think where we'll end is, 821 00:42:04,000 --> 00:42:06,279 Speaker 1: and this actually is not entirely in order, because this 822 00:42:06,440 --> 00:42:08,920 Speaker 1: is next after we had that conversation with our 823 00:42:08,920 --> 00:42:13,120 Speaker 1: friend Jack Jay, which just left me thinking about, like, 824 00:42:13,719 --> 00:42:16,080 Speaker 1: some people aren't really people, right? That's kind of what 825 00:42:16,160 --> 00:42:16,759 Speaker 1: the soul thing is. 826 00:42:16,960 --> 00:42:19,520 Speaker 3: It is a sham. It's all, it's all performance. 827 00:42:19,640 --> 00:42:20,400 Speaker 3: It's soulless. 828 00:42:20,880 --> 00:42:22,840 Speaker 1: We immediately walk over and we just kind of, like, 829 00:42:23,000 --> 00:42:26,480 Speaker 1: randomly turn a corner, and there's, like, a human shin, 830 00:42:27,080 --> 00:42:31,800 Speaker 1: like, tibia and fibula, basically, with, like, a carbon fiber, you know, 831 00:42:32,239 --> 00:42:34,719 Speaker 1: frame around it that's roughly the shape of, like, a 832 00:42:35,000 --> 00:42:40,200 Speaker 1: person's lower leg, and it's called Bio Leg. 833 00:42:40,360 --> 00:42:44,600 Speaker 1: It's a powered microprocessor knee made in Japan. It 834 00:42:44,719 --> 00:42:48,280 Speaker 1: is a prosthetic, but unlike most prosthetics, it is powered 835 00:42:48,400 --> 00:42:50,759 Speaker 1: and has a muscle built into it, so, like, when 836 00:42:50,760 --> 00:42:52,640 Speaker 1: you lift up your prosthetic, it doesn't hang and it 837 00:42:52,719 --> 00:42:55,360 Speaker 1: doesn't lock. It actually has a degree of motion, and 838 00:42:55,440 --> 00:42:57,520 Speaker 1: it feels what lifts the rest of the leg, 839 00:42:57,719 --> 00:43:01,680 Speaker 1: your remaining muscles. Like, it measures, like, 840 00:43:01,960 --> 00:43:04,000 Speaker 1: it can, like, take measurements from them, and it can 841 00:43:04,200 --> 00:43:07,719 Speaker 1: act intelligently based on that. And I know that it 842 00:43:07,880 --> 00:43:11,200 Speaker 1: works, because the inventor was there, and he was a 843 00:43:11,280 --> 00:43:13,840 Speaker 1: man who was missing his leg below the knee and 844 00:43:14,000 --> 00:43:15,680 Speaker 1: had built this for himself. 845 00:43:16,360 --> 00:43:18,560 Speaker 3: He spent, like, ten years working on this. Yeah, eight years, 846 00:43:18,600 --> 00:43:19,120 Speaker 3: he said. 847 00:43:18,960 --> 00:43:22,120 Speaker 1: Eight years. And that's, like, really the thing that is, 848 00:43:23,000 --> 00:43:26,000 Speaker 1: like, so both, like, addictive and also, like, this, like, 849 00:43:26,200 --> 00:43:28,120 Speaker 1: very tonal whiplash you get at CES, as you will 850 00:43:28,160 --> 00:43:31,120 Speaker 1: go from, like, this dead-eyed con man trying to 851 00:43:31,200 --> 00:43:33,200 Speaker 1: scam the world so he can do god knows what 852 00:43:33,360 --> 00:43:37,080 Speaker 1: kinds of other harms, with absolutely nothing, nothing inside him 853 00:43:37,080 --> 00:43:39,800 Speaker 1: at all, and then, "I lost my leg and I 854 00:43:39,920 --> 00:43:43,480 Speaker 1: built a better prosthetic to help the entire world," and 855 00:43:43,640 --> 00:43:47,360 Speaker 1: there's, like, thirty seconds between those two experiences. 856 00:43:46,880 --> 00:43:49,680 Speaker 3: And, like, that's, that's, like, the dark magic of CES.
857 00:43:50,000 --> 00:43:53,320 Speaker 3: Like, I don't, like, I'm not, like, anti-tech. Like, 858 00:43:53,400 --> 00:43:56,720 Speaker 3: I think, there... I think technology can really improve people's 859 00:43:56,800 --> 00:43:59,400 Speaker 3: lives if used well. And sometimes I get kind of 860 00:43:59,440 --> 00:44:02,520 Speaker 3: blackpilled walking around CES, but then we'll stumble 861 00:44:02,600 --> 00:44:05,879 Speaker 3: across this, like, you know, someone who, like, literally lost 862 00:44:05,920 --> 00:44:09,280 Speaker 3: a leg and made themselves their own better leg. 863 00:44:09,440 --> 00:44:12,200 Speaker 1: Eight years figuring out how to do this. Yeah, it's 864 00:44:12,239 --> 00:44:12,880 Speaker 1: winning awards. 865 00:44:12,960 --> 00:44:15,680 Speaker 3: It's, like, award-winning, like, tech innovation. 866 00:44:15,880 --> 00:44:18,279 Speaker 1: It's changing your... as a person who has lost your 867 00:44:18,320 --> 00:44:20,239 Speaker 1: lower leg, like, changing being able to, like, have a 868 00:44:20,320 --> 00:44:24,920 Speaker 1: normal gait and balance again. Like, massive potential to improve 869 00:44:25,040 --> 00:44:26,640 Speaker 1: people's lives as a result of this. 870 00:44:27,640 --> 00:44:32,560 Speaker 3: Yeah, just steps away from AI ska and AI Donald 871 00:44:32,680 --> 00:44:36,759 Speaker 3: Trump mask man. 872 00:44:37,040 --> 00:44:40,839 Speaker 1: The company is, again, BionicM, and it's the Bio Leg. 873 00:44:40,960 --> 00:44:41,840 Speaker 3: Bio Leg is the product. 874 00:44:41,960 --> 00:44:44,120 Speaker 1: Yeah, the Bio Leg is the product, by BionicM. 875 00:44:44,280 --> 00:44:46,160 Speaker 3: I'm going to try to check it out more tomorrow 876 00:44:46,400 --> 00:44:48,680 Speaker 3: at Eureka Park, which at this point, you know, that'll 877 00:44:48,719 --> 00:44:52,440 Speaker 3: be, like, maybe future episodes coming next week. But I 878 00:44:52,480 --> 00:44:55,800 Speaker 3: guess this closes our actual, like, like, week of coverage. 879 00:44:56,440 --> 00:44:58,919 Speaker 1: Let's go get fucked up and eat Japanese food. 880 00:44:59,239 --> 00:45:01,640 Speaker 3: Oh, I'm here, I'm down, let's do it. 881 00:45:05,120 --> 00:45:07,600 Speaker 1: It Could Happen Here is a production of Cool Zone Media. 882 00:45:07,800 --> 00:45:10,800 Speaker 9: For more podcasts from Cool Zone Media, visit our website 883 00:45:10,920 --> 00:45:14,439 Speaker 9: coolzonemedia dot com, or check us out on the iHeartRadio app, 884 00:45:14,560 --> 00:45:17,400 Speaker 9: Apple Podcasts, or wherever you listen to podcasts. 885 00:45:17,920 --> 00:45:19,799 Speaker 1: You can now find sources for It Could Happen Here 886 00:45:19,880 --> 00:45:21,640 Speaker 1: listed directly in episode descriptions. 887 00:45:22,040 --> 00:45:22,800 Speaker 2: Thanks for listening.