Guess what, Will.

What's that, Mango?

So I know everyone loves to make that joke, like, it's two thousand eighteen, where's my jet pack? Everyone was promised a jet pack.

Yeah, where's my flying car? I'm still curious where exactly that is.

But the stuff I want is the stuff we actually reported on. I mean, in the last fifteen years we've talked about things like contact lenses you wear at night that adjust your eyes, so that you take them out in the morning and have perfect vision through the day, which is cool, right? Or I remember writing about peanut butter and jelly slices, so you can just slap those on bread for faster sandwiches.

I mean, I was with you, but I'm not sure about the peanut butter thing. I feel like spreading peanut butter on bread doesn't take that much time or effort.

I know, but I can't outsource that job to my kids. Plus, you know, when I'm rushing out in the morning, I don't want to leave a dirty butter knife in the sink, which I realize, as I'm saying it, is a very minor gripe. But that's the kind of future I want to live in.

I mean, my favorite part about a PB&J is actually licking the peanut butter off the knife afterwards. But anyway, I actually looked up these newspaper predictions for the future for this episode, and there's this article from, I think it was nineteen hundred, in the Boston Globe, and the author makes all these predictions about the year two thousand: no pollution, moving sidewalks, and for some reason he thinks there will be an AM and a PM newspaper, which, I mean, I guess there is, if you consider the round-the-clock reporting on the Internet. But the thing I loved about it was that he refused to make any predictions about the weather, because he assumed that, you know, a hundred years from then it would still be impossible to predict.

Speaking of genius, we've actually got Zach Weinersmith on the program today.
He's the co-author of the book Soonish, and it's basically the super fun textbook of the future. It's almost more a guide to the hurdles we'll face, plus a status check on how things like asteroid mining or nuclear-powered toasters are coming along. And I'm already doing too much talking. Let's get him on the line.

Hey there, podcast listeners, welcome to Part Time Genius. I'm Will Pearson, and as always I'm joined by my good friend Mangesh Hattikudur, and sitting behind the soundproof glass, playing with his Doc Brown bobblehead...

I didn't know Tristan was a Back to the Future fan.

Well, apparently he only likes the first and the third one, so do not talk to him about Back to the Future Part II. He gets really emotional. But that's our pal and producer Tristan McNeil. Now, I want to jump right into this interview. Today we've got a very special guest, the super funny cartoonist and the co-author of the book Soonish: Ten Emerging Technologies That'll Improve and/or Ruin Everything, Zach Weinersmith. Welcome to Part Time Genius.

Thanks!

Now, Zach, we should mention up front that this is a book that you wrote with your super smart wife, Kelly, who's a professor at Rice. Can you remind us what she does at Rice?

Her research pertains to parasites that manipulate host behavior. So if your audience is familiar with any of this, they've probably heard of toxoplasmosis, which perhaps manipulates a lot of human behavior.

Cats, right? In cat litter?

Yes, yeah, precisely. Lots of us have it. I actually don't know if I have it; I probably do, I grew up with cats. But yeah, so there are a number of claims about it, like that it reduces reaction time, which you can kind of understand if you were the parasite and you were trying to make a mouse get eaten by a cat. It makes mice less averse to the smell of cat urine. There are some studies.
Kelly knows this stuff better than I do, but my understanding from talking to her is that there are some studies that try to correlate this stuff to humans, and they're very tantalizing. But obviously you're not allowed to, like, get a hundred humans and then put parasites in them and see what changes.

Meanwhile, Kelly's at work right now doing exactly that.

Yeah, we don't talk about that. So this book, Soonish, is so fun and funny, especially the nota bene sections, which are just so great. But can you tell us why you chose to write a book about the future, and why you concentrated on the ten subject areas that you did?

Sure. So we were kind of just interested in future stuff. And also we both had this thing where we're sort of generally interested in space travel and realized we know really nothing about it. The ten technologies in particular, we got into sort of a process of cutting away. We actually started with a list of fifty, and some got cut because they turned out to just not really exist. We thought maybe it was a thing, and it's not a thing.

You have to give an example of that.

Like, we thought, hey, is anyone working on, like, perfected economic forecasting or something like that? Is there going to be some future, you know, supercomputer device? I mean, there are things that kind of do related stuff, but no one's trying to forecast it the way, like, the weather gets forecast, right? So, yeah, it was sort of whittling away to stuff that we thought we could do properly in five to ten thousand words and which was interesting, and also stuff that we thought maybe people hadn't heard about.
That's not true of every chapter, but stuff like programmable matter I think most people have not heard of; maybe synthetic biology most people have not heard of; and I think probably most of the cheap-access-to-space technology stuff, with the exception of a couple of things, is probably not on people's radar.

Yeah, and we definitely want to ask you about a few of those things. I will say one of our favorite sections was the note on the end for humanity, and, you know, why you and Kelly are pessimistic about betting on humans if robots ever revolt. So can you tell us a little bit about Promobot and Gaia specifically?

Sure. Promobot we kind of just threw in because it's cute. Maybe let me sort of set the table: it's this Russian robot designed to assist people, like to assist the elderly. So the idea is basically elder care. There aren't a lot of people who want to do it, it's not well-paid work, and it would be really nice if you could have a machine do a lot of the basic tasks of elder care. And so there's this robot a Russian company is working on called Promobot (I'm sure I'm pronouncing that wrong), but anyway, it tried to escape from the facility where it was being worked on, apparently twice, and one time it escaped and, like, died in traffic, which presumably wasn't, like, a choice, but who knows.

Gaia is this robot made by Serena Booth, who was doing an undergraduate thesis at Harvard. And the basic deal with Gaia is that it's just a dumb remote-control robot, and she would secretly control it, so to an onlooker it would look, for all you know, like a startup's autonomous robot deal, because it's Boston, after all. And the robot would go up to students near their dorm rooms, or their dorms, and say, will you let me into the building? Which is a big no-no.
It's an especially big no-no at Harvard, because Harvard is, you know, world famous, so there are lots of looky-loos and weirdos trying to get into buildings. And actually, fortuitously, so to speak, around the time of the experiment there had apparently been bomb threats. I forget, the numbers are in the book, but it's something like twenty percent of students would let the robot in, which is, you know, not a high number, but remember, it's Harvard students, so they're supposed to be ultra-geniuses. And furthermore, if you're trying to cause some trouble and you only have to try five times, you know, that's not great odds.

But the really interesting thing, and there are a number of permutations of this, and there's a different set of experiments by another researcher, but the one we were really interested in was, she did an experiment where all she did was load the robot with cookies. Like, she just got a box of cookies and put them on top, and the robot would go up and say, I'm here delivering cookies, will you let me in? And it shot up to, I forget the exact numbers in the book, but it's something like three quarters of students would let that robot in.

So, I mean, it's sort of a funny experiment, but there is a sort of serious side to it, which is, the way we like to put it in talks we've given on this is, essentially, if you look at movies we humans make about what it would take to fool us into obeying the robots, the robots always have to really put out, right? Like build an exoskeleton with, like, stem cells or I don't know what. Like, the Terminator looks exactly like a human except it's got a metal endoskeleton and this and that. Like, they built a T-1000, this impossibly complicated robot, just to look kind of human. It just turns out we'll obey, like, a trash can with cookies on it.

Yeah, yeah.
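(A quick sketch of the arithmetic behind the "you only have to try five times" worry, using the approximate percentages quoted above; the exact figures are in the book, and this assumes each attempt is independent.)

```python
# Rough arithmetic behind the "only have to try five times" point.
# The 20% and 75% figures are the approximate ones quoted above; the exact
# numbers are in the book. Assumes attempts are independent.
p_plain = 0.20      # chance a student lets the plain robot into the dorm
p_cookies = 0.75    # chance once the robot is carrying cookies

def chance_of_getting_in(p, attempts):
    """Probability that at least one of `attempts` tries succeeds."""
    return 1 - (1 - p) ** attempts

print(f"plain robot, 5 tries: {chance_of_getting_in(p_plain, 5):.0%}")    # ~67%
print(f"cookie robot, 1 try:  {chance_of_getting_in(p_cookies, 1):.0%}")  # 75%
```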
And so it is, it is kind of, I don't know, I suppose, ominous, or, if you think about it hard, it should cause you to rethink your relationship to machines a bit.

Yeah. And, well, cookies.

It is kind of analogous to something like, I think a lot of us have been rethinking our relationship with Facebook, right? Which is kind of the same deal. It's giving us something cool, which is the sort of proverbial cookies, and in exchange it's taking something. Or, I know, for example, I used to second-guess my GPS, and I discovered literally every time I did that, I was wrong. The machine is always right, and so now I will obey the GPS, even if it appears to be doing something stupid. To me, I've completely gotten rid of my, like, regard for self as it pertains to mapping.

It is a little ominous. I mean, you don't even have to posit sentient machines to be scared; it could be just some bad actor.

Well, let's talk a little bit about the positive applications. Sticking with robots and programmable matter, I'm really fascinated by the benefits in medicine, and then also in the home, that you talked about. Could you explain that a little bit to our listeners?

So the basic idea with programmable matter is something like this. You know, the neat thing about a computer is that a computer is not just one machine, it's potentially infinite machines. Your computer does almost everything for you; that's why your phone can be a radio and a clock and, well, a phone too, occasionally. And the idea with programmable matter is to be able to do that sort of thing, but with stuff. So it's like if you had a sort of object that could be a cup, or it could be a phone, or it could be anything else. And the way we like to describe it is, it's like having the T-1000, only it's not trying to kill you.
So obviously that level of stuff, that sort of, like, shimmering goo that turns into whatever, that's really far off, if it's even possible. But there are lots of intermediate versions. And so one idea is what are called origami robots. If you think about it in terms of universality, origami is basically this thing where you take a piece of paper, and the piece of paper can be almost infinite things. It might be literally infinite, I'd have to do the math on it, but it's something like infinite, right? You can get many different shapes out of one thing, and many of them could be useful. And so one idea is you could do origami robots that do stuff in the body.

So the really cool version of this that we were interested in was done by a woman named Daniela Rus, or done by her lab at MIT, and the basic idea was, you have a pill, you swallow the pill. The pill's made of ice, but in it is this little, it's actually a piece of sausage casing with some wiring in it, so that it can fold along preprogrammed axes. And you swallow it, it melts while it's in your gut, and now there's this little machine that you can manipulate remotely, that can fold into different configurations. And, you know, the sort of future optimal version is completely self-powered and could do all sorts of cool stuff; in this version, I think it was controlled by, like, a magnet you have outside the body. But anyway, why would you want this? Why would you want an origami robot in your guts? Well, it turns out that in America, something like thirty-five hundred of us every year swallow a watch battery, which then gets lodged in our guts.

And I hasten to add, that's just the people who report it and who didn't pass it naturally. So there's, like, a shockingly high number of people swallowing watch batteries. I don't know why.
Presumably most of them are children, but we didn't dig into that. So the idea is you just have a little machine that walks in, kind of grabs the battery, and then the whole thing passes right out, as nature intended. This one's about the size of a postage stamp; you can imagine ones that are much more miniaturized. And why that would be exciting is something like this: you know, if you have a headache, it's not like your whole body hurts, but you take a pill and it lowers the inflammation in your whole body, right? So aspirin is good whether you have, like, a banged-up knee or a headache or what have you. Ideally, it'd be nice if you could get machines to deliver the medicine to a specific location. And, you know, with aspirin it's not a big deal, but if you imagine, like, really toxic chemo medication, it would be really nice if you could dose only the necessary areas. So that's a sort of very futuristic application of this sort of thing. It could be really interesting.

I think you asked about programmable homes. I'm trying to remember; I think there are a couple of different versions of that we talked about.

I think the thing I was fascinated most about was just that, because I live in a tiny apartment in Brooklyn, the fact that, you know, your whole room could transform so quickly, or your furniture can assemble and unassemble, and things like that. That's all just amazing to me.

Yeah, it's funny, because programmable matter, we call it that, but it actually goes under a lot of different names. But yeah, there's a lot more sort of, you know, macro stuff that's really interesting. There's a group we talked about called Roombots, and Roombots are basically these cube-like things about the size of a baseball.
It's sort of a cross between a cube and a circle, so it can sort of roll around, and it has a little sensing ability and it can dock with other versions of itself as it needs. Then they can self-assemble into, like, chairs or tables or whatever. So if you really wanted to be, you know, Swedish about your room, you could just have a pile of these robots in the corner and you could say, assemble into my kitchen setup, or whatever, whenever you need it. Of course, they're very slow right now, but you can imagine a future, right, where it doesn't take, you know, ten hours.

I don't remember if we put this in the book, but actually people have worked on systems of Roombots that mate with each other to produce, like...

Terrifying.

Yeah, right. It's like implementing genetic algorithms in the real world.

Dogs everywhere are shivering at the prospect.

It's really slow. And again, I don't remember if we mentioned this in the book, but there's one group that actually did basically, like, machine breeding via a 3D printer, if that makes sense: you have two machines that, quote unquote, agree to mate, in whatever sense machines do that, and then you have a 3D printer sort of print the outcome. And what was funny is that, for whatever reason, the group decided to have the robots kind of rub on each other, but, like, completely needlessly.

I don't know, it just gets lonely working on these projects.

I think these people are kind of fulfilling other needs.

But speaking of multiple robots, what about these robot swarms and robots that work in teams? Can you talk a little bit about the applications there?

Yeah, robot swarms. So we talked about them in two contexts. A robot swarm is what it sounds like, it's a swarm of robots, and you might ask yourself, why would I want that instead of one robot, or one large robot?
So there are a 310 00:14:39,960 --> 00:14:43,160 Speaker 1: couple of contexts where might be valuable. One is suppose 311 00:14:43,280 --> 00:14:46,120 Speaker 1: you UM have some kind of disaster areas, so there's 312 00:14:46,120 --> 00:14:48,800 Speaker 1: like a nuclear accident, or there's just a lot of 313 00:14:48,840 --> 00:14:50,720 Speaker 1: rubbles something where you wouldn't want to send people in. 314 00:14:51,400 --> 00:14:53,520 Speaker 1: UM with a store of robots, you can have a 315 00:14:53,520 --> 00:14:55,840 Speaker 1: lot of small cameras sort of going all over the 316 00:14:55,840 --> 00:14:58,240 Speaker 1: place to explore an area. And also if one breaks, 317 00:14:58,280 --> 00:15:01,280 Speaker 1: it's less of a big deal. Yeah. In addition, UM, 318 00:15:01,640 --> 00:15:03,480 Speaker 1: you can design them. There's some groups that work on this. 319 00:15:03,600 --> 00:15:05,520 Speaker 1: You can design them so they can do stuff with 320 00:15:05,560 --> 00:15:08,360 Speaker 1: each other, so they can their modular So you know, 321 00:15:08,360 --> 00:15:10,600 Speaker 1: there's one group we looked at where they had this 322 00:15:10,600 --> 00:15:12,720 Speaker 1: this clever trick where I suppose you have You can 323 00:15:12,760 --> 00:15:14,720 Speaker 1: imagine a little tiny robot about the size of your 324 00:15:14,720 --> 00:15:16,800 Speaker 1: fist that's rolling around and it comes across a chasm 325 00:15:16,840 --> 00:15:19,240 Speaker 1: it can't cross on its own. It then signals like 326 00:15:19,360 --> 00:15:21,320 Speaker 1: five other of the same robot to dock with it 327 00:15:21,360 --> 00:15:23,160 Speaker 1: to form a sort of train and now it can 328 00:15:23,240 --> 00:15:25,400 Speaker 1: go over or for example, of it has to cross 329 00:15:25,520 --> 00:15:29,000 Speaker 1: like on a narrow passage, you can configure so that 330 00:15:29,040 --> 00:15:31,960 Speaker 1: it has the sort of robot equivalent of holding two 331 00:15:32,080 --> 00:15:34,320 Speaker 1: arms out to the sides. Uh and so it can 332 00:15:34,360 --> 00:15:36,120 Speaker 1: balance the stuff like that. So you can imagine it 333 00:15:36,480 --> 00:15:39,000 Speaker 1: was some fairly simple algorithms that can navigate all sorts 334 00:15:39,000 --> 00:15:42,160 Speaker 1: of terrain more effectively. The other context in which we 335 00:15:42,160 --> 00:15:44,920 Speaker 1: talked about robot sworms is we have trapped on what's 336 00:15:44,920 --> 00:15:47,120 Speaker 1: called robotic construction, which is again what it sounds like, 337 00:15:47,200 --> 00:15:49,800 Speaker 1: it's machines sort of taking over the work of construction 338 00:15:49,800 --> 00:15:52,280 Speaker 1: workers or even extending it. The opposite of a swarm 339 00:15:52,320 --> 00:15:53,800 Speaker 1: would be this this one idea, which is to have 340 00:15:53,840 --> 00:15:55,960 Speaker 1: a sort of giant gantry, like basically a huge three 341 00:15:56,000 --> 00:15:58,200 Speaker 1: D printer that kind of prints out a house, which 342 00:15:58,240 --> 00:16:00,400 Speaker 1: is cool, but like might be limb to say, if 343 00:16:00,400 --> 00:16:01,880 Speaker 1: you wanted to build in New York. That's a little 344 00:16:01,920 --> 00:16:04,320 Speaker 1: tough to do. A robot swarm would be building upward, 345 00:16:04,360 --> 00:16:06,160 Speaker 1: kind of like termites do. 
Right. So you have, like, a lot of little robots delivering small amounts of material and sort of coordinating with each other. And so it would be sort of an interesting, different way to do it, like, pretty revolutionary, for better and worse. But to me, that seems kind of plausible, because the idea there is, essentially, what might happen is, like, a semi shows up and it's loaded with, you know, a thousand or whatever little robots, and they come out and they slowly assemble your house, kind of like a team of ants. And again, it has the same virtue that if one breaks or does something screwy, it doesn't matter; it's just like an ant colony, one dies, it's no big deal. And they can do this trick of building up structures. They're very simple, they're not ready for prime time, they can't put in things like plumbing, etcetera, but it's really cool to see it done. They can build structures and they can even climb up the structures to add more concrete. It's a really cool potential future paradigm for construction.

Wow, I kind of want to see this. Are there videos of this?

There totally are. We have a good section in the construction chapter about swarm robotics. There are a bunch of different ways it might go down if this is the paradigm we choose, and there are also reasons it might not work. You know, we talked about a guy named Justin Werfel, who's sort of a friend of ours now, who studies termites to kind of see how they do it, because the neat thing with termites is that no particular termite knows what the structure is supposed to look like. There's no blueprint system; they have some sort of algorithm they're running, or however you want to say it. And so it's sort of fascinating. I learned a new word. I don't remember if we put it
I don't remember if we put it 380 00:17:31,520 --> 00:17:33,359 Speaker 1: in the book, but the word of stigmer g, which 381 00:17:33,400 --> 00:17:36,200 Speaker 1: is embedding information in an environment, which is really interesting 382 00:17:36,200 --> 00:17:37,760 Speaker 1: because you say, how to termites do this? Well, one 383 00:17:37,760 --> 00:17:39,960 Speaker 1: thing they can do is they can leave a chemical 384 00:17:39,960 --> 00:17:41,880 Speaker 1: trail that says something like don't go here or don't 385 00:17:41,920 --> 00:17:43,720 Speaker 1: build here. Um, I don't actually know what. You know, 386 00:17:43,800 --> 00:17:46,639 Speaker 1: literally every termite chemical trail does. But they can leave 387 00:17:46,720 --> 00:17:48,520 Speaker 1: chemical trails. And it's interesting because they don't even know 388 00:17:48,600 --> 00:17:50,360 Speaker 1: I have to know what the trail does, right, They're 389 00:17:50,400 --> 00:17:52,879 Speaker 1: just acting out part of this process of building. It's 390 00:17:52,880 --> 00:17:55,280 Speaker 1: sort of fascinating. Now there's so much more we want 391 00:17:55,320 --> 00:17:57,399 Speaker 1: to ask you about, from the future of poco sticks 392 00:17:57,440 --> 00:18:00,520 Speaker 1: to the important of breathing through both your nostril. But 393 00:18:00,600 --> 00:18:02,440 Speaker 1: before we get to that, let's take a quick break. 394 00:18:16,080 --> 00:18:18,000 Speaker 1: So I want to switch gears a little bit and 395 00:18:18,000 --> 00:18:20,399 Speaker 1: and talk a little bit about travel to space, because 396 00:18:20,440 --> 00:18:22,520 Speaker 1: you know that this week we watched the SpaceX launch 397 00:18:22,560 --> 00:18:24,240 Speaker 1: and one of the things you talk about in your 398 00:18:24,240 --> 00:18:27,520 Speaker 1: book is why cheap travel to space will be important. Um, 399 00:18:27,920 --> 00:18:29,760 Speaker 1: can you talk a little bit about why that is 400 00:18:29,880 --> 00:18:32,720 Speaker 1: and what's the real hindrance right now? And also can 401 00:18:32,800 --> 00:18:35,160 Speaker 1: this idea of a giant pogo stick or super gun 402 00:18:35,320 --> 00:18:38,119 Speaker 1: really help us get there? Yeah? Good, Okay, So the 403 00:18:38,160 --> 00:18:40,720 Speaker 1: first question was why does it matter? I actually I 404 00:18:40,800 --> 00:18:42,679 Speaker 1: want to punch a little of the mattering thing. Actually, 405 00:18:42,840 --> 00:18:44,960 Speaker 1: I feel bad because I've had We've been on economics 406 00:18:44,960 --> 00:18:46,959 Speaker 1: podcasts and they'll say, like, so, what's the economic benefit 407 00:18:47,000 --> 00:18:48,879 Speaker 1: of space? And I'm like, I just don't know. I 408 00:18:48,880 --> 00:18:50,639 Speaker 1: don't know if there is one. I'm giving you some 409 00:18:50,720 --> 00:18:53,119 Speaker 1: arguments that might work. There are minerals in space, it 410 00:18:53,200 --> 00:18:54,760 Speaker 1: might be valuable. The question you always have to ask 411 00:18:54,760 --> 00:18:56,520 Speaker 1: yourself is would it be cheaper to just dig a 412 00:18:56,560 --> 00:19:00,680 Speaker 1: big hole on Earth, and often it probably isn't more valuable. 413 00:19:00,720 --> 00:19:02,840 Speaker 1: I I really think the value of going to space 414 00:19:02,880 --> 00:19:05,040 Speaker 1: you have to be a bit poetic about it. 
I think Carl Sagan wrote a bit about how it just kind of satisfies the fundamental human urge to explore. And then, you know, an economist I talked to pointed out, well, the deal is we don't know; there could be developments we're not imagining yet in these other environments. So who knows. But I think we just kind of want to do it.

And the basic deal is we can't now, mostly because it's expensive. You know, people will say we don't fund NASA enough, and I think there's a good argument there, but NASA is funded about half as much as in the glory days of Apollo, and a lot of technology is a lot cheaper. So I don't think it comes down to funding. It comes down to the general extreme difficulty and expense of going to space, and to the fact that, until very recently, the technology had not developed that much. Because it's mostly, you know, it's mostly governmental technology, and I'm not in any way against governmental science research, but there wasn't that drive to get costs down that you get with, you know, companies like SpaceX that have a bottom line.

And so I would say the best argument in favor of cheap spaceflight is basically that it's awesome. It would be amazing, it would be wonderful. I really think there's no stronger argument for it. And there's also, sort of, let's find out what we'll do in space; we don't really know what we're going to do up there. If you want to be a little more cynical, there are incredibly strong military applications. You know, if you go to geosynchronous orbit and you can drop a rock from space, just a hunk of metal, say, it's like dropping a nuclear weapon. It's actually worse than a nuclear weapon, because it's very hard to stop, because it's just Newton throwing the piece of metal down at you.
It's not, you know, a little ticklish nuclear warhead that could be defused. So there are tremendous military applications. There are also communication satellite applications; that's a little less sexy, but it's, you know, important. So those are some arguments. I do think, if you want to talk about why it's valuable to go to Mars, the only legit argument, I think, is that we want to do it. And in a way, I mean that seriously, it's sort of a glorified art project. It just sort of satisfies some desire we have.

There are two things that are really lousy about taking a rocket to space in most cases. So the first thing that's lousy is that, in every company but SpaceX, we throw away the rocket. And it's actually pretty close in terms of price point: imagine you wanted to fly from London to Los Angeles, and imagine the only way to do that was to get in the plane, fly until you're over Los Angeles, then jump out with a parachute while the plane crashes into the ocean. That is essentially what we do with rockets. These rocket launches cost, at the very cheapest, like SpaceX rates, you're talking like sixty million dollars, and that's for a pretty ho-hum mission, just putting, like, satellites into low Earth orbit. And it's about the same for planes: if you had to crash the plane every time you went to London, you'd probably have to pay a million dollars a seat, maybe more.

So part of why it's so exciting that SpaceX can land part of its rocket (it can't land the whole thing yet, but maybe one day) is that that's the major part of the cost. The fuel is minimal, right? So we said sixty million, and it's generally more than that, but let's say sixty million: the cost for fuel is under a million dollars.
I don't know what the cost for, like, staff is, and we don't know yet what the cost of refurbishment is, or, even if SpaceX knows it, I don't think they're telling yet. But it's like planes, right? Your plane ticket from London is, you know, maybe a thousand bucks instead of the million bucks it would be, and you could do the same for space travel. It's kind of wild, right? So, as a first approximation, imagine you get the price of all space stuff down by a factor of ten. Take any space mission you care about and multiply it by, say, a factor of two or three, so instead of two dudes on the moon, it's four or five dudes, or, well, four or five ladies. So that's just awesome, that's great, and nothing is bad about that.

The other thing that's standing in the way with rockets, and people don't realize this, is that if you look at a rocket sitting on the pad ready to launch, something like eighty percent of it by mass is propellant, the fuel and oxidizer, the stuff you're gonna burn in order to go up. Another sixteen, seventeen percent or so is just the machine itself, the metal and plastic and silicon that go into, you know, going up. And the stuff you might call actually going to space is, on a really efficient low Earth orbit mission, about three and a half percent, right, the little tippy-top. So it's essentially like you have to launch a skyscraper into space to put the first floor into space, which, you might imagine, is pretty inefficient. There are also some environmental concerns, but in terms of cost it's pretty dramatic. So, as a way to think about it, if you could just get the fuel usage from eighty percent down to, you know, seventy-six and a half percent, you've cut the cost in half for any cargo going up. The tricky part is, rockets are pretty optimized, to my understanding; you know, it's just the rocket equation.
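(To make the arithmetic concrete, here is a rough sketch using the figures quoted in this exchange: roughly sixty million dollars a launch, under a million in fuel, and a payload fraction of about three and a half percent. The pad mass is a made-up illustrative number, and real vehicles vary.)

```python
# Back-of-the-envelope version of the numbers discussed above. The $60M
# launch price, <$1M fuel bill, and ~3.5% payload fraction are the rough
# figures from the conversation; the pad mass is a made-up example.
launch_price = 60e6        # dollars, a cheap expendable-style launch
fuel_cost = 1e6            # dollars, roughly the propellant bill
pad_mass = 550_000         # kg, illustrative rocket sitting on the pad
payload_fraction = 0.035   # ~3.5% of pad mass actually reaches low Earth orbit

payload_kg = pad_mass * payload_fraction
print(f"cost per kg to orbit:    ${launch_price / payload_kg:,.0f}")

# Shift 3.5 points of mass from propellant to payload (3.5% -> 7%) and the
# same launch lifts twice the cargo, so cost per kilogram is cut in half.
print(f"same launch, 7% payload: ${launch_price / (pad_mass * 0.07):,.0f}")

# And if reuse meant you mostly paid for fuel (the airliner analogy), the
# ceiling on improvement is dramatic.
print(f"fuel-only cost per kg:   ${fuel_cost / payload_kg:,.0f}")
```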
It's it's 520 00:23:44,680 --> 00:23:46,719 Speaker 1: pretty Newtony, and it's it's there's not a whole lot 521 00:23:46,760 --> 00:23:48,560 Speaker 1: you can do to make it better. There's some wacky ideas. 522 00:23:49,000 --> 00:23:50,480 Speaker 1: Um but I'll get to the pogo stick in a 523 00:23:50,520 --> 00:23:53,879 Speaker 1: moment um. But but but so, uh I can I 524 00:23:53,920 --> 00:23:55,399 Speaker 1: can get a lot more into that. But but so, 525 00:23:55,560 --> 00:23:57,719 Speaker 1: like a lot of the proposals for how to get 526 00:23:57,800 --> 00:23:59,760 Speaker 1: to space without using a rocket or in one way 527 00:23:59,840 --> 00:24:02,119 Speaker 1: or an they're essentially fighting this problem of the idea 528 00:24:02,200 --> 00:24:03,840 Speaker 1: that most of what you're doing when you go to 529 00:24:03,880 --> 00:24:06,439 Speaker 1: space is lifting fuel that you're gonna burn on your 530 00:24:06,440 --> 00:24:08,280 Speaker 1: way to space, and not stuff that you want to 531 00:24:08,359 --> 00:24:10,720 Speaker 1: go to space. Right, I hasten to add to for 532 00:24:10,800 --> 00:24:13,000 Speaker 1: like Apollo going just to the moon, just the moon, 533 00:24:13,080 --> 00:24:15,080 Speaker 1: not very far at all in a solar system sense, 534 00:24:15,200 --> 00:24:17,520 Speaker 1: that was about one point five percent stuff going to space. 535 00:24:18,040 --> 00:24:22,280 Speaker 1: Um so the Poco stick. So how might you improve. Well, 536 00:24:22,440 --> 00:24:25,080 Speaker 1: so remember I said, only let's say three percent is 537 00:24:25,080 --> 00:24:28,119 Speaker 1: going it's just for efficiency and math. So that means 538 00:24:28,200 --> 00:24:31,399 Speaker 1: if you just get three percent more efficiency, you double 539 00:24:31,720 --> 00:24:33,480 Speaker 1: the amount of cargo, you cut the price in half. 540 00:24:33,720 --> 00:24:37,080 Speaker 1: So well, so part of the fuel, as you might imagine, 541 00:24:37,200 --> 00:24:41,560 Speaker 1: is just getting the like the initial launch, because right 542 00:24:41,640 --> 00:24:44,840 Speaker 1: when you launch, most of what you're lifting is other fuel. Right. Um, 543 00:24:44,880 --> 00:24:47,800 Speaker 1: so if you could spare yourself just a little of 544 00:24:47,840 --> 00:24:50,360 Speaker 1: that need to get that fuel up to speed, you 545 00:24:50,359 --> 00:24:52,520 Speaker 1: you get like that. Our guy we talked to it 546 00:24:52,600 --> 00:24:54,960 Speaker 1: NASA said you could get like one or two percent 547 00:24:54,960 --> 00:24:57,359 Speaker 1: more efficiency maybe. So the basic idea, it's a poco 548 00:24:57,400 --> 00:25:00,320 Speaker 1: stick rocket up some height I don't know, let's say 549 00:25:00,320 --> 00:25:04,240 Speaker 1: a hundred feet, You drop it, it bounces. Now you've 550 00:25:04,240 --> 00:25:06,480 Speaker 1: got some speed, uh, and off you go to space. 551 00:25:08,080 --> 00:25:10,560 Speaker 1: And so there might be some people who are thinking, 552 00:25:10,600 --> 00:25:12,120 Speaker 1: I don't see how the physics of this checks out. 
553 00:25:12,400 --> 00:25:15,679 Speaker 1: The way to think about it is just, um, you know, 554 00:25:15,840 --> 00:25:18,399 Speaker 1: you're imparting some of your energy just by lifting it 555 00:25:18,480 --> 00:25:20,240 Speaker 1: up and then dropping it, and so it's it's a 556 00:25:20,280 --> 00:25:24,199 Speaker 1: more efficient use of energy than having to burn the 557 00:25:24,200 --> 00:25:26,520 Speaker 1: fuel in the rocket to move the rocket. We have 558 00:25:26,600 --> 00:25:28,399 Speaker 1: we have some sort of analogies for for how to 559 00:25:28,440 --> 00:25:31,600 Speaker 1: explain that in the book. But but, but the basic 560 00:25:31,640 --> 00:25:33,320 Speaker 1: idea is, as long as you're burning fuel, it's a 561 00:25:33,320 --> 00:25:34,879 Speaker 1: little more fuel, it's bad or is it is a 562 00:25:34,920 --> 00:25:36,679 Speaker 1: simple way to think about it. The best way to 563 00:25:36,680 --> 00:25:39,040 Speaker 1: fly to space would be a magic pixie dropped a 564 00:25:39,160 --> 00:25:41,680 Speaker 1: drop of fuel in the tank every time you needed it, 565 00:25:41,960 --> 00:25:44,320 Speaker 1: so you'd only be looking stuff, so that the popo 566 00:25:44,400 --> 00:25:47,679 Speaker 1: gets you just a nudge towards that. Um and at 567 00:25:47,760 --> 00:25:48,879 Speaker 1: least a card of the guy we talked to it. 568 00:25:48,880 --> 00:25:52,359 Speaker 1: And now it would work in the sense of physics. Uh, 569 00:25:52,280 --> 00:25:57,800 Speaker 1: whether I said I just like rocket just bouncing slowly 570 00:25:58,080 --> 00:26:01,479 Speaker 1: getting high off back, yeah, I'd be like t minus 571 00:26:01,880 --> 00:26:05,639 Speaker 1: you know one second to pogo drop. But I do 572 00:26:05,840 --> 00:26:08,520 Speaker 1: understanding solutions aren't tried because it went wrong, you'd look 573 00:26:08,600 --> 00:26:12,880 Speaker 1: like such an idiot. Um So, but yeah, that's that's 574 00:26:12,880 --> 00:26:15,760 Speaker 1: the pogo stick method. More with Zach Weener Smith right 575 00:26:15,800 --> 00:26:31,800 Speaker 1: after this break. One last question for you. Your book 576 00:26:31,840 --> 00:26:34,199 Speaker 1: is so good to encover so many topics from you know, 577 00:26:34,320 --> 00:26:37,320 Speaker 1: synthetic biology through d printing for food, like like there's 578 00:26:37,359 --> 00:26:39,480 Speaker 1: there's just so much in their asteroid mining, which we 579 00:26:39,480 --> 00:26:41,439 Speaker 1: didn't even get to, which is really great. But um, 580 00:26:41,760 --> 00:26:43,600 Speaker 1: but I did want to ask you about this one 581 00:26:43,640 --> 00:26:46,399 Speaker 1: experiment that you guys found on the effects of breathing 582 00:26:46,440 --> 00:26:53,359 Speaker 1: through one nostril. Yes, talk about that. Yeah, if I correctly, 583 00:26:53,359 --> 00:26:56,639 Speaker 1: there was an acronym. It was like unilateral force nose 584 00:26:56,680 --> 00:27:01,320 Speaker 1: breathing something like that. You you and B or so anyway, 585 00:27:01,400 --> 00:27:04,800 Speaker 1: So uh yeah, so let me set the table a 586 00:27:04,800 --> 00:27:06,760 Speaker 1: little for that. So we we did the chapter I 587 00:27:06,800 --> 00:27:08,600 Speaker 1: think this is in the chapter and augmented reality. 
And 588 00:27:08,600 --> 00:27:09,720 Speaker 1: so in the book we have these things called nota 589 00:27:09,800 --> 00:27:12,360 Speaker 1: benes, which are generally just little tidbits of weirdness 590 00:27:12,400 --> 00:27:14,520 Speaker 1: we came across that had nothing to do with anything 591 00:27:14,520 --> 00:27:16,240 Speaker 1: or just didn't fit the chapter, but that we still 592 00:27:16,240 --> 00:27:19,480 Speaker 1: wanted to write about. And so this one was on, 593 00:27:20,440 --> 00:27:23,159 Speaker 1: uh, noses. And the reason it relates to augmented reality 594 00:27:23,200 --> 00:27:25,000 Speaker 1: is when you put on an augmented reality helmet, one 595 00:27:25,040 --> 00:27:27,160 Speaker 1: of the obvious things it has to do is offset 596 00:27:27,160 --> 00:27:28,800 Speaker 1: the image you see in each eye in order to 597 00:27:28,840 --> 00:27:31,240 Speaker 1: convey a three-D environment, right, because of course in 598 00:27:31,280 --> 00:27:33,840 Speaker 1: the real world, your eyes are offset from each other. Uh. 599 00:27:33,880 --> 00:27:36,320 Speaker 1: And so actually, if you had 600 00:27:36,320 --> 00:27:38,560 Speaker 1: an augmented reality system for the ears, you'd have to 601 00:27:38,600 --> 00:27:40,560 Speaker 1: do the same thing. You have to offset sounds a 602 00:27:40,600 --> 00:27:43,280 Speaker 1: little, like so you know something's to your right because 603 00:27:43,320 --> 00:27:45,040 Speaker 1: you hear the sound from it a little earlier in 604 00:27:45,080 --> 00:27:47,560 Speaker 1: the right ear. And so we thought, well, let's see, 605 00:27:48,160 --> 00:27:50,280 Speaker 1: you've got two eyes, you've got two ears. The only 606 00:27:50,280 --> 00:27:51,840 Speaker 1: other thing on your face you have two of is 607 00:27:51,880 --> 00:27:55,399 Speaker 1: your nostrils. And so can you sort of triangulate with smell? 608 00:27:55,560 --> 00:27:58,080 Speaker 1: And we realized neither of us had an answer to that, so 609 00:27:58,119 --> 00:28:00,640 Speaker 1: we started like, how can we explore this question? And 610 00:28:00,640 --> 00:28:02,880 Speaker 1: it turned out Kelly happened to know a guy 611 00:28:03,320 --> 00:28:05,960 Speaker 1: who worked on snakes, and it's a sort of similar question. 612 00:28:06,000 --> 00:28:08,440 Speaker 1: Why do snakes have forked tongues? And it turns out 613 00:28:08,560 --> 00:28:11,000 Speaker 1: snakes and a number of other species might have the 614 00:28:11,000 --> 00:28:13,120 Speaker 1: ability to do something like that, to say like, I'm 615 00:28:13,119 --> 00:28:15,320 Speaker 1: getting a little more chemical on this side of the fork, 616 00:28:15,720 --> 00:28:18,000 Speaker 1: therefore I should go this way when I'm trailing 617 00:28:18,000 --> 00:28:20,480 Speaker 1: a rat. Uh. And so then our question was, can 618 00:28:20,560 --> 00:28:22,639 Speaker 1: humans do this? And it basically turns out the answer 619 00:28:22,720 --> 00:28:27,040 Speaker 1: is probably no. Very sadly, you probably cannot tell where 620 00:28:27,040 --> 00:28:29,080 Speaker 1: the bad smell in the room is by, you know, using 621 00:28:29,280 --> 00:28:31,840 Speaker 1: the offset of your nostrils. Um, so then the question 622 00:28:31,840 --> 00:28:33,600 Speaker 1: we had was, then why do you have two nostrils 623 00:28:33,640 --> 00:28:37,399 Speaker 1: and not one big, giant, awesome nostril? Um.
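(For the ear-offset remark above, knowing something is to your right because the sound arrives a little earlier in the right ear, here is a minimal arithmetic sketch. The head width, source angle, and resulting numbers are illustrative assumptions, not figures from the book.)

```python
import math

# Rough interaural-time-difference illustration for the "a little earlier
# in the right ear" point above. All numbers are illustrative assumptions.
SPEED_OF_SOUND = 343.0    # m/s in air at room temperature
HEAD_WIDTH = 0.18         # m, rough ear-to-ear distance
angle = math.radians(45)  # sound source 45 degrees to your right

# Simple far-field approximation: extra path length to the far ear,
# then the corresponding arrival-time difference.
extra_path = HEAD_WIDTH * math.sin(angle)
delay_s = extra_path / SPEED_OF_SOUND

print(f"{delay_s * 1e6:.0f} microseconds")  # roughly 370 microseconds
# An AR audio system would have to reproduce offsets of about this size
# (well under a millisecond) to place a sound convincingly off to one side.
```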
And it 624 00:28:37,400 --> 00:28:40,560 Speaker 1: turns out the best argument is, essentially, your nostril 625 00:28:40,680 --> 00:28:43,560 Speaker 1: is, this sort of, um, you have this 626 00:28:43,640 --> 00:28:47,640 Speaker 1: mucous-lined system that protects you from pathogens and other stuff, 627 00:28:47,640 --> 00:28:50,040 Speaker 1: and it has to be kept moist in order to 628 00:28:50,040 --> 00:28:52,520 Speaker 1: do its job. And so, you don't realize it, 629 00:28:52,640 --> 00:28:57,480 Speaker 1: but you constantly have one nostril that's dominant, quote unquote; you 630 00:28:57,480 --> 00:29:00,840 Speaker 1: have one dominant nostril that has this engorged tissue, that's 631 00:29:00,880 --> 00:29:03,920 Speaker 1: the term in the field, engorged, uh, um, in 632 00:29:04,000 --> 00:29:06,920 Speaker 1: one nostril. So one side is essentially getting 633 00:29:06,920 --> 00:29:10,400 Speaker 1: a break. Uh, and um, you really notice this, by 634 00:29:10,440 --> 00:29:11,760 Speaker 1: the way, when you have a cold, you know, 635 00:29:11,800 --> 00:29:15,560 Speaker 1: when one nostril is just not doing anything. Um, 636 00:29:15,640 --> 00:29:20,360 Speaker 1: and, uh, so, um, so what does this have to 637 00:29:20,360 --> 00:29:23,840 Speaker 1: do with the abuse of undergrads? Uh, so, it turns out 638 00:29:23,840 --> 00:29:27,120 Speaker 1: there's some weirdness that goes with this. There's some evidence, 639 00:29:27,120 --> 00:29:30,520 Speaker 1: which I would say we're pretty skeptical of, that when you 640 00:29:30,640 --> 00:29:33,880 Speaker 1: force people to use their non-dominant nostril, they perform 641 00:29:33,960 --> 00:29:37,200 Speaker 1: worse in all sorts of ways. Uh, and so, um, 642 00:29:37,520 --> 00:29:39,800 Speaker 1: so apparently this was done to undergrads. There 643 00:29:39,800 --> 00:29:43,080 Speaker 1: are these tests that do unilateral forced, I think it's 644 00:29:43,160 --> 00:29:45,600 Speaker 1: nostril breathing or nose breathing, anyway it's in the book, where 645 00:29:45,640 --> 00:29:48,880 Speaker 1: you basically compel students to breathe through the wrong nostril 646 00:29:49,200 --> 00:29:51,959 Speaker 1: and do stuff, like, I think it was like, 647 00:29:52,000 --> 00:29:54,320 Speaker 1: you know, just like intelligence tests. And there's even 648 00:29:54,360 --> 00:29:57,320 Speaker 1: some argument that there's some correlation with like schizophrenia, uh, 649 00:29:57,960 --> 00:30:00,760 Speaker 1: and using the wrong nostril. Yeah, it's weird. Um, 650 00:30:00,800 --> 00:30:03,640 Speaker 1: I'm pretty skeptical of any of it, but I just 651 00:30:03,720 --> 00:30:06,600 Speaker 1: love the idea that, you know, someone was like, hey, 652 00:30:06,600 --> 00:30:08,240 Speaker 1: we need undergrads for a test. You're like, oh, I'm 653 00:30:08,240 --> 00:30:10,080 Speaker 1: gonna come in and take a test, and it's like, okay, 654 00:30:10,240 --> 00:30:12,400 Speaker 1: breathe through the nostril you don't feel like breathing through 655 00:30:12,520 --> 00:30:15,920 Speaker 1: and answer questions. That's what they're good for. Right, 656 00:30:16,120 --> 00:30:18,120 Speaker 1: breathe out of that nostril and now go get on 657 00:30:18,120 --> 00:30:21,400 Speaker 1: that pogo stick, we're sending you up. That's right, and we'll see 658 00:30:21,400 --> 00:30:25,520 Speaker 1: what happens.
Yeah, before we let you go, I do 659 00:30:25,600 --> 00:30:27,920 Speaker 1: have to know, after all of this research for this book, 660 00:30:28,040 --> 00:30:30,520 Speaker 1: and by the way, the amount of research and the 661 00:30:30,560 --> 00:30:33,400 Speaker 1: amount of fact checking that must have gone into this 662 00:30:33,440 --> 00:30:39,320 Speaker 1: project was just tremendous and really impressive to see. I 663 00:30:39,400 --> 00:30:43,480 Speaker 1: can only imagine. But I'm curious, after putting this together, 664 00:30:43,880 --> 00:30:46,640 Speaker 1: you know, what about the future excites you the most? 665 00:30:46,680 --> 00:30:49,080 Speaker 1: And then on the flip side of that, what terrifies 666 00:30:49,160 --> 00:30:51,520 Speaker 1: you the most now that you've, you know, dug 667 00:30:51,520 --> 00:30:55,160 Speaker 1: into so much of this? Sure. Um, I would say 668 00:30:56,000 --> 00:31:00,800 Speaker 1: the technology that is in some ways less sexy or 669 00:31:00,840 --> 00:31:02,920 Speaker 1: spacey than the other technologies, but that is really exciting, 670 00:31:03,080 --> 00:31:06,600 Speaker 1: is what's called, um, precision medicine. Uh, we have a chapter 671 00:31:06,680 --> 00:31:08,280 Speaker 1: on this in the book. And the basic idea is you 672 00:31:08,320 --> 00:31:11,200 Speaker 1: just have a sort of stats approach to medicine. Um, 673 00:31:11,280 --> 00:31:12,959 Speaker 1: so maybe some of your listeners have heard of all 674 00:31:13,000 --> 00:31:15,440 Speaker 1: these blood tests now coming out that can detect cancer early, 675 00:31:15,520 --> 00:31:18,080 Speaker 1: maybe heart attack early, all sorts of stuff. So the 676 00:31:18,160 --> 00:31:20,480 Speaker 1: idea is maybe in thirty or forty years there will 677 00:31:20,480 --> 00:31:22,640 Speaker 1: be a new system where essentially, when you go to 678 00:31:22,680 --> 00:31:25,160 Speaker 1: the doctor, instead of just the basic stuff, 679 00:31:25,160 --> 00:31:26,880 Speaker 1: they'll still do the basic stuff, blood pressure and 680 00:31:26,920 --> 00:31:29,360 Speaker 1: talking to your doctor, etcetera, are you exercising enough, but 681 00:31:29,360 --> 00:31:31,240 Speaker 1: they'll also take a blood test that will be run 682 00:31:31,280 --> 00:31:34,040 Speaker 1: through a fancy computer, and it'll spit out a 683 00:31:34,080 --> 00:31:36,400 Speaker 1: lot of results. And what's exciting about that is, 684 00:31:37,400 --> 00:31:40,560 Speaker 1: potentially it means you drastically drop healthcare costs. A lot 685 00:31:40,560 --> 00:31:43,800 Speaker 1: of healthcare costs have to do with unprevented stuff, um, 686 00:31:43,840 --> 00:31:46,400 Speaker 1: discovering cancers late, that sort of thing. Not to mention, 687 00:31:46,400 --> 00:31:48,520 Speaker 1: you know, a lot of, like, a lot of why, 688 00:31:48,560 --> 00:31:50,920 Speaker 1: for instance, pancreatic cancer is dangerous is because you don't 689 00:31:50,920 --> 00:31:53,560 Speaker 1: detect it early. Um, and so stuff like that, like 690 00:31:53,800 --> 00:31:57,120 Speaker 1: subtle, hard-to-detect cancers you could detect early, would 691 00:31:57,120 --> 00:31:58,800 Speaker 1: save a lot of money, but it would also obviously 692 00:31:58,800 --> 00:32:01,800 Speaker 1: save a lot of, you know, horror, a lot of 693 00:32:01,800 --> 00:32:04,720 Speaker 1: badness in human life. Um.
And so to me, that's 694 00:32:04,800 --> 00:32:07,840 Speaker 1: really exciting, especially because it kind of comes at 695 00:32:07,840 --> 00:32:10,400 Speaker 1: a time when, especially in the US, but also this 696 00:32:10,480 --> 00:32:12,680 Speaker 1: is happening in other countries, where healthcare costs are really 697 00:32:12,680 --> 00:32:15,760 Speaker 1: getting high and it's not exactly clear how the system 698 00:32:15,800 --> 00:32:18,640 Speaker 1: can stay solvent over the long term, this could be 699 00:32:18,680 --> 00:32:20,440 Speaker 1: a way out. It might take us in the other 700 00:32:20,440 --> 00:32:22,080 Speaker 1: direction in the short term, but in the long term, 701 00:32:22,280 --> 00:32:24,160 Speaker 1: you know, you can imagine a paradigm where it's like 702 00:32:24,160 --> 00:32:26,400 Speaker 1: Star Trek, where someone waves a little wand over you. 703 00:32:26,800 --> 00:32:28,640 Speaker 1: More likely the wand is like poking you and taking 704 00:32:28,640 --> 00:32:33,080 Speaker 1: tissue, but still, uh, and it just tells you, hey, 705 00:32:33,480 --> 00:32:35,360 Speaker 1: and this is apparently literally possible, it might 706 00:32:35,360 --> 00:32:37,280 Speaker 1: be able to say, hey, you're headed for a heart attack. 707 00:32:37,320 --> 00:32:39,320 Speaker 1: You have dying heart tissue. If this goes on for 708 00:32:39,360 --> 00:32:41,440 Speaker 1: three more days, you're gonna be in here with a heart attack. 709 00:32:41,680 --> 00:32:43,800 Speaker 1: And so instead of, you know, having it in your 710 00:32:43,800 --> 00:32:45,800 Speaker 1: car or at work or whatever and having to be 711 00:32:45,800 --> 00:32:47,640 Speaker 1: rushed to the hospital, you come in early and they 712 00:32:47,640 --> 00:32:50,160 Speaker 1: take care of it. Um, the amount of preventative 713 00:32:50,200 --> 00:32:53,760 Speaker 1: medicine that could be done is amazing. Um, so 714 00:32:53,840 --> 00:32:55,760 Speaker 1: I'll just say that. I mean, on a visceral level, 715 00:32:55,800 --> 00:32:59,440 Speaker 1: I'm, I suppose, more excited about like the space stuff. Um, 716 00:32:59,480 --> 00:33:02,440 Speaker 1: but that's really, that's really sort of, I don't 717 00:33:02,440 --> 00:33:04,320 Speaker 1: know if you've ever known someone who's died of 718 00:33:04,640 --> 00:33:07,000 Speaker 1: cancer or some sort of preventable illness, especially at 719 00:33:07,040 --> 00:33:09,160 Speaker 1: a young age, it's very sort of moving to think about 720 00:33:09,440 --> 00:33:13,680 Speaker 1: being able to do these things. Um. In terms of scary, 721 00:33:13,880 --> 00:33:16,480 Speaker 1: I would say far and away the most scary one 722 00:33:16,600 --> 00:33:18,840 Speaker 1: for me is, we do have a chapter on brain computer interfaces, 723 00:33:18,840 --> 00:33:22,640 Speaker 1: which again is what it sounds like. Uh, and what 724 00:33:22,760 --> 00:33:24,720 Speaker 1: scares me a little, so let me give you a 725 00:33:24,760 --> 00:33:27,280 Speaker 1: scenario we read about that was proposed as a positive, 726 00:33:28,160 --> 00:33:30,920 Speaker 1: which is something like this. Suppose you're on the job, uh, 727 00:33:31,120 --> 00:33:34,160 Speaker 1: and you lapse in focus. Well, that's bad for your job. 728 00:33:34,640 --> 00:33:36,360 Speaker 1: But suppose you have a brain computer interface that 729 00:33:36,440 --> 00:33:38,920 Speaker 1: detects that you're not focusing and does something.
Maybe it 730 00:33:38,960 --> 00:33:42,120 Speaker 1: tells your boss, Bob isn't focusing. Maybe it just gives 731 00:33:42,120 --> 00:33:45,920 Speaker 1: you a little something, a little, uh, little shock or 732 00:33:46,040 --> 00:33:48,520 Speaker 1: a dose of a chemical or something, and then you focus again. Now, 733 00:33:48,520 --> 00:33:50,840 Speaker 1: of course that sounds instantly dystopian the way 734 00:33:50,840 --> 00:33:52,880 Speaker 1: I'm describing it. There is a plausible good version, which 735 00:33:52,880 --> 00:33:55,240 Speaker 1: is something like, if you're, you know, you don't want 736 00:33:55,240 --> 00:33:57,400 Speaker 1: that on the job. You might want the guy piloting 737 00:33:57,440 --> 00:34:01,480 Speaker 1: your plane to do that. You might want a surgeon 738 00:34:01,600 --> 00:34:04,840 Speaker 1: to do that. There are situations where it's reasonable. Um, 739 00:34:04,880 --> 00:34:06,920 Speaker 1: but the scary thing is, once the cat's out 740 00:34:06,920 --> 00:34:09,640 Speaker 1: of the bag, like once this is a possible thing, um, 741 00:34:09,760 --> 00:34:11,880 Speaker 1: then there's a sort of race to the bottom, or 742 00:34:12,040 --> 00:34:14,799 Speaker 1: race to the cyborg, however you want to say it. Uh, 743 00:34:15,320 --> 00:34:18,200 Speaker 1: it's something like, um, so, I don't know, I assume 744 00:34:18,400 --> 00:34:21,040 Speaker 1: y'all are either in academia or have crossed swords with it. Um, 745 00:34:21,080 --> 00:34:23,080 Speaker 1: you know, you're familiar with that world. You may 746 00:34:23,120 --> 00:34:26,520 Speaker 1: not know that on surveys, something like a fifth to a 747 00:34:26,560 --> 00:34:30,000 Speaker 1: quarter of elite American researchers will admit to using mind 748 00:34:30,080 --> 00:34:32,560 Speaker 1: enhancing drugs. And I don't mean like they're dropping acid. 749 00:34:32,600 --> 00:34:34,799 Speaker 1: I mean like they're taking Adderall or modafinil or 750 00:34:34,800 --> 00:34:36,759 Speaker 1: what have you, or 751 00:34:36,800 --> 00:34:39,360 Speaker 1: illegal stuff like cocaine or what have you, 752 00:34:39,400 --> 00:34:41,560 Speaker 1: maybe crack, I don't know, um, to be able 753 00:34:41,600 --> 00:34:44,160 Speaker 1: to put out more papers, uh, you know, sleep less, etcetera. 754 00:34:44,480 --> 00:34:46,839 Speaker 1: I think drugs are sort of morally neutral, but there 755 00:34:46,960 --> 00:34:49,640 Speaker 1: is an externality problem, which is, if the top researchers 756 00:34:49,640 --> 00:34:52,839 Speaker 1: are all on crack, and presuming crack actually helps, by 757 00:34:52,840 --> 00:34:55,720 Speaker 1: the way, these are open questions, but 758 00:34:55,800 --> 00:34:58,200 Speaker 1: I think it's probably possible that it does, especially if 759 00:34:58,200 --> 00:35:00,000 Speaker 1: you're like a postdoc or something and you have a 760 00:35:00,040 --> 00:35:01,759 Speaker 1: narrow time band to do a lot of work, there's 761 00:35:01,760 --> 00:35:03,640 Speaker 1: an externality, which is it means that people who aren't 762 00:35:03,680 --> 00:35:06,319 Speaker 1: willing to do this stuff, um, are put in a 763 00:35:06,320 --> 00:35:09,399 Speaker 1: bad position.
Uh, and so I feel 764 00:35:09,400 --> 00:35:10,719 Speaker 1: like most of us kind of don't care if it's happening with 765 00:35:10,760 --> 00:35:12,600 Speaker 1: researchers, because there's an extent to which you're like, well, 766 00:35:12,640 --> 00:35:16,000 Speaker 1: I get more science, uh, so whatever, I don't care 767 00:35:16,000 --> 00:35:18,360 Speaker 1: that much. And also, anyway, they're already elite people, so whatever. 768 00:35:18,719 --> 00:35:20,799 Speaker 1: But you can imagine a situation where this goes all 769 00:35:20,800 --> 00:35:23,920 Speaker 1: the way down. Like, you know, the worker who is 770 00:35:23,960 --> 00:35:27,719 Speaker 1: willing to release their data to an employer, um, and 771 00:35:27,760 --> 00:35:31,480 Speaker 1: allow the employer to manipulate that data, is more employable. Um, 772 00:35:31,560 --> 00:35:33,480 Speaker 1: so do we get into a situation where everyone has 773 00:35:33,480 --> 00:35:35,440 Speaker 1: to be wearing one of these and publicly sharing their 774 00:35:35,520 --> 00:35:38,800 Speaker 1: data, or at least maybe not publicly, but sharing it 775 00:35:38,840 --> 00:35:40,640 Speaker 1: with employers or what have you? There's a really 776 00:35:40,680 --> 00:35:44,200 Speaker 1: ominous scenario where we're all, you know, being forced 777 00:35:44,200 --> 00:35:46,279 Speaker 1: to become cyborgs whether we want to or not, 778 00:35:46,840 --> 00:35:48,799 Speaker 1: you know, over the next hundred years, say, not 779 00:35:48,800 --> 00:35:51,640 Speaker 1: anytime soon. Yeah. Well, to make sure that we 780 00:35:51,719 --> 00:35:54,160 Speaker 1: finish on a positive note, maybe we should just, we 781 00:35:54,200 --> 00:35:59,839 Speaker 1: should just repeat the term pogo stick to space. Let's 782 00:35:59,840 --> 00:36:01,880 Speaker 1: just, let's just maybe close with that. What do you 783 00:36:01,880 --> 00:36:06,120 Speaker 1: think? I think pogo stick to space. I should have 784 00:36:06,160 --> 00:36:08,759 Speaker 1: done the precision medicine thing second; we had the two in 785 00:36:08,800 --> 00:36:10,520 Speaker 1: the wrong order. But this book, I know we've 786 00:36:10,520 --> 00:36:13,520 Speaker 1: already said it, it is so fascinating and so fun 787 00:36:13,560 --> 00:36:15,359 Speaker 1: to read. I hope all of our listeners will check 788 00:36:15,360 --> 00:36:18,239 Speaker 1: it out. It's called Soonish: Ten Emerging Technologies That'll 789 00:36:18,280 --> 00:36:21,880 Speaker 1: Improve and/or Ruin Everything. But Zach, thanks so 790 00:36:21,960 --> 00:36:23,920 Speaker 1: much for joining us today. Thanks for having me, it was a lot 791 00:36:23,920 --> 00:36:26,000 Speaker 1: of fun. That's it for today's show, but be sure 792 00:36:26,040 --> 00:36:28,759 Speaker 1: to check out Kelly and Zach Weinersmith's new book Soonish, 793 00:36:28,760 --> 00:36:31,560 Speaker 1: which is on shelves everywhere right now. And if you 794 00:36:31,600 --> 00:36:34,000 Speaker 1: like the program or just have anything to add, remember 795 00:36:34,040 --> 00:36:36,160 Speaker 1: you can always hit us up on Facebook or Twitter. 796 00:36:36,360 --> 00:36:38,400 Speaker 1: You can email us at Part Time Genius at How Stuff 797 00:36:38,400 --> 00:36:41,839 Speaker 1: Works dot com, or hit us up on our fact hotline. 798 00:36:41,880 --> 00:36:58,840 Speaker 1: That's one eight four four PT GENIUS. Thanks again for listening.
799 00:36:58,960 --> 00:37:01,160 Speaker 1: Part Time Genius is a production of How Stuff Works 800 00:37:01,160 --> 00:37:03,719 Speaker 1: and wouldn't be possible without several brilliant people who do 801 00:37:03,800 --> 00:37:06,880 Speaker 1: the important things we couldn't even begin to understand. Tristan 802 00:37:06,960 --> 00:37:09,440 Speaker 1: McNeil does the editing thing. Noel Brown made the theme 803 00:37:09,520 --> 00:37:12,480 Speaker 1: song and does the mixy mixy sound thing. Gerry Rowland 804 00:37:12,520 --> 00:37:15,719 Speaker 1: does the exec producer thing. Gabe Loesier is our lead researcher, 805 00:37:15,760 --> 00:37:18,720 Speaker 1: with support from the Research Army including Austin Thompson, Nolan 806 00:37:18,760 --> 00:37:21,040 Speaker 1: Brown, and Lucas Adams, and Eves Jeffcoat gets the 807 00:37:21,080 --> 00:37:23,239 Speaker 1: show to your ears. Good job, Eves. If you like 808 00:37:23,320 --> 00:37:25,160 Speaker 1: what you heard, we hope you'll subscribe. And if you 809 00:37:25,200 --> 00:37:27,160 Speaker 1: really, really like what you've heard, maybe you could leave 810 00:37:27,200 --> 00:37:30,240 Speaker 1: a good review for us. And could we forget Jason, 811 00:37:30,239 --> 00:37:30,399 Speaker 1: who