1 00:00:07,400 --> 00:00:09,840 Speaker 1: So sometimes when I look around me, I'm amazed at 2 00:00:09,840 --> 00:00:12,080 Speaker 1: how many tech devices I have in my life. I mean, 3 00:00:12,160 --> 00:00:14,520 Speaker 1: of course I have a laptop, I have a telephone, 4 00:00:15,120 --> 00:00:17,120 Speaker 1: you know, I have a TV, this kind of stuff. 5 00:00:17,120 --> 00:00:19,159 Speaker 1: But just all around me, in my kitchen there's a 6 00:00:19,200 --> 00:00:22,560 Speaker 1: bunch of stuff that my grandparents and their grandparents wouldn't 7 00:00:22,600 --> 00:00:26,000 Speaker 1: even recognize. Your grandparents didn't have the George Foreman grill. 8 00:00:26,160 --> 00:00:29,440 Speaker 1: What do you mean? That's right, the panini maker. I mean, 9 00:00:29,520 --> 00:00:32,440 Speaker 1: it's a fundamental element of human society. Now, how did 10 00:00:32,440 --> 00:00:36,199 Speaker 1: people live so long without paninis? How did they live without paninis? It's 11 00:00:36,280 --> 00:00:39,479 Speaker 1: baffling to me. It's baffling. You know, I just 12 00:00:39,520 --> 00:00:42,400 Speaker 1: feel like so many elements of my life rely on 13 00:00:42,560 --> 00:00:45,559 Speaker 1: inventions that have appeared fairly recently, which means that my 14 00:00:45,600 --> 00:00:48,520 Speaker 1: life is completely different from the life of my grandparents 15 00:00:48,520 --> 00:00:51,239 Speaker 1: and their grandparents. Yeah. I guess maybe one way 16 00:00:51,280 --> 00:00:53,800 Speaker 1: to think about it is: look around you and think 17 00:00:53,960 --> 00:00:57,640 Speaker 1: which technology, if it wasn't there, would make my 18 00:00:57,720 --> 00:01:00,920 Speaker 1: life totally different.
Yeah. And I think the most important 19 00:01:00,960 --> 00:01:03,080 Speaker 1: invention might not be something that you notice when you 20 00:01:03,120 --> 00:01:05,760 Speaker 1: just look around. It might not be something that throws 21 00:01:05,800 --> 00:01:08,040 Speaker 1: itself in your face every single day of your life. 22 00:01:08,240 --> 00:01:10,160 Speaker 1: It could be something you use every day and not 23 00:01:10,360 --> 00:01:12,600 Speaker 1: even think about. Are you talking about the toilet? 24 00:01:15,319 --> 00:01:21,200 Speaker 1: Maybe I'm using it right now. Taking this podcast out 25 00:01:21,200 --> 00:01:40,880 Speaker 1: of the studio and into the toilet. Hi, I'm Jorge and I'm Daniel, 26 00:01:41,080 --> 00:01:44,400 Speaker 1: and this is our podcast, the greatest invention in the universe: 27 00:01:45,000 --> 00:01:49,760 Speaker 1: the podcast called Daniel and Jorge Invent the Universe. Inventing 28 00:01:49,760 --> 00:01:53,280 Speaker 1: a new title for the podcast on the spot. No, the 29 00:01:53,280 --> 00:01:56,720 Speaker 1: podcast is called Daniel and Jorge Explain the Universe, in 30 00:01:56,760 --> 00:01:59,200 Speaker 1: which we take everything in the universe and explain to 31 00:01:59,240 --> 00:02:01,800 Speaker 1: you how it was invented or discovered, or at least 32 00:02:02,000 --> 00:02:04,840 Speaker 1: understood, in a way that you can possibly understand it 33 00:02:04,840 --> 00:02:06,400 Speaker 1: and explain it to somebody else, and then you can 34 00:02:06,440 --> 00:02:09,480 Speaker 1: tell them about the awesome podcast you heard it on. 35 00:02:09,680 --> 00:02:12,080 Speaker 1: And maybe using this knowledge, you can go out there 36 00:02:12,120 --> 00:02:15,240 Speaker 1: and invent new things. That's right. And give us one 37 00:02:15,280 --> 00:02:18,760 Speaker 1: percent of all your royalties. That's right.
By listening, you 38 00:02:18,760 --> 00:02:23,120 Speaker 1: are implicitly signing a contract. That's right. Oh, maybe we 39 00:02:23,160 --> 00:02:26,040 Speaker 1: need terms and conditions on this podcast. Yeah, so welcome 40 00:02:26,040 --> 00:02:32,440 Speaker 1: to our podcast, everyone. And the topic today: what is 41 00:02:32,480 --> 00:02:35,520 Speaker 1: the most important invention in human history? Yeah, of all 42 00:02:35,560 --> 00:02:38,200 Speaker 1: the things that have been invented to make life easier, 43 00:02:38,680 --> 00:02:42,000 Speaker 1: more fun, maybe more violent, what do you think is 44 00:02:42,040 --> 00:02:46,080 Speaker 1: the one invention that has had the most impact on 45 00:02:46,240 --> 00:02:50,160 Speaker 1: human civilization or the human condition? That's right. And you 46 00:02:50,240 --> 00:02:53,120 Speaker 1: might be tempted to think about something which is very immediate, 47 00:02:53,200 --> 00:02:55,600 Speaker 1: like, well, I use my phone every day, and so 48 00:02:55,680 --> 00:02:58,679 Speaker 1: the iPhone is a very impactful invention. But think a 49 00:02:58,720 --> 00:03:00,320 Speaker 1: little deeper. I mean, think about the thing which, 50 00:03:00,360 --> 00:03:02,720 Speaker 1: if we didn't have it, would change human history. And 51 00:03:02,720 --> 00:03:04,960 Speaker 1: so I think that's, I'm tipping my hand here, that's 52 00:03:05,040 --> 00:03:07,760 Speaker 1: my definition for what the most important invention is: the 53 00:03:07,800 --> 00:03:12,280 Speaker 1: invention without which human history would be markedly different. 54 00:03:12,600 --> 00:03:14,760 Speaker 1: And I love thinking about the way that history could 55 00:03:14,760 --> 00:03:16,440 Speaker 1: have been different. Yeah, I mean, let's talk about that. 56 00:03:16,520 --> 00:03:18,760 Speaker 1: When you say what's the most important invention, what do 57 00:03:18,760 --> 00:03:21,800 Speaker 1: you mean?
And so for you, it's about changing history, 58 00:03:22,040 --> 00:03:26,440 Speaker 1: like the invention that marked the point in time at 59 00:03:26,480 --> 00:03:29,240 Speaker 1: which humanity could have gone left or right. Exactly. And 60 00:03:29,320 --> 00:03:31,920 Speaker 1: some of these inventions, they really come from like moments 61 00:03:31,960 --> 00:03:36,000 Speaker 1: of inspiration or, you know, just accidents. 62 00:03:36,040 --> 00:03:38,680 Speaker 1: You know, somebody accidentally invents something in the lab. They're 63 00:03:38,960 --> 00:03:41,160 Speaker 1: trying to make a better peanut butter sandwich, but they 64 00:03:41,160 --> 00:03:44,160 Speaker 1: actually invent a laser gun or something, right? And this 65 00:03:44,280 --> 00:03:46,200 Speaker 1: kind of thing happens all the time, and it changes 66 00:03:46,240 --> 00:03:49,000 Speaker 1: the course of human history. And so I wonder sometimes, 67 00:03:49,040 --> 00:03:51,160 Speaker 1: like, if that person had been sick that day, or 68 00:03:51,160 --> 00:03:53,760 Speaker 1: gotten in a car accident, or decided to become an artist 69 00:03:53,840 --> 00:03:56,600 Speaker 1: instead of an engineer, how human history could have been different. 70 00:03:56,640 --> 00:03:59,440 Speaker 1: And so there are these moments, these pivot points in history, 71 00:03:59,880 --> 00:04:02,680 Speaker 1: where I feel like if things hadn't gone a certain way, 72 00:04:03,240 --> 00:04:06,920 Speaker 1: the whole future could have been dramatically different. And inventions 73 00:04:06,920 --> 00:04:08,480 Speaker 1: are one of those. And so I like to think 74 00:04:08,480 --> 00:04:11,440 Speaker 1: about if you had deleted one person from history or 75 00:04:11,600 --> 00:04:14,000 Speaker 1: distracted them in the right moment, things could have been 76 00:04:14,080 --> 00:04:16,560 Speaker 1: very different. Yeah, like talk about a butterfly effect.
Like, 77 00:04:16,600 --> 00:04:20,400 Speaker 1: if a butterfly had flown in front of Einstein just 78 00:04:20,480 --> 00:04:22,520 Speaker 1: as he was about to come up with, you know, 79 00:04:23,160 --> 00:04:25,839 Speaker 1: the idea of relativity or quantum physics, he could have 80 00:04:25,839 --> 00:04:28,920 Speaker 1: been distracted and not come up with it. That's right. Yeah, absolutely. 81 00:04:29,120 --> 00:04:31,680 Speaker 1: And there's lots of real examples. You know, for example, 82 00:04:31,720 --> 00:04:34,800 Speaker 1: Isaac Newton, genius in human history, right? Changed the way 83 00:04:34,839 --> 00:04:38,240 Speaker 1: we think, invented lots of important stuff: physics, gravity, calculus, 84 00:04:38,240 --> 00:04:41,560 Speaker 1: all this stuff. His family were sheep herders, right? And 85 00:04:41,600 --> 00:04:44,159 Speaker 1: the only reason he actually got an education was because 86 00:04:44,200 --> 00:04:47,360 Speaker 1: his father died and his mother remarried somebody who insisted 87 00:04:47,520 --> 00:04:49,880 Speaker 1: he go and get an education, maybe just to get 88 00:04:49,920 --> 00:04:52,320 Speaker 1: young Isaac out of the house, right? And so if 89 00:04:52,360 --> 00:04:55,000 Speaker 1: Isaac hadn't been sent to school, he never would have become 90 00:04:55,040 --> 00:04:58,080 Speaker 1: this staggering genius in human history. Yeah, or even more 91 00:04:58,320 --> 00:05:01,320 Speaker 1: sort of crazy is, supposedly he came up with the 92 00:05:01,320 --> 00:05:04,120 Speaker 1: idea of universal gravitation when an apple 93 00:05:04,120 --> 00:05:08,880 Speaker 1: hit him, right? I think that's probably mythology. But 94 00:05:09,000 --> 00:05:12,360 Speaker 1: let's go with it. You know what would 95 00:05:12,360 --> 00:05:14,720 Speaker 1: have happened if instead of an apple it had been an orange?
96 00:05:14,800 --> 00:05:17,800 Speaker 1: You know, we could have had a totally different science: 97 00:05:18,520 --> 00:05:23,360 Speaker 1: the theory of universal juice instead, right? Orange you glad 98 00:05:23,560 --> 00:05:27,320 Speaker 1: it was an apple? Who invented the first pun? Anyway, 99 00:05:27,360 --> 00:05:29,719 Speaker 1: somebody's got to get credit for that one. That definitely 100 00:05:29,800 --> 00:05:32,320 Speaker 1: changed history. That is the most important invention, for the worse. Yeah. 101 00:05:37,960 --> 00:05:41,279 Speaker 1: So I'm not an expert in the history of technology, and 102 00:05:41,320 --> 00:05:44,240 Speaker 1: I'm guessing that, Jorge, you are not either. And so 103 00:05:44,400 --> 00:05:47,200 Speaker 1: for this particular topic, we decided to reach out to 104 00:05:47,360 --> 00:05:49,640 Speaker 1: somebody we know who is an expert, who has thought 105 00:05:49,760 --> 00:05:52,760 Speaker 1: really deeply about these topics. So this is a good 106 00:05:52,760 --> 00:05:55,800 Speaker 1: friend of mine. His name is Ryan North. So Ryan, 107 00:05:55,839 --> 00:05:59,240 Speaker 1: welcome to the podcast. Thank you for having me. Great. 108 00:05:59,279 --> 00:06:01,919 Speaker 1: Can you tell us a little bit about yourself? Sure. 109 00:06:02,440 --> 00:06:05,479 Speaker 1: My name is Ryan North. I write a web 110 00:06:05,480 --> 00:06:08,960 Speaker 1: comic called Dinosaur Comics, I've been doing that for fifteen years, and 111 00:06:09,080 --> 00:06:11,839 Speaker 1: on top of that, I do nonfiction writing. I write 112 00:06:11,839 --> 00:06:15,039 Speaker 1: The Unbeatable Squirrel Girl for Marvel Comics.
And my 113 00:06:15,160 --> 00:06:18,560 Speaker 1: new book is called How To Invent Everything, a survival 114 00:06:18,560 --> 00:06:21,440 Speaker 1: guide for the stranded time traveler, and it proceeds from 115 00:06:21,440 --> 00:06:23,200 Speaker 1: the premise that if you are in the future, you've 116 00:06:23,200 --> 00:06:25,000 Speaker 1: rented a time machine, you go back in time and 117 00:06:25,040 --> 00:06:28,480 Speaker 1: your time machine breaks, here's how you fix things. Here's 118 00:06:28,480 --> 00:06:31,320 Speaker 1: how you rebuild civilization from scratch in any time period 119 00:06:31,320 --> 00:06:34,920 Speaker 1: in Earth's history. So it's sort of a nonfiction book 120 00:06:34,920 --> 00:06:37,240 Speaker 1: with a fictional candy coating on the outside, which is 121 00:06:37,279 --> 00:06:38,720 Speaker 1: the time travel part of it. I get to call 122 00:06:38,839 --> 00:06:41,799 Speaker 1: it my nonfiction time travel book, which I'm very happy about. 123 00:06:42,600 --> 00:06:44,880 Speaker 1: And was it inspired because you met a stranded time 124 00:06:44,880 --> 00:06:46,960 Speaker 1: traveler and thought, how am I going to help this person? 125 00:06:47,320 --> 00:06:53,000 Speaker 1: I have no comment on that. You are! Yes, I 126 00:06:53,000 --> 00:06:55,360 Speaker 1: always thought of you as a man from the future, Ryan. 127 00:06:55,440 --> 00:06:58,240 Speaker 1: So is that true? Maybe, maybe that's your secret. This 128 00:06:58,279 --> 00:07:02,440 Speaker 1: is your way of disclosing it to the world. Again, I 129 00:07:02,480 --> 00:07:06,520 Speaker 1: don't want to confirm anything. Some mystery, right? Keep them guessing. 130 00:07:07,040 --> 00:07:09,760 Speaker 1: I was really interested in the book because it reminded 131 00:07:09,800 --> 00:07:11,360 Speaker 1: me of when I was a kid.
I was totally 132 00:07:11,480 --> 00:07:15,320 Speaker 1: enamored by that game Civilization, which you also played, right? 133 00:07:15,560 --> 00:07:18,000 Speaker 1: And in Civilization, for those of you who haven't played, you 134 00:07:18,000 --> 00:07:20,480 Speaker 1: have to basically reinvent civilization, and you have to do 135 00:07:20,520 --> 00:07:22,480 Speaker 1: it in order. And for me it was the first 136 00:07:22,560 --> 00:07:27,320 Speaker 1: education where somebody had broken down a lot of humanity's breakthroughs, 137 00:07:27,360 --> 00:07:28,680 Speaker 1: you know, and said, well, what would you need to 138 00:07:28,720 --> 00:07:31,560 Speaker 1: reinvent the combustion engine? Well, you need this, and 139 00:07:31,640 --> 00:07:33,080 Speaker 1: for that you need this, and for that you need this, 140 00:07:33,120 --> 00:07:35,480 Speaker 1: all the way back to numbers and writing and this 141 00:07:35,560 --> 00:07:38,200 Speaker 1: kind of stuff. So it was really fun to see 142 00:07:38,520 --> 00:07:41,560 Speaker 1: this sort of detailed breakdown, and it made me think about, like, 143 00:07:41,640 --> 00:07:44,840 Speaker 1: which are the inventions in human history that most catalyzed 144 00:07:44,920 --> 00:07:48,520 Speaker 1: technological progress, or which most changed the future. So that's 145 00:07:48,520 --> 00:07:50,640 Speaker 1: what we wanted to sort of focus on in today's 146 00:07:50,640 --> 00:07:53,760 Speaker 1: podcast episode, this sort of broader question: what is 147 00:07:53,760 --> 00:07:57,240 Speaker 1: the most important invention in human history?
And before we 148 00:07:57,280 --> 00:07:59,440 Speaker 1: talk to you about it, since obviously you've thought deeply 149 00:07:59,440 --> 00:08:01,960 Speaker 1: about this to write your book, we went and asked 150 00:08:02,200 --> 00:08:04,000 Speaker 1: a bunch of people on the street who hadn't had 151 00:08:04,000 --> 00:08:06,360 Speaker 1: a chance to think about it at all, and asked them 152 00:08:06,480 --> 00:08:08,960 Speaker 1: what they thought was the most important invention in human history, 153 00:08:09,200 --> 00:08:11,200 Speaker 1: just off the top of their heads. Here's what they 154 00:08:11,200 --> 00:08:15,160 Speaker 1: had to say. Fire. Fire. I would call the wheel. 155 00:08:16,280 --> 00:08:21,640 Speaker 1: I would say that. All right, thanks. The light bulb. The wheel. The wheel. 156 00:08:22,320 --> 00:08:25,520 Speaker 1: The wheel. The concept of evolution. Is that what you 157 00:08:25,520 --> 00:08:29,760 Speaker 1: happen to be studying right now? I was surprised how 158 00:08:29,760 --> 00:08:33,120 Speaker 1: often fire showed up. Yeah, fire was a popular one. 159 00:08:33,160 --> 00:08:35,600 Speaker 1: Fire and the wheel. Yeah, so that surprised you? 160 00:08:35,600 --> 00:08:39,120 Speaker 1: How come? Yeah, and I see why they went 161 00:08:39,160 --> 00:08:40,920 Speaker 1: for fire, but I feel like it kind of 162 00:08:40,960 --> 00:08:43,920 Speaker 1: dodges the question a little bit, because it's a 163 00:08:43,960 --> 00:08:48,120 Speaker 1: pre-human invention. Yeah. Wow, so fire predates humanity. Homo erectus 164 00:08:48,200 --> 00:08:51,480 Speaker 1: was using fire, and they're not us. 165 00:08:51,960 --> 00:08:53,800 Speaker 1: I mean, they're human, they're Homo, but they're not Homo 166 00:08:53,840 --> 00:08:56,600 Speaker 1: sapiens. So wait, we can't claim 167 00:08:56,600 --> 00:08:59,400 Speaker 1: credit for fire.
We can't claim it. Homo sapiens did not 168 00:08:59,480 --> 00:09:01,600 Speaker 1: invent fire. They might have stolen it or reinvented it, 169 00:09:01,600 --> 00:09:04,160 Speaker 1: but they didn't invent it first. That was Homo erectus. I 170 00:09:04,160 --> 00:09:06,640 Speaker 1: think you've just undermined like a core tenet, a belief 171 00:09:06,679 --> 00:09:08,760 Speaker 1: a lot of people have about their own species. And 172 00:09:09,160 --> 00:09:10,960 Speaker 1: a lot of people went with fire, like this is 173 00:09:10,960 --> 00:09:13,560 Speaker 1: the defining thing about humanity. This is what makes us who we 174 00:09:13,600 --> 00:09:15,880 Speaker 1: are. This is what makes us. Yeah, you just 175 00:09:15,960 --> 00:09:19,439 Speaker 1: deleted it from the list. That's pretty tough, man. Okay, 176 00:09:19,480 --> 00:09:22,240 Speaker 1: so we eliminate fire then from the list of important 177 00:09:22,240 --> 00:09:25,679 Speaker 1: inventions, because we didn't invent it. Yeah, so yes, 178 00:09:25,760 --> 00:09:27,760 Speaker 1: so people said fire. They also tended to go 179 00:09:27,840 --> 00:09:31,400 Speaker 1: for the wheel, right? The wheel is a pretty common one. 180 00:09:31,440 --> 00:09:34,240 Speaker 1: The wheel is also embarrassing, though. What's embarrassing was that, well, 181 00:09:34,240 --> 00:09:37,320 Speaker 1: we had the wheel for thousands and thousands of years, 182 00:09:37,320 --> 00:09:39,760 Speaker 1: but we used it for pottery, on its side as 183 00:09:39,800 --> 00:09:42,880 Speaker 1: a pottery wheel. So again, if you want it 184 00:09:42,880 --> 00:09:45,520 Speaker 1: for transportation, it took us thousands of years to flip 185 00:09:45,559 --> 00:09:47,240 Speaker 1: it over on its side.
And that's what we think 186 00:09:47,280 --> 00:09:50,360 Speaker 1: of when we think about the wheel, we think about movement, transportation, but 187 00:09:50,600 --> 00:09:52,800 Speaker 1: we used it to make pottery for a really long time. 188 00:09:53,760 --> 00:09:57,440 Speaker 1: Is that true? Yes, yes. The wheel for transport comes 189 00:09:57,440 --> 00:09:59,960 Speaker 1: well after the wheel for pottery. Nobody thought to flip 190 00:10:00,000 --> 00:10:01,960 Speaker 1: it on its side. I want to talk about that 191 00:10:02,000 --> 00:10:16,679 Speaker 1: some more, but first let's take a quick break. So 192 00:10:16,720 --> 00:10:18,480 Speaker 1: this is something you come back to a lot in 193 00:10:18,480 --> 00:10:20,720 Speaker 1: the book. You say we had everything we 194 00:10:20,800 --> 00:10:23,960 Speaker 1: needed, and if you just knew what to do, you 195 00:10:24,000 --> 00:10:26,360 Speaker 1: could skip a thousand years or a hundred thousand years, 196 00:10:26,360 --> 00:10:29,320 Speaker 1: you could accomplish something in an afternoon, right? And I 197 00:10:29,360 --> 00:10:31,760 Speaker 1: get that that's like a really fun fantasy, to imagine 198 00:10:31,760 --> 00:10:35,760 Speaker 1: fast-forwarding human progress. But I wonder sometimes how true 199 00:10:35,800 --> 00:10:38,719 Speaker 1: that is. Like, let's take the question of language. Right, 200 00:10:38,960 --> 00:10:40,920 Speaker 1: in your book, you lay out language as like 201 00:10:40,960 --> 00:10:45,000 Speaker 1: a pretty critical cornerstone of human technology, and I completely agree.
202 00:10:45,640 --> 00:10:49,240 Speaker 1: But do you think that, going back like fifty 203 00:10:49,679 --> 00:10:53,720 Speaker 1: thousand years to skeletally identical humans, that if you went back, 204 00:10:53,720 --> 00:10:56,680 Speaker 1: you could teach English, for example, to a group 205 00:10:56,760 --> 00:11:00,240 Speaker 1: of two-year-olds, you know, modern humans from 206 00:11:00,240 --> 00:11:02,320 Speaker 1: fifty thousand years ago, and that they would learn it and 207 00:11:02,360 --> 00:11:05,800 Speaker 1: develop and turn into, you know, English speakers? 208 00:11:05,840 --> 00:11:07,960 Speaker 1: I do. And the reason, the sort of 209 00:11:08,000 --> 00:11:10,040 Speaker 1: loophole there, is that you said two-year-olds, which 210 00:11:10,080 --> 00:11:14,280 Speaker 1: is great, because it's very hard, maybe impossible, for 211 00:11:14,440 --> 00:11:17,959 Speaker 1: humans to learn a language after puberty. It's also really 212 00:11:18,000 --> 00:11:20,920 Speaker 1: unethical to run experiments on it. But in 213 00:11:21,000 --> 00:11:23,839 Speaker 1: cases of feral children, people spend their lives trying to 214 00:11:23,840 --> 00:11:25,280 Speaker 1: teach them to talk, and maybe they learn it a 215 00:11:25,320 --> 00:11:27,800 Speaker 1: little bit, but they're never communicative, they don't really use it. 216 00:11:28,200 --> 00:11:30,280 Speaker 1: And so if you're traveling back, you know, a hundred 217 00:11:30,280 --> 00:11:32,400 Speaker 1: thousand years, fifty thousand years, and you want to start 218 00:11:32,400 --> 00:11:36,480 Speaker 1: rebuilding civilization, I would recommend: don't necessarily chat up the 219 00:11:36,520 --> 00:11:39,920 Speaker 1: cavemen and the cavewomen, but maybe talk to their kids. 220 00:11:40,520 --> 00:11:42,680 Speaker 1: Maybe steal some babies, if you're going to do it 221 00:11:42,720 --> 00:11:44,960 Speaker 1: that way.
So you're going on record for baby 222 00:11:45,000 --> 00:11:47,480 Speaker 1: stealing right here. I don't want to go on 223 00:11:47,520 --> 00:11:50,760 Speaker 1: record as fully endorsing baby stealing. But in terms of 224 00:11:50,800 --> 00:11:55,240 Speaker 1: just pure efficiency of civilization building, directing your efforts towards 225 00:11:55,240 --> 00:11:57,800 Speaker 1: the cave babies will get you much better results. And 226 00:11:58,480 --> 00:12:00,320 Speaker 1: there's a huge question there, right, 227 00:12:00,360 --> 00:12:04,080 Speaker 1: because you mentioned how we have these skeletally identical humans, 228 00:12:04,160 --> 00:12:08,280 Speaker 1: like anatomically modern humans. So people whose skeletons look 229 00:12:08,320 --> 00:12:11,360 Speaker 1: like ours, and those show up around two hundred thousand years ago, 230 00:12:12,120 --> 00:12:15,719 Speaker 1: and then behaviorally modern humans show up, humans that 231 00:12:15,760 --> 00:12:19,200 Speaker 1: act like us, that behave like us, that decorate their bodies, 232 00:12:19,240 --> 00:12:21,400 Speaker 1: bury the dead, that sort of thing, show up around fifty thousand years ago. 233 00:12:22,559 --> 00:12:25,400 Speaker 1: You know, there's this huge gap of what took 234 00:12:25,480 --> 00:12:27,880 Speaker 1: us so long, what made us finally take that leap 235 00:12:27,920 --> 00:12:32,040 Speaker 1: from anatomically modern to behaviorally modern. And we don't know for sure, 236 00:12:32,400 --> 00:12:35,800 Speaker 1: because it's very hard to test. None of these things fossilized 237 00:12:36,200 --> 00:12:38,520 Speaker 1: or are preserved. But one of the theories, and the one I go 238 00:12:38,520 --> 00:12:40,720 Speaker 1: with in the book, is that it was the invention of language.
239 00:12:40,720 --> 00:12:43,760 Speaker 1: It was he mentioned of talking to each other, that 240 00:12:43,760 --> 00:12:45,959 Speaker 1: that let us make that leap, for let us finally 241 00:12:46,000 --> 00:12:48,840 Speaker 1: become fully human. And so that's the technology. I would 242 00:12:48,840 --> 00:12:52,160 Speaker 1: say it's the most important one for us, language because 243 00:12:52,160 --> 00:12:55,679 Speaker 1: it allows us to to have not just to talk 244 00:12:55,720 --> 00:12:57,559 Speaker 1: to each other, but to like have ideas that can 245 00:12:57,600 --> 00:13:01,920 Speaker 1: survive the death of the host. That's so importantly. Language 246 00:13:01,960 --> 00:13:05,600 Speaker 1: is definitely definitely important. But but the supposition there is 247 00:13:05,600 --> 00:13:08,760 Speaker 1: that somehow we had the capacity for it but didn't 248 00:13:08,800 --> 00:13:13,080 Speaker 1: invent it for aft years, Right, But is it There 249 00:13:13,080 --> 00:13:15,400 Speaker 1: are other ways you might imagine it could have gone, right, 250 00:13:15,400 --> 00:13:17,240 Speaker 1: it could be that most of us didn't have the 251 00:13:17,240 --> 00:13:20,960 Speaker 1: capacity for it, and then a few brains you know, mutated, 252 00:13:21,000 --> 00:13:25,160 Speaker 1: evolved whatever in to develop an additional capacity which allowed 253 00:13:25,160 --> 00:13:27,320 Speaker 1: for the creation of it um and then and then 254 00:13:27,360 --> 00:13:29,360 Speaker 1: of course it would be a rapid selection effect. So 255 00:13:29,400 --> 00:13:32,480 Speaker 1: you can imagine that after the capacity for language evolved, 256 00:13:32,480 --> 00:13:36,080 Speaker 1: it might have been developed and then spread very rapidly. Sure, 257 00:13:36,400 --> 00:13:40,600 Speaker 1: but that requires a change to the brain that sort 258 00:13:40,600 --> 00:13:43,960 Speaker 1: of evolved in us. 
And without needing to suppose that, 259 00:13:44,000 --> 00:13:47,520 Speaker 1: we can just suppose someone invented language, and then the question is, well, 260 00:13:47,520 --> 00:13:50,240 Speaker 1: why did it take us so long? And language 261 00:13:50,280 --> 00:13:52,199 Speaker 1: is a really hard thing to invent, because imagine you're 262 00:13:52,200 --> 00:13:54,800 Speaker 1: trying to. You have to be sort of this, 263 00:13:55,280 --> 00:13:58,560 Speaker 1: I might call him Caveman Einstein, who has to 264 00:13:59,040 --> 00:14:01,120 Speaker 1: not only come up with the idea of language, the 265 00:14:01,120 --> 00:14:06,600 Speaker 1: idea of expressing thoughts in words, but instantiate that idea. 266 00:14:07,000 --> 00:14:09,760 Speaker 1: And it's still completely useless unless you're also smart enough 267 00:14:09,800 --> 00:14:13,240 Speaker 1: to teach someone else how to use it, all within 268 00:14:13,240 --> 00:14:15,880 Speaker 1: a single lifetime. Like, it's not easy to be 269 00:14:15,920 --> 00:14:18,600 Speaker 1: that first person who's coming up with the technology of language, 270 00:14:18,640 --> 00:14:21,040 Speaker 1: inventing language. And I think you can point to that 271 00:14:21,040 --> 00:14:22,720 Speaker 1: and say, yeah, that might take a long time 272 00:14:22,760 --> 00:14:25,640 Speaker 1: to have all those things line up, as they would 273 00:14:25,720 --> 00:14:28,680 Speaker 1: need to be for this to have any practical use, right? 274 00:14:28,760 --> 00:14:30,960 Speaker 1: And I think it touches on this other issue, which 275 00:14:31,000 --> 00:14:34,000 Speaker 1: is that a lot of these foundational inventions, people who are 276 00:14:34,040 --> 00:14:36,480 Speaker 1: listening might think, what? That's crazy. It's just so obvious. 277 00:14:36,960 --> 00:14:39,760 Speaker 1: And a lot of these things are so foundational.
They're, 278 00:14:39,800 --> 00:14:42,560 Speaker 1: you know, so deeply embedded in the way we 279 00:14:42,640 --> 00:14:45,920 Speaker 1: think that it's impossible to imagine life without them, which 280 00:14:45,960 --> 00:14:49,280 Speaker 1: is why, frankly, they're so difficult to invent, right? I mean, 281 00:14:49,640 --> 00:14:52,000 Speaker 1: because they completely transform the way you think and then 282 00:14:52,040 --> 00:14:55,040 Speaker 1: become deeply ingrained in your thought process. It's hard to 283 00:14:55,080 --> 00:14:57,280 Speaker 1: imagine how to get there when you don't have it. 284 00:14:57,440 --> 00:15:01,440 Speaker 1: I've spent years trying to picture thinking without words. Like, 285 00:15:01,480 --> 00:15:03,800 Speaker 1: how do you... what? I'm talking to two cartoonists here, right? 286 00:15:03,800 --> 00:15:06,080 Speaker 1: You guys are experts at thinking without words, right? But 287 00:15:06,120 --> 00:15:08,080 Speaker 1: we're taking words and putting them into pictures. But 288 00:15:08,120 --> 00:15:11,600 Speaker 1: still, the process seems very, very language based. 289 00:15:11,840 --> 00:15:14,320 Speaker 1: But one of the examples I love touching on there is 290 00:15:14,320 --> 00:15:16,920 Speaker 1: the idea that if you have a time machine, you 291 00:15:16,920 --> 00:15:21,200 Speaker 1: could take one of those children born back then, bring them to 292 00:15:21,280 --> 00:15:24,040 Speaker 1: the modern world, adopt them, raise them as a modern child, 293 00:15:24,280 --> 00:15:26,320 Speaker 1: and they'll be like any other human on the planet. 294 00:15:26,320 --> 00:15:28,360 Speaker 1: They'll be as smart and creative and clever and 295 00:15:28,400 --> 00:15:32,080 Speaker 1: fun and loving as any other human, because they're, you know, 296 00:15:32,120 --> 00:15:35,080 Speaker 1: standing on the shoulders of giants.
They're getting fifty thousand years' 297 00:15:35,120 --> 00:15:39,120 Speaker 1: worth of technological progress for free. They get to have language, 298 00:15:39,120 --> 00:15:41,000 Speaker 1: they get to learn how to read and write, they 299 00:15:41,040 --> 00:15:44,000 Speaker 1: get to be in a community. And it seems almost 300 00:15:44,000 --> 00:15:46,080 Speaker 1: like you're breaking a rule, right, to have this, you know, 301 00:15:46,240 --> 00:15:49,160 Speaker 1: literal cave person and have them be indistinguishable from a 302 00:15:49,160 --> 00:15:51,760 Speaker 1: modern person just by changing the environment in which they're raised. 303 00:15:52,000 --> 00:15:54,520 Speaker 1: But that's what these inventions do for us. 304 00:15:54,640 --> 00:15:57,080 Speaker 1: They change the nature of who and what we are 305 00:15:57,160 --> 00:15:58,560 Speaker 1: in a way that makes it hard to imagine what 306 00:15:58,560 --> 00:16:03,440 Speaker 1: it's like without it. And that's the kind of thing 307 00:16:03,480 --> 00:16:06,480 Speaker 1: that makes me wonder what's in the future. Like, if 308 00:16:06,520 --> 00:16:09,520 Speaker 1: there were in the past these sort of trivial but 309 00:16:09,600 --> 00:16:14,520 Speaker 1: transformational inventions, math, language, etcetera, are there ones that remain? 310 00:16:14,680 --> 00:16:16,880 Speaker 1: You know, will, in a hundred thousand years, people look 311 00:16:16,880 --> 00:16:19,320 Speaker 1: back and think, how come Ryan North didn't think of, 312 00:16:19,480 --> 00:16:22,880 Speaker 1: you know, blah blah blah, some transformational but basic way 313 00:16:22,960 --> 00:16:27,280 Speaker 1: of living and thinking that exploded our capacity for technology 314 00:16:27,440 --> 00:16:28,960 Speaker 1: and life? I mean, do you think that there are 315 00:16:28,960 --> 00:16:31,680 Speaker 1: those sort of transformations left? Yeah.
I hope so, I 316 00:16:31,680 --> 00:16:33,840 Speaker 1: mean, I believe so. The punchline of this is, 317 00:16:33,880 --> 00:16:35,040 Speaker 1: you know, we could have had all this stuff and 318 00:16:35,080 --> 00:16:37,160 Speaker 1: we didn't. And look how blind we were not to 319 00:16:37,200 --> 00:16:39,120 Speaker 1: see that. How could we have been so stupid? But 320 00:16:40,040 --> 00:16:42,800 Speaker 1: if there are all these inventions and all these points 321 00:16:42,800 --> 00:16:44,120 Speaker 1: in history we can point at and say, yeah, we 322 00:16:44,120 --> 00:16:47,120 Speaker 1: didn't see this until later, it stands to reason 323 00:16:47,160 --> 00:16:49,640 Speaker 1: that right now there's probably still some 324 00:16:49,680 --> 00:16:51,920 Speaker 1: of this, I call it the low-hanging fruit of 325 00:16:51,960 --> 00:16:55,920 Speaker 1: civilization, that we could invent and just aren't 326 00:16:55,920 --> 00:16:57,640 Speaker 1: seeing yet. And I think that's really optimistic. I don't 327 00:16:57,640 --> 00:16:59,240 Speaker 1: think that makes us feel stupid. That 328 00:16:59,280 --> 00:17:01,960 Speaker 1: makes me feel excited, like, what are 329 00:17:01,960 --> 00:17:04,040 Speaker 1: we missing that's out there right now? There's still really 330 00:17:04,040 --> 00:17:07,480 Speaker 1: cool stuff that we can all come up with. So 331 00:17:07,680 --> 00:17:10,800 Speaker 1: is that your choice, Ryan, for the most important invention 332 00:17:10,800 --> 00:17:15,439 Speaker 1: in human history? Language? Yeah, I think it's foundational. I 333 00:17:15,480 --> 00:17:18,840 Speaker 1: think it's consequential. I think it's transformational. And also I'm 334 00:17:18,880 --> 00:17:21,880 Speaker 1: kind of cheating, because it's actually two inventions. I'm rolling 335 00:17:21,920 --> 00:17:25,760 Speaker 1: reading and writing together into one.
I'm just calling it language. 336 00:17:25,760 --> 00:17:27,680 Speaker 1: But how do you define language? You mean, like, the 337 00:17:27,880 --> 00:17:31,040 Speaker 1: idea of words, or just communicating? Because people, I'm 338 00:17:31,040 --> 00:17:34,720 Speaker 1: sure, communicated with grunts or hand signals, right? Yeah, but 339 00:17:34,800 --> 00:17:38,119 Speaker 1: that's not language. Um, so the example I would give is, 340 00:17:38,440 --> 00:17:41,520 Speaker 1: let's say I draw three pictures. I draw a picture 341 00:17:41,520 --> 00:17:44,200 Speaker 1: of a cool dog and a picture of a skateboard, 342 00:17:44,480 --> 00:17:48,240 Speaker 1: and a picture of a thumbs up, and those symbols 343 00:17:48,280 --> 00:17:51,399 Speaker 1: can be interpreted. You can say, oh, he's saying a cool 344 00:17:51,440 --> 00:17:54,679 Speaker 1: dog on a skateboard is good. Or maybe I'm saying 345 00:17:55,000 --> 00:17:56,840 Speaker 1: I saw a cool dog on a skateboard and I 346 00:17:56,880 --> 00:17:58,840 Speaker 1: gave it a thumbs up, because I love cool dogs 347 00:17:58,840 --> 00:18:03,320 Speaker 1: on skateboards. But there's ambiguity there, and what language 348 00:18:03,560 --> 00:18:06,600 Speaker 1: is is something that works to eliminate ambiguity. So when we're 349 00:18:06,600 --> 00:18:10,040 Speaker 1: communicating now, because I'm using words with precise meanings, they're 350 00:18:10,080 --> 00:18:11,480 Speaker 1: not perfect, but they try to be precise, so that 351 00:18:11,480 --> 00:18:13,040 Speaker 1: we can communicate quickly and clearly without having to 352 00:18:13,080 --> 00:18:15,840 Speaker 1: go back and clarify all the time.
And so yeah, 353 00:18:15,880 --> 00:18:17,959 Speaker 1: you can communicate with grunts, you can communicate with howls, 354 00:18:18,040 --> 00:18:21,280 Speaker 1: you can communicate with longing glances across the dance floor, 355 00:18:21,560 --> 00:18:25,720 Speaker 1: but for precision communication you need language. And that's 356 00:18:25,720 --> 00:18:28,040 Speaker 1: what I'm calling the technology. So it's kind of like 357 00:18:28,080 --> 00:18:30,720 Speaker 1: the idea of somebody at some point saying, hey, guys, 358 00:18:30,720 --> 00:18:34,919 Speaker 1: this is crazy. We should have rules that define what 359 00:18:35,040 --> 00:18:38,159 Speaker 1: those glances across the dance floor mean. Yeah, yeah, okay. 360 00:18:38,160 --> 00:18:41,040 Speaker 1: So it's the invention of rules that everyone would 361 00:18:41,080 --> 00:18:45,119 Speaker 1: agree to, to communicate ideas. Yes, with an asterisk. Because 362 00:18:45,160 --> 00:18:47,800 Speaker 1: one of the craziest papers I read when I 363 00:18:47,800 --> 00:18:51,040 Speaker 1: was doing computational linguistics was this person making the 364 00:18:51,119 --> 00:18:53,760 Speaker 1: argument that one of the neat things about language that 365 00:18:53,840 --> 00:18:56,320 Speaker 1: is almost unique is that it evolves so quickly. You 366 00:18:56,359 --> 00:18:58,920 Speaker 1: look at the way English was spoken a hundred years ago, 367 00:18:58,920 --> 00:19:01,320 Speaker 1: it already sounds aged, or at least odd. 368 00:19:01,359 --> 00:19:02,440 Speaker 1: You go back two or three hundred years, you go 369 00:19:02,480 --> 00:19:06,320 Speaker 1: back to Shakespeare four hundred years ago, and it's hard to understand, right? 370 00:19:06,960 --> 00:19:09,520 Speaker 1: So why has it changed so quickly? If 371 00:19:09,640 --> 00:19:12,479 Speaker 1: it's based on rules, and rules are what make language work clearly:
372 00:19:12,560 --> 00:19:14,720 Speaker 1: Why do these rules seem to change so often? That seems 373 00:19:14,720 --> 00:19:17,600 Speaker 1: like a failure. And the argument this paper was making, 374 00:19:17,600 --> 00:19:20,879 Speaker 1: which I loved, was that the reason language evolves so 375 00:19:20,960 --> 00:19:23,680 Speaker 1: quickly is because language is really hard to learn, which 376 00:19:23,720 --> 00:19:27,359 Speaker 1: is true, but actually it's impossible to learn, and we 377 00:19:27,440 --> 00:19:30,000 Speaker 1: never actually learn the language our parents are speaking. We 378 00:19:30,080 --> 00:19:32,960 Speaker 1: learn an approximation of it that allows us to communicate, 379 00:19:33,000 --> 00:19:35,399 Speaker 1: but all the edges are fuzzy. And so since 380 00:19:35,520 --> 00:19:37,359 Speaker 1: we have all these places where we don't actually know 381 00:19:37,400 --> 00:19:40,560 Speaker 1: what the rule is, that allows language to change so 382 00:19:40,640 --> 00:19:43,840 Speaker 1: quickly and to evolve generationally so fast, which I love. 383 00:19:43,880 --> 00:19:45,800 Speaker 1: I love that idea. You know, language is not just hard, 384 00:19:45,840 --> 00:19:48,320 Speaker 1: it's impossible, and you will never learn the language your 385 00:19:48,320 --> 00:19:51,480 Speaker 1: parents are speaking. It just can't be done. Well, it's certainly true 386 00:19:51,520 --> 00:19:53,840 Speaker 1: the parents just don't understand. So maybe that's the reason. 387 00:19:54,880 --> 00:19:57,720 Speaker 1: That's right. I forgot you're a computational linguist, right? That was 388 00:19:57,760 --> 00:20:01,359 Speaker 1: your education? Yeah, I did a master's in 389 00:20:01,359 --> 00:20:04,400 Speaker 1: computational linguistics. So it's a bit suspicious that a computational 390 00:20:04,440 --> 00:20:07,560 Speaker 1: linguist thinks language is the greatest invention ever.
You know, 391 00:20:09,080 --> 00:20:13,080 Speaker 1: cartoons are the most important invention ever. It's only a 392 00:20:13,080 --> 00:20:15,159 Speaker 1: tiny bit self-centered. But also it's a bit of 393 00:20:15,160 --> 00:20:18,240 Speaker 1: a dark view, because it suggests that the greatest accomplishment 394 00:20:18,240 --> 00:20:20,520 Speaker 1: in human history was thousands of years ago and we 395 00:20:20,560 --> 00:20:23,760 Speaker 1: haven't really done anything since, which matches up. But we're 396 00:20:23,760 --> 00:20:26,119 Speaker 1: still using it, right? Like, it's allowed us to do 397 00:20:26,320 --> 00:20:29,159 Speaker 1: everything else. So that's why I say it's 398 00:20:29,240 --> 00:20:31,439 Speaker 1: so foundational: it is what unlocks 399 00:20:31,480 --> 00:20:33,160 Speaker 1: everything. You can be the smartest person in the world. 400 00:20:33,200 --> 00:20:35,680 Speaker 1: Without language, you're trapped in your own head, and you're 401 00:20:35,720 --> 00:20:39,040 Speaker 1: having these amazing world-changing thoughts and can't communicate them 402 00:20:39,119 --> 00:20:41,639 Speaker 1: in a way that's clear. You're not gonna communicate, 403 00:20:41,800 --> 00:20:44,720 Speaker 1: you know, relativity through grunts and glances. You 404 00:20:44,720 --> 00:20:46,920 Speaker 1: need language, and maybe mathematics, for that. So I 405 00:20:46,960 --> 00:20:49,239 Speaker 1: think it's fascinating what you're saying, because I was going 406 00:20:49,280 --> 00:20:52,399 Speaker 1: to make exactly the same argument to make a different point. 407 00:20:52,560 --> 00:20:54,600 Speaker 1: I was going to use the same argument to suggest 408 00:20:54,640 --> 00:20:57,879 Speaker 1: that math and science are the most important inventions in 409 00:20:57,880 --> 00:21:03,320 Speaker 1: human history, but for exactly the same reason.
And I 410 00:21:03,359 --> 00:21:06,280 Speaker 1: feel like you guys are maybe a little biased in 411 00:21:06,400 --> 00:21:09,840 Speaker 1: your selections here. I feel like until we had mathematics, 412 00:21:09,920 --> 00:21:12,359 Speaker 1: all we had was language, which is frankly kind of 413 00:21:12,440 --> 00:21:16,280 Speaker 1: clumsy when you want to communicate very clearly and precisely. 414 00:21:16,720 --> 00:21:20,000 Speaker 1: And I remember learning math and learning logic and feeling like, finally, 415 00:21:20,320 --> 00:21:23,960 Speaker 1: here we have effectively a language for very clearly and 416 00:21:24,000 --> 00:21:27,760 Speaker 1: precisely communicating ideas, ideas which are too fuzzy in English 417 00:21:27,880 --> 00:21:31,880 Speaker 1: to communicate clearly. But is mathematical language just another language? 418 00:21:32,000 --> 00:21:33,560 Speaker 1: You know what I mean? Like, is that a sub- 419 00:21:33,640 --> 00:21:36,720 Speaker 1: invention of language? Uh, in the book, I cheat. Because now 420 00:21:36,720 --> 00:21:38,440 Speaker 1: I'm saying language is the most important invention, but in 421 00:21:38,480 --> 00:21:41,199 Speaker 1: the book I actually give five. I say written language, 422 00:21:41,240 --> 00:21:45,520 Speaker 1: spoken language, scientific method, calorie surplus, so again having extra 423 00:21:45,560 --> 00:21:47,320 Speaker 1: food so you can worry about things other than where 424 00:21:47,320 --> 00:21:49,720 Speaker 1: your next meal is coming from. And the last one 425 00:21:49,920 --> 00:21:52,400 Speaker 1: I call non-sucky numbers, which is basically a number 426 00:21:52,440 --> 00:21:55,919 Speaker 1: system that permits mathematics to happen in a 427 00:21:56,000 --> 00:21:59,160 Speaker 1: productive way. So one of the reasons the ancient Romans 428 00:21:59,200 --> 00:22:00,920 Speaker 1: didn't get that far with maths:
They have these Roman 429 00:22:00,960 --> 00:22:04,600 Speaker 1: numerals which are just incredibly clumsy to operate with. 430 00:22:05,000 --> 00:22:06,600 Speaker 1: You have to do math just to know what number you're 431 00:22:06,640 --> 00:22:10,120 Speaker 1: looking at. So I'm not gonna sit here and argue 432 00:22:10,119 --> 00:22:12,119 Speaker 1: that math isn't important, because I have it on paper, 433 00:22:12,119 --> 00:22:16,040 Speaker 1: I think it is important. But I will say, um, 434 00:22:16,080 --> 00:22:19,280 Speaker 1: and this is maybe dodgy, but I, personally, Ryan North, 435 00:22:19,440 --> 00:22:22,560 Speaker 1: think that mathematics, and correct me if I'm wrong, but 436 00:22:22,600 --> 00:22:26,560 Speaker 1: I feel like mathematics can be a creative expression in 437 00:22:26,600 --> 00:22:29,760 Speaker 1: the same way that language can, only with more rules. 438 00:22:29,960 --> 00:22:31,879 Speaker 1: Is that wrong? Is that romanticizing it? Or am I 439 00:22:33,200 --> 00:22:37,520 Speaker 1: guilty of the romanticization of mathematics? Well, um, I'm not sure it's 440 00:22:37,520 --> 00:22:40,800 Speaker 1: possible to romanticize math. No, I think that's fair. Um, 441 00:22:41,440 --> 00:22:43,600 Speaker 1: but I would also make a similar argument, you know, 442 00:22:43,680 --> 00:22:48,840 Speaker 1: for science, like in terms of helping us develop technologies, 443 00:22:48,920 --> 00:22:51,439 Speaker 1: or helping us understand the world, or communicating clearly with 444 00:22:51,480 --> 00:22:54,879 Speaker 1: each other what we know and what we don't know. Um.
445 00:22:54,920 --> 00:22:57,320 Speaker 1: It's always amazing to me that it took so long for 446 00:22:57,359 --> 00:22:59,399 Speaker 1: people to come up with the scientific method, you know, 447 00:22:59,480 --> 00:23:02,080 Speaker 1: or even to come up with the idea of empiricism. 448 00:23:02,160 --> 00:23:05,600 Speaker 1: Like, you have an idea, let's actually check if it 449 00:23:05,680 --> 00:23:09,919 Speaker 1: works before we accept it into the canon of ideas, right? Um. 450 00:23:09,960 --> 00:23:11,520 Speaker 1: You know, one of my favorite examples when I teach 451 00:23:11,560 --> 00:23:17,439 Speaker 1: introductory physics is the comparison between Aristotelian physics and Galilean physics. 452 00:23:17,480 --> 00:23:20,320 Speaker 1: You know, like, Aristotle thousands of years ago said, oh, 453 00:23:20,480 --> 00:23:22,600 Speaker 1: things just move, because it's in the nature of things 454 00:23:22,680 --> 00:23:26,320 Speaker 1: to move, and Galileo was like, let's check, and it turns 455 00:23:26,359 --> 00:23:30,199 Speaker 1: out in an afternoon he disproved all of Aristotle. Right? 456 00:23:30,400 --> 00:23:32,960 Speaker 1: And there's an example of somebody actually making huge progress 457 00:23:32,960 --> 00:23:36,159 Speaker 1: in an afternoon based on a single simple idea. But 458 00:23:36,200 --> 00:23:39,280 Speaker 1: why did it take thousands of years before people realized, wow, 459 00:23:39,320 --> 00:23:41,600 Speaker 1: science is actually only useful when you compare it to 460 00:23:41,680 --> 00:23:45,439 Speaker 1: what's actually happening in the real world? And, you know, 461 00:23:45,480 --> 00:23:47,199 Speaker 1: once you have that idea, of course, then you 462 00:23:47,200 --> 00:23:50,600 Speaker 1: have this enormous flowering of technology and advancement.
So it 463 00:23:50,640 --> 00:23:53,480 Speaker 1: seems to me like, yeah, language is important, math is important, 464 00:23:53,520 --> 00:23:56,200 Speaker 1: but in some sense science is a really strong contender 465 00:23:56,680 --> 00:23:58,879 Speaker 1: for the most important, because we've had it the 466 00:23:58,880 --> 00:24:01,320 Speaker 1: shortest amount of time, but it's led to perhaps the 467 00:24:01,359 --> 00:24:03,920 Speaker 1: greatest transformations in, you know, the way we live. Well, 468 00:24:03,960 --> 00:24:06,600 Speaker 1: here's an interesting question. Um, so you, Ryan, think language 469 00:24:06,640 --> 00:24:10,040 Speaker 1: is the greatest invention ever, and Daniel, you think 470 00:24:10,040 --> 00:24:12,720 Speaker 1: math and science are the greatest inventions ever. Do you 471 00:24:12,760 --> 00:24:17,359 Speaker 1: think that we're done? Like, can you foresee a possibility 472 00:24:17,480 --> 00:24:20,159 Speaker 1: that there's an invention in our future that could maybe 473 00:24:20,400 --> 00:24:23,320 Speaker 1: overtake these two things to be the greatest invention ever? 474 00:24:23,800 --> 00:24:26,480 Speaker 1: So one of the reasons my book is structured the 475 00:24:26,480 --> 00:24:28,600 Speaker 1: way it is, where it has this invention of time 476 00:24:28,600 --> 00:24:30,320 Speaker 1: machines and then you go back and you're doing time 477 00:24:30,359 --> 00:24:32,119 Speaker 1: tourism and you get trapped in the past, is that 478 00:24:33,000 --> 00:24:38,440 Speaker 1: I feel like, if you invented time travel, then you're done. 479 00:24:38,640 --> 00:24:40,760 Speaker 1: That's the last invention that ever needs to be invented, 480 00:24:40,760 --> 00:24:43,480 Speaker 1: because any problem you encounter, you go to the future, 481 00:24:43,480 --> 00:24:45,440 Speaker 1: see how they solved it, bring the solution back with you.
482 00:24:45,960 --> 00:24:50,000 Speaker 1: The second you invent time travel, you've invented all other inventions 483 00:24:50,040 --> 00:24:52,919 Speaker 1: it's possible for humans to invent, boom. Which either proves that 484 00:24:52,960 --> 00:24:56,399 Speaker 1: time travel is impossible, or proves that, if it 485 00:24:56,520 --> 00:24:59,600 Speaker 1: could be done, it would surpass language and science in my 486 00:24:59,760 --> 00:25:03,760 Speaker 1: estimation. All other inventions en masse beats everything else 487 00:25:03,800 --> 00:25:06,159 Speaker 1: for sure. But what do you think, Daniel? I think 488 00:25:06,400 --> 00:25:09,000 Speaker 1: that for that to work, you'd have to invent time travel, 489 00:25:09,040 --> 00:25:11,720 Speaker 1: which would violate causality, which is what you need in 490 00:25:11,840 --> 00:25:14,159 Speaker 1: order to be able to steal inventions from the future 491 00:25:14,200 --> 00:25:17,760 Speaker 1: that haven't been made yet, which is basically just science 492 00:25:17,800 --> 00:25:21,000 Speaker 1: fiction, and we already invented that. So that's 493 00:25:21,000 --> 00:25:23,399 Speaker 1: what I think about that. What do 494 00:25:23,440 --> 00:25:25,560 Speaker 1: you think, Daniel? What do you think could 495 00:25:25,600 --> 00:25:28,240 Speaker 1: be something we might invent in the future that could 496 00:25:28,280 --> 00:25:32,320 Speaker 1: totally revolutionize things even more? That's impossible to comprehend. Like, 497 00:25:32,400 --> 00:25:33,800 Speaker 1: you know, if I knew that, then I 498 00:25:33,800 --> 00:25:36,720 Speaker 1: would invent it, right?
All of these inventions, really 499 00:25:36,760 --> 00:25:39,080 Speaker 1: the transformational ones, and I love the way Ryan 500 00:25:39,080 --> 00:25:41,199 Speaker 1: pointed this out in his book, all these inventions, if 501 00:25:41,240 --> 00:25:44,840 Speaker 1: you just knew what the invention was, then you have 502 00:25:44,920 --> 00:25:48,320 Speaker 1: invented it, right? It's like having the password, right? All 503 00:25:48,400 --> 00:25:50,720 Speaker 1: you need to know is the password and the door's open. 504 00:25:51,160 --> 00:25:54,119 Speaker 1: Some of these inventions, like, you know, steel, how do 505 00:25:54,160 --> 00:25:56,439 Speaker 1: you, even if you went back and told somebody how 506 00:25:56,480 --> 00:25:58,600 Speaker 1: to smelt steel, it's not like they could do that 507 00:25:58,600 --> 00:26:01,520 Speaker 1: that afternoon. They'd have to build a whole industrial base, etcetera, etcetera. 508 00:26:01,920 --> 00:26:05,360 Speaker 1: But for these really transformational ones, just knowing the idea 509 00:26:05,640 --> 00:26:08,199 Speaker 1: is the invention. So, all right, it's not like I 510 00:26:08,240 --> 00:26:11,400 Speaker 1: have the next human transformational invention already in my head, 511 00:26:11,440 --> 00:26:13,359 Speaker 1: I just haven't shared it with anybody, and I was 512 00:26:13,400 --> 00:26:17,960 Speaker 1: waiting for this podcast to reveal it. But if I did, 513 00:26:18,160 --> 00:26:21,520 Speaker 1: I would totally roll it out right now. Let's take 514 00:26:21,520 --> 00:26:36,240 Speaker 1: a quick break.
I also wonder, like, if you could 515 00:26:36,240 --> 00:26:39,239 Speaker 1: go back in time and you're talking to this 516 00:26:39,240 --> 00:26:43,000 Speaker 1: ancient dude and you're, like, excited to share with them 517 00:26:43,240 --> 00:26:45,479 Speaker 1: the ideas you have and the technology that they can 518 00:26:45,560 --> 00:26:48,040 Speaker 1: use to transform his or her world, I wonder if 519 00:26:48,080 --> 00:26:51,320 Speaker 1: they might think, you know what, we're good. Like, uh, 520 00:26:51,560 --> 00:26:54,680 Speaker 1: you know, I got my roving mammals I 521 00:26:54,720 --> 00:26:57,840 Speaker 1: can eat, you know, I hunt, I gather, I, you know, 522 00:26:58,080 --> 00:26:59,959 Speaker 1: carve sticks. Every once in a while, we bang rocks 523 00:27:00,040 --> 00:27:03,320 Speaker 1: together around the campfire. Life's not bad, you know. Yeah, 524 00:27:03,320 --> 00:27:04,800 Speaker 1: that's the thing that we kind of forget, 525 00:27:04,920 --> 00:27:07,840 Speaker 1: is that in times of plenty, the hunting and gathering 526 00:27:07,920 --> 00:27:11,280 Speaker 1: lifestyle is a fabulous lifestyle. It's like you're lazing around. 527 00:27:11,640 --> 00:27:13,280 Speaker 1: It doesn't take all day to hunt and gather. You 528 00:27:13,320 --> 00:27:15,720 Speaker 1: have lots of free time, just chill out, do whatever 529 00:27:15,760 --> 00:27:19,760 Speaker 1: you want. Food's plentiful. Who would give that up? Farming sucks. 530 00:27:19,960 --> 00:27:23,560 Speaker 1: Farming is back-breaking labor in a field.
531 00:27:23,680 --> 00:27:25,359 Speaker 1: Like, it's a lot of work, but what it 532 00:27:25,400 --> 00:27:28,440 Speaker 1: gets you is reliable food, so that when the times of 533 00:27:28,480 --> 00:27:31,919 Speaker 1: plenty run out for hunting and gathering, you don't starve and suffer 534 00:27:32,080 --> 00:27:35,760 Speaker 1: catastrophic population collapse, because you have resources. You're in one place, 535 00:27:35,800 --> 00:27:38,880 Speaker 1: you can build infrastructure. You can start building buildings that 536 00:27:39,200 --> 00:27:40,879 Speaker 1: you don't have to pick up and carry with you. 537 00:27:40,960 --> 00:27:43,800 Speaker 1: Like, it's where civilization begins, when you 538 00:27:43,840 --> 00:27:46,400 Speaker 1: stop moving around every time things get hard. 539 00:27:46,800 --> 00:27:51,080 Speaker 1: So yeah, I can see, uh, if you 540 00:27:51,320 --> 00:27:53,960 Speaker 1: arrive at a time where it's very easy to find food, 541 00:27:53,960 --> 00:27:55,639 Speaker 1: it's gonna be hard to convince people to join the 542 00:27:55,640 --> 00:27:58,440 Speaker 1: farm and work for you or work with you. But 543 00:27:59,640 --> 00:28:03,400 Speaker 1: one of the arguments for that that I found was, um, 544 00:28:03,600 --> 00:28:06,800 Speaker 1: someone was pointing out that it's very hard to produce 545 00:28:06,840 --> 00:28:10,880 Speaker 1: alcohol in a hunting and gathering lifestyle all the time, 546 00:28:11,640 --> 00:28:15,199 Speaker 1: and so if you want to have a beer, you 547 00:28:15,240 --> 00:28:17,399 Speaker 1: need a civilization for that, and that might be one 548 00:28:17,440 --> 00:28:20,360 Speaker 1: of the things that induces people to come and help 549 00:28:20,400 --> 00:28:23,080 Speaker 1: out on a farm. Yeah.
My other fear for 550 00:28:23,320 --> 00:28:25,240 Speaker 1: time-traveling Ryan is that you go back in time 551 00:28:25,280 --> 00:28:27,720 Speaker 1: with all these crazy ideas, you're just gonna get branded 552 00:28:27,720 --> 00:28:30,240 Speaker 1: a witch and killed. Like, you know, there's sort 553 00:28:30,240 --> 00:28:32,840 Speaker 1: of a social barrier to convincing people to join your 554 00:28:32,920 --> 00:28:36,160 Speaker 1: let's-transform-humanity movement. But you're right, maybe beer 555 00:28:36,240 --> 00:28:38,440 Speaker 1: is the answer to that problem. All right, I 556 00:28:38,480 --> 00:28:40,560 Speaker 1: only have one more question for you, which is sort 557 00:28:40,560 --> 00:28:45,040 Speaker 1: of a multiverse question. To me, the history of human 558 00:28:45,040 --> 00:28:47,920 Speaker 1: invention seems sort of chaotic, you know, where somebody had 559 00:28:47,960 --> 00:28:50,520 Speaker 1: this idea, somebody had that idea, it sort of came together 560 00:28:50,560 --> 00:28:53,280 Speaker 1: here and there. Have you thought about, sort of, the 561 00:28:53,440 --> 00:28:57,280 Speaker 1: thousand parallel universes where you run the human experiment, and 562 00:28:57,360 --> 00:28:59,200 Speaker 1: how many of them do you think we would end 563 00:28:59,280 --> 00:29:01,760 Speaker 1: up, after this amount of time, at roughly the same place? 564 00:29:01,840 --> 00:29:04,840 Speaker 1: Like, do we always end up stumbling into the same 565 00:29:04,840 --> 00:29:07,360 Speaker 1: things in roughly the same order? Or are there these 566 00:29:07,480 --> 00:29:10,800 Speaker 1: moments when human technology could have shifted dramatically and gone 567 00:29:10,800 --> 00:29:13,960 Speaker 1: down a different path and, you know, done things 568 00:29:13,960 --> 00:29:17,240 Speaker 1: more rapidly?
Do you think those thousand different parallel universes 569 00:29:17,360 --> 00:29:20,480 Speaker 1: have similarities, or are they all totally different? I think there's similarity, 570 00:29:20,560 --> 00:29:22,840 Speaker 1: but I think they're markedly different. Like, when you have 571 00:29:22,880 --> 00:29:24,800 Speaker 1: these huge expanses of time where we could have invented 572 00:29:24,840 --> 00:29:27,000 Speaker 1: something and didn't, all you need is that one person 573 00:29:27,040 --> 00:29:29,400 Speaker 1: to invent it. You remove Isaac Newton from our history 574 00:29:29,720 --> 00:29:33,040 Speaker 1: and we have a very different history of thought, of mathematics, 575 00:29:33,160 --> 00:29:35,560 Speaker 1: right? Perhaps, but, you know, maybe Leibniz would have invented everything 576 00:29:35,600 --> 00:29:37,760 Speaker 1: Isaac Newton didn't think of, right? It could have just 577 00:29:37,800 --> 00:29:39,360 Speaker 1: been, like, an idea of the 578 00:29:39,400 --> 00:29:41,840 Speaker 1: time that somebody was going to invent, because the pieces 579 00:29:41,880 --> 00:29:43,479 Speaker 1: were there. And that does show up. We look at 580 00:29:43,480 --> 00:29:44,800 Speaker 1: the invention of radio, and there are a bunch of 581 00:29:44,840 --> 00:29:47,240 Speaker 1: people independently coming up with radio at about the same time, 582 00:29:47,280 --> 00:29:49,920 Speaker 1: because the pieces were in place. There are certainly moments 583 00:29:49,920 --> 00:29:52,120 Speaker 1: where, sort of, things are in the air and everyone's 584 00:29:52,160 --> 00:29:54,640 Speaker 1: moving towards this one invention, and radio's an example of that. 585 00:29:54,960 --> 00:29:57,880 Speaker 1: But there are all these other examples where it didn't have to 586 00:29:57,920 --> 00:30:00,160 Speaker 1: be that way. Um, the stethoscope, that was 587 00:30:00,160 --> 00:30:03,120 Speaker 1: a fun one.
That's right. The first stethoscope was just 588 00:30:03,280 --> 00:30:05,600 Speaker 1: a rolled-up tube of paper to isolate 589 00:30:05,640 --> 00:30:08,040 Speaker 1: and listen to a sound, and we'd had paper for ages. 590 00:30:09,240 --> 00:30:14,600 Speaker 1: It was invented in eighteen sixteen CE by a male heterosexual 591 00:30:14,680 --> 00:30:17,560 Speaker 1: doctor who had a busty female patient and didn't want 592 00:30:17,560 --> 00:30:19,200 Speaker 1: to press his ear to her chest. That was too 593 00:30:19,200 --> 00:30:21,560 Speaker 1: erotic an experience for him, so he rolled up a 594 00:30:21,560 --> 00:30:23,239 Speaker 1: tube of paper to leave some room for Jesus and 595 00:30:23,280 --> 00:30:26,200 Speaker 1: listened through that, and accidentally discovered that this isolates and 596 00:30:26,320 --> 00:30:29,000 Speaker 1: clarifies the sound. So that's one of the few examples I 597 00:30:29,000 --> 00:30:31,840 Speaker 1: can find where someone actually progressed science by being too 598 00:30:31,880 --> 00:30:35,000 Speaker 1: horny to do their job properly. And that guy 599 00:30:35,040 --> 00:30:36,719 Speaker 1: could have shown up at any point in history, right? 600 00:30:36,720 --> 00:30:39,800 Speaker 1: Like, that also proves that boobs are useful 601 00:30:39,840 --> 00:30:43,200 Speaker 1: for something. All right, Ryan, thank you so much for 602 00:30:43,240 --> 00:30:45,960 Speaker 1: joining us. Thank you very much, Ryan, for entertaining all 603 00:30:46,000 --> 00:30:48,840 Speaker 1: of our amateurish thoughts on a topic in which you 604 00:30:48,880 --> 00:30:51,120 Speaker 1: are an expert. Thank you. I think they're great questions. 605 00:30:51,160 --> 00:30:52,640 Speaker 1: I love talking about this stuff. I wrote a book 606 00:30:52,640 --> 00:30:55,000 Speaker 1: about it, I love talking about it so much.
It's called 607 00:30:55,160 --> 00:30:57,800 Speaker 1: How to Invent Everything: A Survival Guide for the Stranded 608 00:30:57,800 --> 00:31:00,160 Speaker 1: Time Traveler, and you can get it at How to 609 00:31:00,200 --> 00:31:07,200 Speaker 1: Invent Everything dot com. Well, I think that Ryan, who 610 00:31:07,280 --> 00:31:10,120 Speaker 1: has a history of studying language, thinks that language is 611 00:31:10,160 --> 00:31:13,000 Speaker 1: the most important thing invented in human history, and me, as 612 00:31:13,000 --> 00:31:15,040 Speaker 1: a scientist, I think math and science are the most 613 00:31:15,040 --> 00:31:17,400 Speaker 1: important things invented in human history. Yeah, I found that 614 00:31:17,440 --> 00:31:20,320 Speaker 1: a little suspicious. Like, you guys gave all these great 615 00:31:20,400 --> 00:31:23,760 Speaker 1: reasons why your team should win. That's right. And you 616 00:31:23,800 --> 00:31:25,560 Speaker 1: know, there's a trend there, because when I was asking 617 00:31:25,600 --> 00:31:27,680 Speaker 1: people on the street what they thought the greatest invention 618 00:31:27,720 --> 00:31:30,080 Speaker 1: in human history was, most of them talked about what 619 00:31:30,120 --> 00:31:32,600 Speaker 1: they happened to have been reading. You know, so students 620 00:31:32,640 --> 00:31:35,720 Speaker 1: studying evolution said evolution, a student studying atomic theory 621 00:31:35,720 --> 00:31:39,320 Speaker 1: said atomic theory, uh, a student on a scooter said the wheel.
622 00:31:41,360 --> 00:31:43,840 Speaker 1: I think it's just a hard question to 623 00:31:43,880 --> 00:31:47,680 Speaker 1: answer, because it's so broad, and so people freeze up 624 00:31:47,680 --> 00:31:49,320 Speaker 1: a little bit, and then they think about it from 625 00:31:49,320 --> 00:31:51,080 Speaker 1: their perspective. And there's a lesson there, you know, 626 00:31:51,120 --> 00:31:54,520 Speaker 1: that we all see the world from our own perspective. Well, 627 00:31:54,560 --> 00:31:55,960 Speaker 1: you know, it sort of happened to me too when 628 00:31:55,960 --> 00:31:58,120 Speaker 1: I was asked this question. When I had to think 629 00:31:58,160 --> 00:32:00,479 Speaker 1: about it, I just kind of looked around me, you know, 630 00:32:00,760 --> 00:32:04,320 Speaker 1: like, I didn't think internally through the history of human civilization. 631 00:32:04,400 --> 00:32:05,800 Speaker 1: I just kind of looked around me, and I thought, 632 00:32:07,320 --> 00:32:11,760 Speaker 1: what would be... You thought bananas? I could not live 633 00:32:11,800 --> 00:32:14,880 Speaker 1: without bananas. Yeah, it's what makes everything else possible, too. 634 00:32:14,920 --> 00:32:18,000 Speaker 1: Come on, it's what got monkeys out of the trees, 635 00:32:18,640 --> 00:32:22,560 Speaker 1: or anything up into the trees, exactly.
But yeah, you sort 636 00:32:22,560 --> 00:32:24,520 Speaker 1: of have that instinct to look around you and to 637 00:32:24,680 --> 00:32:27,920 Speaker 1: try to gauge impact that way, like, what all around 638 00:32:27,960 --> 00:32:31,080 Speaker 1: me would not be here if it hadn't been invented? 639 00:32:31,360 --> 00:32:33,160 Speaker 1: That's right. And I think the lesson there is that 640 00:32:33,200 --> 00:32:35,640 Speaker 1: we can all see only a tiny bit of the 641 00:32:35,680 --> 00:32:38,760 Speaker 1: fabric of human history, right? And so it's very difficult 642 00:32:38,800 --> 00:32:41,120 Speaker 1: to say anything in general, because human history is this 643 00:32:41,200 --> 00:32:44,960 Speaker 1: incredible mosaic of billions of people's experiences, and all we 644 00:32:44,960 --> 00:32:46,640 Speaker 1: can do is speak for ourselves. I mean, I know 645 00:32:46,760 --> 00:32:50,400 Speaker 1: historians try their best to weave these broader stories about 646 00:32:50,520 --> 00:32:52,959 Speaker 1: what has happened in humanity, but I always feel like 647 00:32:53,280 --> 00:32:56,040 Speaker 1: so much of actual human experience is just brushed under 648 00:32:56,040 --> 00:32:58,040 Speaker 1: the rug when they try to do that. And so 649 00:32:58,080 --> 00:32:59,479 Speaker 1: I think none of us can really speak for all 650 00:32:59,480 --> 00:33:01,480 Speaker 1: of humanity; we can only speak for ourselves. What is the 651 00:33:01,520 --> 00:33:04,440 Speaker 1: most important invention that is affecting your life? I think 652 00:33:04,440 --> 00:33:06,520 Speaker 1: that's really the question of the podcast. Or, you know, 653 00:33:06,560 --> 00:33:08,760 Speaker 1: maybe it points to the idea that the more you know 654 00:33:08,800 --> 00:33:12,760 Speaker 1: about something, the more fascinating it becomes, you know what 655 00:33:12,760 --> 00:33:15,680 Speaker 1: I mean?
Like Ryan has studied linguistics for a long 656 00:33:15,720 --> 00:33:18,320 Speaker 1: time, and so he just knows so much about it 657 00:33:18,360 --> 00:33:20,600 Speaker 1: and how it's connected to everything. So from his point 658 00:33:20,640 --> 00:33:23,040 Speaker 1: of view, it's like the most important thing. It's 659 00:33:23,080 --> 00:33:26,040 Speaker 1: the hub of all things. And like, you've studied science 660 00:33:26,040 --> 00:33:29,200 Speaker 1: and physics for a long time, and so you've seen 661 00:33:29,240 --> 00:33:31,280 Speaker 1: how it's kind of sort of connected to everything else, 662 00:33:31,320 --> 00:33:34,239 Speaker 1: and how nothing would be possible without it, and 663 00:33:34,280 --> 00:33:36,840 Speaker 1: so you see it as the most fascinating, most important thing. 664 00:33:37,080 --> 00:33:39,960 Speaker 1: And so maybe it's just all sort of connected to itself. 665 00:33:40,000 --> 00:33:42,240 Speaker 1: And it's just that the more you know about something, 666 00:33:42,760 --> 00:33:45,520 Speaker 1: the more you think it's crucial to the structure of 667 00:33:45,560 --> 00:33:47,960 Speaker 1: human history. So why didn't you argue for comics to 668 00:33:48,000 --> 00:33:52,480 Speaker 1: be the most important? You did spend twenty years studying it, right? 669 00:33:52,960 --> 00:33:55,800 Speaker 1: I totally agree that as you study things, anything 670 00:33:55,800 --> 00:33:58,239 Speaker 1: can become interesting. You can find a puzzle anywhere, right? 671 00:33:58,560 --> 00:34:00,760 Speaker 1: Like, you can go into a deep dive about, like, 672 00:34:01,360 --> 00:34:04,920 Speaker 1: a cup, you know. Like, if humans hadn't invented 673 00:34:04,960 --> 00:34:07,400 Speaker 1: the cup, what would happen? There'd be a lot more 674 00:34:07,440 --> 00:34:10,440 Speaker 1: injuries in baseball.
Wait, now you're talking about a different... Sorry, 675 00:34:12,600 --> 00:34:13,680 Speaker 1: but you do know what I mean? Like, 676 00:34:13,760 --> 00:34:16,160 Speaker 1: you can go deep diving on anything and see how 677 00:34:16,160 --> 00:34:20,000 Speaker 1: it's all connected to the greatest moments in history and civilization. Right, 678 00:34:20,120 --> 00:34:22,239 Speaker 1: like without a cup or any kind of vessel to 679 00:34:22,280 --> 00:34:25,440 Speaker 1: hold water, maybe we wouldn't have been able to, 680 00:34:26,160 --> 00:34:28,920 Speaker 1: you know, leave the watering hole and start building 681 00:34:29,239 --> 00:34:31,480 Speaker 1: villages and things like that. All right, so maybe we 682 00:34:31,520 --> 00:34:34,360 Speaker 1: should leave that as a challenge to our listeners. Choose 683 00:34:34,440 --> 00:34:38,400 Speaker 1: some trivial item in human life and challenge us to 684 00:34:38,440 --> 00:34:42,400 Speaker 1: spend an entire podcast drilling down and discovering what's fascinating 685 00:34:42,400 --> 00:34:47,560 Speaker 1: about toenail clippers or Glad wrap or whatever it is 686 00:34:47,880 --> 00:34:50,359 Speaker 1: in your life. Send it to us at feedback at 687 00:34:50,480 --> 00:34:54,759 Speaker 1: Daniel and Jorge dot com, or write to us at jog 688 00:34:54,800 --> 00:35:00,440 Speaker 1: strap at Daniel Cornall hot coom. And thank everyone for 689 00:35:00,480 --> 00:35:03,920 Speaker 1: listening to this episode of Daniel and Jorge Explain the Universe. 690 00:35:04,280 --> 00:35:06,759 Speaker 1: If you enjoyed it, tune in next time, or check 691 00:35:06,760 --> 00:35:09,960 Speaker 1: out our book called We Have No Idea, An Illustrated 692 00:35:09,960 --> 00:35:13,000 Speaker 1: Guide to the Unknown Universe. See you next time. Thanks 693 00:35:13,040 --> 00:35:23,600 Speaker 1: for listening.
If you still have a question after listening 694 00:35:23,600 --> 00:35:26,680 Speaker 1: to all these explanations, please drop us a line. We'd 695 00:35:26,719 --> 00:35:29,560 Speaker 1: love to hear from you. You can find us on Facebook, Twitter, 696 00:35:29,680 --> 00:35:33,319 Speaker 1: and Instagram at Daniel and Jorge. That's one word. Or 697 00:35:33,440 --> 00:35:46,239 Speaker 1: email us at feedback at Daniel and Jorge dot com.