Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. Hey, and welcome to the podcast. I'm Josh, and there's Chuck, and Jerry's here too, and this is Stuff You Should Know, the brainiac condition. That's right. I was trying to think of something and I was truly blank. That's, uh, that's how appropriate... It totally is, Chuck, because we are talking today about human intelligence and the origin of human intelligence, and it just seems super Stuff You Should Know-y for us to not be able to come up with a decent joke, you know. Well, Ed did that for us, actually, because I did want to shout out... We usually don't mention, like, section titles and stuff like that that's actually in our notes, but Ed drops... because it's, you know, it's for our eyes. But Ed dropped a Simpsons reference in his section title, one of the great Simpsons references: chimpan-A to chimpan-Z. It was so great. That was from the Planet of the Apes musical, right? That's right. And I just, I like to think, is that a little gift from Ed to us?
Yeah, it definitely was, and it was well received too. So thanks, Ed. Um, and the reason Ed created a section called "chimpan-A to chimpan-Z" is because we're gonna talk about the lineage of humanity, like where humans came from. Um, and despite that hilarious and clever, um, section title, we did not actually evolve from chimpanzees, but we do share a common ancestor with chimpanzees. So chimps and humans split off from a shared ancestor about six to eight million years ago, um, and that really kicked off a long line, very long process, um, of evolution where intelligence started to develop fairly early on. It just was really slow to start, and then over time it kind of picked up speed. Yeah, you found this kind of cool statistic. There's a researcher and writer named Richard Leakey, and, uh, Richard, and I think most people, agree.
They posit that there was what's called a big bang of human culture around the Upper Paleolithic time period, where things were, like you said, slow going for so long, and things were measured in, in eras before that, very, very slowly, over like hundreds of millennia. And then all of a sudden, like sixty to thirty thousand years ago or so, things started to really ramp up in terms of innovation and intelligence and, uh, just really moving the ball forward, to use a football metaphor. And we're talking about, you know, clothing and, uh, social structure and, and art and creativity and stuff like that. So it's kind of cool to think... and you know, we're going to talk about why that might have happened, but, uh, the fact that that did happen got us on the moon in short order over the last few thousand years. It's kind of like, if you look at the development of intelligence as a train that's starting from a stop, it starts out with kind of a chug... chug... chugga-chugga-chugga, and then that, um, that Upper Paleolithic Revolution, the big bang of culture...
That's the choo-choo part that really punctuates the whole thing. I was thinking more along the lines of it like a Japanese bullet train. But sure. I don't think we're... OK, we still do some really stupid stuff. So... We can also create a bullet train. We can, but we just can't be the bullet train, intellectually. Oh man, mind blown. So, Chuck, the fact that thirty to sixty thousand years ago there was that Upper Paleolithic Revolution where humanity just suddenly blossomed into what we recognize today as humanity, it's really tempting to think that human intelligence was just suddenly born, all of a sudden, like, geologically speaking, overnight at that time. But that's just not the case. Um, it seems like something definitely happened there, like some wire connected with another wire that really made a big difference. But instead, again, it was part of this very long line of seemingly random and unconnected, um, developments in the, in the history of humanity, and I guess our, our genus Homo, um, that led to that point, and actually led to this point today, because we're still evolving and developing. Yeah, I guess, uh...
If you look at it on a timeline, it looks like a mechanic came along and said, well, here's your problem. You forgot to plug... you forgot to plug it in. That's right, you gotta plug these two wires together and then you're all set. Totally. But we like to talk about Homo sapiens in, in terms of human intelligence for good reason. Homo sapiens, that is to say, us, a.k.a. modern humans, evolved about three hundred thousand years ago, um. But we are just one of a collection of, uh, in, in this big lovely family called the hominins. Yeah, hominins. Homin-ins? Yeah, I think it's -ins. Yeah. So the hominins are everybody that, um, that started off branching from that common ancestor with chimps. That's the hominin line, and humans, and our genus Homo that Homo sapiens are a part of, is just part of that hominin line. There are other entirely different genera that make up the hominin line, right? That's right. And we should point out that sapiens actually is taken from the Latin word for knowledge, so it kind of all makes sense. It does.
So the whole thing starts out, it seems like, as far back as we can tell, um, somewhere around six or so million years ago, there was a group of hominins called Ardipithecus, um, who basically walked upright. But that was essentially the big difference between them and chimpanzees. But as we'll see, that was a really, really big difference. Right, yeah, I mean, we'll, we'll get into this in more detail, but obviously if you're walking upright, then you have a very important thing at your disposal, which is use of your hands. Right. So then, um, you've got Australopithecus, uh, and some, a few other different kinds of branches that kind of branch off. It's a really tangled, convoluted family tree, um, where some kind of lead to blind alleys, others lead to others.
But, um, they think that Australopithecus was a really big, long-lasting group that was a little more human, definitely more human than Ardipithecus, um, but not quite as human as the genus Homo, um, which branched off into all of these different species of human. Because we're, we're alive today, we're on, you know, planet Earth, living here, and every single human alive is a member of the same species. So like, there's different kinds of cats, there's different kinds of fish species, there's different kinds of bird species. Um, there's only one kind of human species. But that wasn't always the case. There were plenty of different human species, some living alongside one another for tens or hundreds of thousands of years. That's right. And almost all the hominins used tools, it seems like, and made tools. And for a long time we thought that that was sort of it, that only the Homo genus was the one who did use tools, which is... and you know, we talk about things like being bipedal and using tools as sort of some of the building blocks of what would become human intelligence.
But now we know that there are, uh, some older... you know, we found evidence that they used tools before that. Uh, that's kind of fairly recent, right? Yeah. We, we wanted to say that toolmaking started sometime after the Homo genus showed up a couple million years ago, um, but we found even older tools. So it seems like Australopithecus, which again, they're hominins, they're part of the branch that led to us humans, but they're not human in any way, shape, or form, um. So the fact that they were using tools was kind of mind-blowing, and it also really kind of undermined, kind of like what you were saying, like, our idea of using tools, like, that's a big sign of intelligence, and, and humans are intelligent. So it's weird to find out that non-humans were using tools millions of years ago. That's right. Um, should we move on to the hardware-software thing? Yeah. So if, if tools and fire are not... um, because we found use of fire dating back at least a million years, um...
So if, if tools, fire, hanging out with one another collectively, if these aren't, like, the indicators that make human intelligence, we've got to, like, get a little more granular. Unfortunately for you and I sitting here today, Chuck, um, scientists have done that, and they've come up with some really interesting, like, ways of looking at this. Yeah, and um, there's a bit more of a preamble before we actually get to the intelligence, um. And I like the way Ed put this, sort of like talking about hardware versus software. Um, they were very intertwined and, um, you know, sort of happening at the same time. So it's not like one couldn't happen without the other, as far as the hardware-software thing goes. But, uh, if we're looking at hardware, we're talking about, um, changes that made us, like, better at walking upright. Like, you can't all of a sudden just stand up and start walking; like, this happens over a long, long period of time. Our hind legs got longer, uh, the shape of our pelvis changed, um...
There's something called the foramen magnum, which is a hole in the base of the skull where the spinal cord and lots of nerves and things pass, like, sort of open up those neural pathways, and that changed its location. So these literal physical changes are happening over great periods of time in order just to be able to walk upright. Right, and, and bipedalism is like the defining characteristic of hominins, right? There's, there's not really, as far as I can tell, any other animals that, that walk upright like that by default. Um, so there had to be physiological changes, but they're not entirely certain why we started walking upright. But the fact that we did, and it's lasted for this long, means that there was some advantage to it, because enough people walking upright were able to pass along their genes. And they think one, one big theory is that it helped us survive climate change, where maybe things got colder and there were less trees. So since we weren't arboreal anymore, we didn't hang out and live and eat in trees.
We were able to kind of move around and find different, like, food sources and different shelters, whereas, like, our cousins the chimps were in big trouble. They were up the proverbial creek. Yeah, and I love that you point out that we, uh, we're the only ones who walk upright by default, because I think we can all agree there's nothing more fun than YouTube videos of, like, a dog or a cat or something just walking on its hind legs for some reason. Definitely, and, and it's the coolest and most fun thing ever. It totally is. And I mean, now, I'll take, like, a Jesus lizard running across some water once in a while too. There's nothing wrong with watching that. Is that what those are called? Yeah. I think I saw one of those in Mexico, but that was surprising. Are they not there? I don't know, I'm just saying I've, I've never seen one in real life, so I'm sure that was surprising to see. I saw a lizard that was walking, and it wasn't walking on water, but I think it was one of those kinds that can. I just didn't know the name of it.
Had you recently eaten the worm from the bottom of the, the bottom of the mezcal bottle? Very funny. Another thing we should point out is that walking upright is an energy saver. I mean, they've done studies and they found that, um, you use a fraction of the energy, um, rather than, you know, bounding around on all fours like a chimp, right, or like a chimp does. But to save that energy, to conserve it, um, our pelvis has had to change shape, like you mentioned. That was just a consequence of walking upright, um. And the reason it changed shape is, when you walk upright, if you're a chimp, your body swings side to side. You have to hold your arms up to balance yourself. That takes a lot of energy, um. So we developed, like, gluteal muscles and other muscles that can cling to a specifically shaped and sized pelvis so that we don't have to spend all the energy. Our muscles are just kind of keeping us much more balanced.
But one of the consequences of that, of walking upright and our pelvis changing, means that the size of the birth canal, afforded by the hole in the pelvis the child passes through during birth, um, got smaller, a lot smaller. And it's really strange to think that this, the, the decrease in size of the birth canal actually was one of the factors that led to an increase in intelligence. Yeah, and you know, we should point out this is just the first of what will be a lovely cascade of theories that we're gonna lay, lay on your brains today. Uh, and that, like you said earlier, there, there is no single one. It's kind of, when you put all of this stuff together... I think that's sort of the beauty of, of human intelligence, is it took all of these great things sort of coalescing, um. But the whole thing with the brain is interesting, because the size of the brain is, is one of nature's kind of controversies. Like, we know that as far as humans go, just because you have a bigger brain doesn't mean that you definitely will be smarter.
But there are some correlations across species in nature, uh, and in humans there can be, you know, evidence that a bigger brain means you're more intelligent. But it's not one of those things where it's settled science, where they just say, hey, if you've got a bigger brain, you're gonna be smarter. No, and in fact, like, there's all sorts of evidence in nature that suggests that's not the case. Because our brain-to-body size ratio among humans is one to forty, so our brain makes up about two and a half percent of our body mass, and that's the same ratio that a mouse has. And mice, I don't care how you cut it, they're just not as intelligent as humans. But on the other hand, an elephant's brain-to-body ratio is one to five hundred and sixty, and elephants are super smart. So, um, you, you can't really find much there that says... you know, there's no direct correlation where it's like, the bigger the brain, the more intelligent the being, um. But there does have to be some minimum amount of brain size, because it seems like the connections of the brain, as we'll see, are what really matter.
And the more brain tissue you have, up to a certain point, the more connections that can be made. Right. So that brings us back to the birth canal situation. Like you mentioned, you're walking upright, that changes the shape of the pelvis, you have a much smaller birth canal all of a sudden. So, evolutionarily speaking, you might think, well, does that mean we're going to have to have babies with tiny, tiny little heads, and therefore tiny, tiny little brains that may not be able to grow very fast because they're enclosed in a skull that's sort of locked down? But that didn't happen to us. What happened was, we have fontanelles, and we have this delayed fusing of the skull, kind of, you know, closing for good. And so it allows... and it's, you know, it's remarkable still to think about this, to me, but it allows that little baby head to squish down to get through the birth canal, and through the vagina, and out into the world, and stay that way for a while.
And it's during that "for a while" period, before that skull completely fuses, that a human brain really, really grows a lot, and chimps don't have that ability. No, a chimp, uh, their skull fuses mostly in the womb, and their brain, as a consequence, grows mostly to whatever size it's going to reach in the womb. So, on the one hand, a chimp baby, you could say, is much smarter and much less helpless than a human baby. But given enough time, the human baby is going to start to exceed the chimp's abilities very quickly. And it's because our development is delayed. We do a lot of developing outside of the womb, and that's afforded by that skull that's not fused for a couple of years after birth. And this was not... there is no intelligent design, so this was not, like, um, like a good solution or workaround. This was just a naturally selected trait, the skull not fusing. That was a solution to the smaller birth canal, not, not to increase intelligence. But the advent of babies being born that didn't have fused skulls allowed for the advent of intelligence.
Yeah, a solution to the problem of walking upright, which is really interesting to think about. Yeah, and it also just goes to show, like, it's like nature is not always, like, elegantly simple. Sometimes it's really convoluted, and organisms, including us, are held together by, like, duct tape and bubblegum, you know. And that's a good example of it. I think that's a good time for a break. Yeah, yeah. And we'll come back and drop some plasticity on your brain right after this. So, Chuck, this is the point where we're about to talk about brain plasticity. This seems to be, uh, what, if anything, explains human intelligence, and certainly the burst of intelligence that happened thirty to sixty thousand years ago. Yeah, and I think the opening statement to this whole thing is, all you gotta do is look at the fact that we learn almost everything as humans, like, from the moment we're born. There, there is some, maybe, instinctive knowledge, but like you said, like, human babies are kind of helpless little dumb-dumbs.
And from that point forward, our brains are, are learning and they're growing, and they're capable of learning, and they're capable of adapting. And this all has to do with plasticity. Right. So, just if you aren't familiar, plasticity is the brain's ability to, um, basically rewire and create new connections as new experiences come along. Uh, and you can even take old experiences that you experience more than once, and the second and third and fourth time, those neural connections are going to become more sophisticated and more connected than they were before. So our brains are plastic. They can be molded and shaped, kind of like in the rhinoplasty, um, sense of the word. They're not made of plastic. They can be molded, and they're molded by the connections that they make. So it's not necessarily that you have a giant brain. It's that you, human being, have a brain that is really highly capable of creating new connections, and it's those connections that form the basis of intellect.
Yeah, and that 329 00:19:53,320 --> 00:19:55,840 Speaker 1: really frees up, Like once you have a brain that's 330 00:19:55,880 --> 00:19:59,640 Speaker 1: plastic and that can evolve, you know, to figure out 331 00:19:59,680 --> 00:20:03,359 Speaker 1: a problem rather than taking eons and eons to 332 00:20:04,000 --> 00:20:07,840 Speaker 1: have like genetic to genetically adapt to a solution to 333 00:20:07,880 --> 00:20:09,760 Speaker 1: a problem. If all of a sudden you have a 334 00:20:09,760 --> 00:20:11,639 Speaker 1: brain that can figure something out, you do it so 335 00:20:11,720 --> 00:20:14,640 Speaker 1: much quicker, and that frees you up to do more 336 00:20:14,800 --> 00:20:18,840 Speaker 1: and learn more, and it creates this feedback loop all 337 00:20:18,840 --> 00:20:21,399 Speaker 1: of a sudden where the process really really speeds up. 338 00:20:21,440 --> 00:20:25,000 Speaker 1: And that's you know, basically what we saw thirty to sixty 339 00:20:25,359 --> 00:20:27,920 Speaker 1: thousand years ago. Yeah, and we're still seeing 340 00:20:27,920 --> 00:20:30,119 Speaker 1: it today, Chuck. I mean, like you know, thirty 341 00:20:30,200 --> 00:20:32,399 Speaker 1: to sixty thousand years ago, it was a huge burst 342 00:20:32,440 --> 00:20:36,760 Speaker 1: of creativity and intelligence, but we're still talking about changes 343 00:20:36,800 --> 00:20:39,520 Speaker 1: that took place over thousands of years. Now. We're seeing 344 00:20:39,720 --> 00:20:43,920 Speaker 1: changes to the human condition in our society that take 345 00:20:43,920 --> 00:20:46,760 Speaker 1: place over like tens of years. So it still seems 346 00:20:46,800 --> 00:20:48,560 Speaker 1: to be speeding up and we're still going through the 347 00:20:48,600 --> 00:20:51,679 Speaker 1: same process.
But the I guess the best way to 348 00:20:51,720 --> 00:20:56,520 Speaker 1: think about what you've just described is evolution, which typically 349 00:20:56,680 --> 00:21:02,160 Speaker 1: you know, UM forces changes on us based on environmental conditions, 350 00:21:02,680 --> 00:21:06,200 Speaker 1: goes into the brain, and now it's the brain that's 351 00:21:06,240 --> 00:21:08,919 Speaker 1: able to change, and like you said, it changes much faster, 352 00:21:09,600 --> 00:21:13,240 Speaker 1: and that leaves genetic evolution or genetic natural selection to 353 00:21:13,600 --> 00:21:17,879 Speaker 1: focus on UM selecting for traits that create more and 354 00:21:17,960 --> 00:21:20,800 Speaker 1: more intelligence. So it creates that positive feedback loop like 355 00:21:20,800 --> 00:21:23,879 Speaker 1: you said, and speeds things up. It's pretty brilliant. So 356 00:21:23,960 --> 00:21:28,159 Speaker 1: there's been a lot of really interesting research, especially in 357 00:21:28,160 --> 00:21:30,360 Speaker 1: what seemed like the early to mid nineteen nineties 358 00:21:30,400 --> 00:21:33,480 Speaker 1: about plasticity UM. There are a couple of researchers named 359 00:21:34,000 --> 00:21:38,359 Speaker 1: Tooby and Cosmides, great names, and they had a 360 00:21:38,480 --> 00:21:43,119 Speaker 1: theory basically that human intelligence UH evolved with all these 361 00:21:43,760 --> 00:21:47,840 Speaker 1: UM encapsulated cognitive modules, so they did not have the 362 00:21:47,880 --> 00:21:51,520 Speaker 1: ability to access each other, each each of these modules, 363 00:21:52,200 --> 00:21:55,480 Speaker 1: and each one was very specialized for a very specialized 364 00:21:56,080 --> 00:21:58,320 Speaker 1: problem or task that it was trying to do or problem 365 00:21:58,320 --> 00:22:03,920 Speaker 1: that it was trying to solve, and that's like a language module, UH, 366 00:22:04,160 --> 00:22:07,800 Speaker 1: spatial relation module.
UH, here's how to make and use 367 00:22:07,840 --> 00:22:10,640 Speaker 1: a tool that kind of a module. And that all 368 00:22:10,680 --> 00:22:13,920 Speaker 1: these modules are still around, uh and in basically the same 369 00:22:13,960 --> 00:22:18,919 Speaker 1: form that they were back then, because there on 370 00:22:18,960 --> 00:22:22,199 Speaker 1: the timeline of you know, humanity, there hasn't been 371 00:22:22,240 --> 00:22:26,320 Speaker 1: a lot of time to undergo any kind of modifications basically, 372 00:22:26,720 --> 00:22:28,879 Speaker 1: So I disagree with that. I think, I think 373 00:22:29,000 --> 00:22:33,199 Speaker 1: speaking about classic evolution natural selection, that's true, but 374 00:22:33,440 --> 00:22:37,600 Speaker 1: brain based evolution and natural selection like cultural natural selection, 375 00:22:37,640 --> 00:22:42,200 Speaker 1: I think that that's false. You hear that, Tooby? So. 376 00:22:42,359 --> 00:22:45,359 Speaker 1: The the idea about all this is that these modules 377 00:22:45,359 --> 00:22:47,960 Speaker 1: that we developed over time is like we came upon 378 00:22:48,080 --> 00:22:51,360 Speaker 1: new problems in our environments and had to figure out 379 00:22:51,400 --> 00:22:54,520 Speaker 1: new solutions to them. They started to kind of get 380 00:22:54,560 --> 00:22:59,000 Speaker 1: cross referenced here and there, Like you could say, um, you know, 381 00:22:59,119 --> 00:23:05,000 Speaker 1: the same the same ability to UM to to follow 382 00:23:05,080 --> 00:23:08,080 Speaker 1: the sunset, right, Yeah, I wish I would have come 383 00:23:08,160 --> 00:23:11,320 Speaker 1: up with something better can also can also be used 384 00:23:11,359 --> 00:23:16,320 Speaker 1: to UM to follow herds of game, right.
And so 385 00:23:16,480 --> 00:23:18,600 Speaker 1: all of a sudden we now not only just know 386 00:23:18,760 --> 00:23:21,200 Speaker 1: to to follow the sunset if we want to follow 387 00:23:21,240 --> 00:23:23,120 Speaker 1: the sunset, we also know we can use that same 388 00:23:23,160 --> 00:23:25,840 Speaker 1: ability to follow game around. And all of a sudden, 389 00:23:25,840 --> 00:23:29,080 Speaker 1: our diet expands that kind of thing. So as these 390 00:23:29,119 --> 00:23:33,560 Speaker 1: different modules started cross referencing themselves and and got more 391 00:23:33,600 --> 00:23:36,760 Speaker 1: and more connected. We were able to apply these different 392 00:23:36,800 --> 00:23:39,199 Speaker 1: things to more and more situations and got more and 393 00:23:39,240 --> 00:23:43,560 Speaker 1: more intelligent. All right, So that's UM. That's one sort 394 00:23:43,600 --> 00:23:48,280 Speaker 1: of grand theory which I love. Another one that we're 395 00:23:48,280 --> 00:23:51,880 Speaker 1: going to talk about is I think super interesting because 396 00:23:51,880 --> 00:23:55,879 Speaker 1: some of this stuff is so kind of rudimentary in 397 00:23:55,920 --> 00:23:57,720 Speaker 1: it when you just sort of look at it from 398 00:23:57,720 --> 00:24:00,359 Speaker 1: a macro view, but when you really have to think 399 00:24:00,400 --> 00:24:04,160 Speaker 1: about how important that ended up being, it's it's fascinating 400 00:24:04,200 --> 00:24:06,800 Speaker 1: to me. And in this case, we're talking about the 401 00:24:06,800 --> 00:24:10,399 Speaker 1: fact that UM. One of the sort of side and 402 00:24:10,400 --> 00:24:12,680 Speaker 1: again it's going back to bipedalism. 
One of the side 403 00:24:12,680 --> 00:24:17,120 Speaker 1: effects of bipedalism is that we lost our ability, UM, with 404 00:24:17,119 --> 00:24:19,679 Speaker 1: our feet to be able to like hang onto things 405 00:24:20,200 --> 00:24:22,879 Speaker 1: like chimpanzees do. They were these boots were all of 406 00:24:22,920 --> 00:24:26,320 Speaker 1: a sudden made for walking, and they weren't made for grabbing. 407 00:24:26,960 --> 00:24:30,200 Speaker 1: And if they weren't made for grabbing, then you couldn't 408 00:24:30,240 --> 00:24:33,800 Speaker 1: hang on to mama like a chimp could with hands 409 00:24:33,840 --> 00:24:37,440 Speaker 1: and feet. So mama had to hang on to human baby. 410 00:24:37,640 --> 00:24:40,000 Speaker 1: And mama can't hang on to human baby all the 411 00:24:40,000 --> 00:24:43,000 Speaker 1: time because mama still has to get things done around 412 00:24:43,080 --> 00:24:46,439 Speaker 1: the you know, savannah. So what you have to do 413 00:24:46,520 --> 00:24:50,320 Speaker 1: then is leave that baby somewhere and go do stuff 414 00:24:50,359 --> 00:24:53,080 Speaker 1: like go down to the river and uh and do 415 00:24:53,280 --> 00:24:57,359 Speaker 1: things and uh. 416 00:24:58,280 --> 00:25:00,560 Speaker 1: You know, 417 00:25:01,119 --> 00:25:05,400 Speaker 1: uh, and if you leave 418 00:25:05,440 --> 00:25:07,719 Speaker 1: that baby, and this is all leading to this statement, 419 00:25:07,760 --> 00:25:09,800 Speaker 1: if you leave that baby somewhere, you want to be 420 00:25:09,840 --> 00:25:11,800 Speaker 1: able to go back and find that baby.
And it 421 00:25:11,840 --> 00:25:15,240 Speaker 1: seems so rudimentary and basic, but that is a huge 422 00:25:15,920 --> 00:25:19,159 Speaker 1: thing in the development of the early human brain, is 423 00:25:19,560 --> 00:25:23,199 Speaker 1: simply to spatially map and remember, like where I have 424 00:25:23,320 --> 00:25:25,840 Speaker 1: left this child. It's important to go back and get 425 00:25:25,880 --> 00:25:30,600 Speaker 1: that child, and I can do that. Yeah, and then um. 426 00:25:31,240 --> 00:25:36,200 Speaker 1: Consequently to that, another adaptation seems to have arisen from 427 00:25:36,280 --> 00:25:38,840 Speaker 1: the same problem, the problem of not being able to 428 00:25:38,880 --> 00:25:41,560 Speaker 1: cling to the mother anymore, and then also the problem 429 00:25:41,600 --> 00:25:44,440 Speaker 1: of the baby being otherwise helpless, way more helpless than 430 00:25:44,440 --> 00:25:47,720 Speaker 1: a baby chimp. Right, So they think that around the 431 00:25:47,760 --> 00:25:51,800 Speaker 1: same time baby's cries developed, like you don't hear other 432 00:25:51,840 --> 00:25:54,640 Speaker 1: things necessarily crying like a human baby, And they don't 433 00:25:54,680 --> 00:25:57,560 Speaker 1: think that babies cried like that until around this time, 434 00:25:57,560 --> 00:26:00,000 Speaker 1: because there was that problem. So even if a mom 435 00:26:00,040 --> 00:26:02,440 Speaker 1: couldn't remember where she put her baby, she could listen 436 00:26:02,480 --> 00:26:05,600 Speaker 1: out for the baby crying. 
And they also think around 437 00:26:05,640 --> 00:26:09,040 Speaker 1: this time that an urge or desire to soothe the 438 00:26:09,080 --> 00:26:12,880 Speaker 1: baby from its crying would have developed, and that it's possible, Chuck, 439 00:26:12,920 --> 00:26:16,560 Speaker 1: And this makes so much awesome sense that language actually 440 00:26:16,560 --> 00:26:20,080 Speaker 1: developed out of what's called motherese, that kind of 441 00:26:20,200 --> 00:26:23,960 Speaker 1: soothing baby talk that calms the baby, that mothers know 442 00:26:24,040 --> 00:26:27,320 Speaker 1: how to do just naturally. They think that it's possible 443 00:26:27,359 --> 00:26:31,680 Speaker 1: that that is what formed the basis of language. Yeah, 444 00:26:31,680 --> 00:26:34,280 Speaker 1: and I'm gonna go beyond that even because what I 445 00:26:34,359 --> 00:26:38,000 Speaker 1: noticed when I had a baby in the house was that, 446 00:26:38,320 --> 00:26:41,879 Speaker 1: even beyond the soothing thing, if you are holding your 447 00:26:41,880 --> 00:26:43,720 Speaker 1: baby and you have to put your baby down to 448 00:26:43,720 --> 00:26:47,320 Speaker 1: go wash the dishes or whatever, generally, and I think 449 00:26:47,359 --> 00:26:49,680 Speaker 1: I speak for most parents, you don't just go set 450 00:26:49,680 --> 00:26:51,679 Speaker 1: your baby down, go in the other room and do stuff. 451 00:26:52,040 --> 00:26:54,960 Speaker 1: You're you're talking to that baby from the very beginning, 452 00:26:55,480 --> 00:26:57,000 Speaker 1: and you're saying, all right, let's go over to our 453 00:26:57,040 --> 00:26:59,240 Speaker 1: little place here. I'll be right back.
I'm gonna be 454 00:26:59,320 --> 00:27:01,080 Speaker 1: right here in the other room. That baby doesn't know 455 00:27:01,080 --> 00:27:04,439 Speaker 1: what you're talking about, obviously, but there's there just seems 456 00:27:04,480 --> 00:27:07,879 Speaker 1: to be this evolutionary instinct to to say things to 457 00:27:08,000 --> 00:27:11,040 Speaker 1: it right out of the gate. It's really interesting. It 458 00:27:11,119 --> 00:27:14,000 Speaker 1: is interesting, and then wrapped up in this also there's 459 00:27:14,040 --> 00:27:17,040 Speaker 1: a better example than following the stupid sunset that I 460 00:27:17,080 --> 00:27:19,960 Speaker 1: came up with. I love sunsets. But if you can 461 00:27:20,480 --> 00:27:23,520 Speaker 1: now all of a sudden like remember where landmarks are 462 00:27:23,520 --> 00:27:26,399 Speaker 1: and then wayfind your your way back to you know, 463 00:27:26,600 --> 00:27:29,520 Speaker 1: the starting point. Now you can start to use that 464 00:27:29,560 --> 00:27:33,359 Speaker 1: to follow game further and further afield, and you're expanding 465 00:27:33,400 --> 00:27:35,920 Speaker 1: your range and you can expand your diet. So that's 466 00:27:35,960 --> 00:27:37,920 Speaker 1: a really good example of one thing kind of leading 467 00:27:37,960 --> 00:27:41,920 Speaker 1: to another, and all of it being um arising from 468 00:27:42,400 --> 00:27:48,320 Speaker 1: environmental pressures brought on by the change to bipedalism. Yeah, 469 00:27:48,440 --> 00:27:52,520 Speaker 1: I love it me too. Um. This next one 470 00:27:52,880 --> 00:27:54,960 Speaker 1: kind of fits in a little bit with plasticity 471 00:27:55,520 --> 00:27:59,240 Speaker 1: I think. Uh. The idea of the cognitive niche um, 472 00:27:59,280 --> 00:28:03,160 Speaker 1: which is, you know, typically figuring out like a solution 473 00:28:03,200 --> 00:28:07,320 Speaker 1: to a problem.
But this theory is that maybe intelligence 474 00:28:07,320 --> 00:28:13,160 Speaker 1: evolved as a universal adaptation to all kinds of uh, 475 00:28:13,440 --> 00:28:17,320 Speaker 1: evolutionary pressures that were bearing down. So um and Ed 476 00:28:17,359 --> 00:28:19,520 Speaker 1: has a pretty great example. If you've got an 477 00:28:19,560 --> 00:28:23,080 Speaker 1: island with a tree that has a certain um fruit 478 00:28:23,520 --> 00:28:26,600 Speaker 1: seed that's really beneficial for your body, Um, but you 479 00:28:26,640 --> 00:28:29,840 Speaker 1: can't crack into it. There's you know, it would take 480 00:28:29,880 --> 00:28:33,480 Speaker 1: a bird, uh, you know, hundreds to thousands or millions 481 00:28:33,480 --> 00:28:36,760 Speaker 1: of years to develop and evolve to have a beak 482 00:28:36,840 --> 00:28:39,719 Speaker 1: that can crack that thing open. But if all of 483 00:28:39,720 --> 00:28:42,240 Speaker 1: a sudden, you know how to make a tool, you 484 00:28:42,280 --> 00:28:44,400 Speaker 1: can just walk over and steal that thing from that 485 00:28:44,480 --> 00:28:47,200 Speaker 1: dumb bird and just crack it open with the tool. 486 00:28:47,520 --> 00:28:50,640 Speaker 1: So it's not a bill, it's your brain at work, and 487 00:28:50,680 --> 00:28:53,640 Speaker 1: in that case it's filling a specific niche. But that's 488 00:28:53,640 --> 00:28:56,440 Speaker 1: a tool that was also used to kill the animal 489 00:28:56,600 --> 00:28:59,000 Speaker 1: or chop the wood.
Yeah, and that really supports what 490 00:28:59,040 --> 00:29:02,240 Speaker 1: we were talking about a few minutes ago that once evolution, 491 00:29:02,400 --> 00:29:05,240 Speaker 1: once a brain is evolved to a certain amount of intellect, 492 00:29:05,600 --> 00:29:09,440 Speaker 1: the brain can take care of the organism and natural 493 00:29:09,480 --> 00:29:12,800 Speaker 1: selection and genetics can kind of take a step back 494 00:29:13,200 --> 00:29:17,480 Speaker 1: and not have to say, like, um, you know, uh, 495 00:29:17,600 --> 00:29:21,320 Speaker 1: select for a thicker, hairier chest because we're living in 496 00:29:21,360 --> 00:29:23,920 Speaker 1: a colder time now, because the brain can come up 497 00:29:23,960 --> 00:29:26,720 Speaker 1: with a way to create a coat, right, So it 498 00:29:26,840 --> 00:29:31,320 Speaker 1: just kind of takes over evolution from evolution by doing that, 499 00:29:31,400 --> 00:29:35,160 Speaker 1: and that's that cognitive niche. And one of the consequences 500 00:29:35,200 --> 00:29:38,320 Speaker 1: of it is that there seems to be as as 501 00:29:38,360 --> 00:29:42,400 Speaker 1: things change in our environment, we figure out new ways 502 00:29:42,440 --> 00:29:47,160 Speaker 1: to to solve those, and then those those solutions are 503 00:29:47,200 --> 00:29:51,440 Speaker 1: inevitably going to create other problems or changes, So then 504 00:29:51,480 --> 00:29:54,680 Speaker 1: we have to we have to evolve even further intelligence 505 00:29:54,720 --> 00:29:57,000 Speaker 1: to figure out how to solve these new problems.
You 506 00:29:57,040 --> 00:29:59,440 Speaker 1: can actually see it still going on today, Chuck, Like, 507 00:30:00,080 --> 00:30:04,040 Speaker 1: we've evolved a level of intelligence where we can extract 508 00:30:04,360 --> 00:30:08,720 Speaker 1: petroleum from the earth, we can build machines that run 509 00:30:08,800 --> 00:30:12,440 Speaker 1: on that petroleum, and we can develop science that figures 510 00:30:12,440 --> 00:30:16,600 Speaker 1: out that burning that petroleum is really really 511 00:30:16,600 --> 00:30:20,680 Speaker 1: bad for the climate. So now we've we've altered our 512 00:30:20,720 --> 00:30:24,600 Speaker 1: ecosystem enough that we have to evolve intelligence enough to 513 00:30:24,720 --> 00:30:27,520 Speaker 1: figure out how to get out of this new conundrum 514 00:30:27,520 --> 00:30:30,960 Speaker 1: that we've created for ourselves based on our previous intellect. 515 00:30:30,960 --> 00:30:34,880 Speaker 1: So intellect builds on intellect through environmental pressures that we 516 00:30:34,960 --> 00:30:38,680 Speaker 1: often bring on ourselves. That seems like the case, Yeah, 517 00:30:39,000 --> 00:30:40,440 Speaker 1: I mean, I think a lot of people seem to 518 00:30:40,480 --> 00:30:43,520 Speaker 1: think of intelligence as only solving problems, but it also 519 00:30:43,640 --> 00:30:48,840 Speaker 1: creates a lot of problems as well. It's interesting, Yeah, it really is. Um.
520 00:30:48,880 --> 00:30:51,840 Speaker 1: I know we were gonna skip this section entirely, but 521 00:30:51,920 --> 00:30:55,440 Speaker 1: I think just for funsies, we should very quickly mention one, 522 00:30:56,280 --> 00:31:01,440 Speaker 1: uh the ideas from someone called Terence McKenna who's described 523 00:31:01,480 --> 00:31:04,560 Speaker 1: as a postmodern Timothy Leary type, one of these 524 00:31:04,560 --> 00:31:08,959 Speaker 1: people that that advocates for psychedelic drug use, and just 525 00:31:09,040 --> 00:31:13,840 Speaker 1: very quickly the idea is that, uh, the cavemen were 526 00:31:13,840 --> 00:31:18,360 Speaker 1: tripping on mushrooms and that's how intelligence evolved. And I 527 00:31:18,480 --> 00:31:21,360 Speaker 1: just like mention it because I feel like there's almost nothing, 528 00:31:22,480 --> 00:31:26,840 Speaker 1: no leap in history that some person hasn't said, like 529 00:31:26,920 --> 00:31:29,920 Speaker 1: the Enlightenment or whatever, like, oh man, they were just tripping, right, 530 00:31:30,560 --> 00:31:34,480 Speaker 1: they were just super high. I don't know, I think it's pretty funny. 531 00:31:34,560 --> 00:31:36,440 Speaker 1: It is funny, but it does. I mean, like if 532 00:31:36,440 --> 00:31:41,040 Speaker 1: you apply it exclusively to the Upper Paleolithic Revolution, where 533 00:31:41,040 --> 00:31:43,080 Speaker 1: all of a sudden there's like art and jewelry and 534 00:31:43,160 --> 00:31:47,600 Speaker 1: dancing and that stuff, it's possible that it was based at 535 00:31:47,680 --> 00:31:50,680 Speaker 1: least in part you know, yeah, you never know, alright, 536 00:31:50,720 --> 00:31:52,240 Speaker 1: So I say, we take a break and we come 537 00:31:52,280 --> 00:31:54,880 Speaker 1: back and get down to the nitty gritty of how 538 00:31:55,000 --> 00:31:58,720 Speaker 1: food might have brought intelligence along. How about that? Mm hmm.
539 00:32:19,800 --> 00:32:21,960 Speaker 1: By the way, Chuck, I have a theory real quick 540 00:32:23,080 --> 00:32:26,560 Speaker 1: that the more you say uh or um, the more 541 00:32:26,600 --> 00:32:30,560 Speaker 1: intelligent you um are. Oh boy, I must be a 542 00:32:30,560 --> 00:32:35,400 Speaker 1: smarty pants. You know, other more intelligent podcasts cut all 543 00:32:35,400 --> 00:32:40,160 Speaker 1: that stuff out. Yeah, I guess. Do they? Yeah? I mean, 544 00:32:40,320 --> 00:32:43,000 Speaker 1: what did I mispronounce in the Row episode? I couldn't 545 00:32:43,040 --> 00:32:48,480 Speaker 1: even say substantive. Oh yeah, yeah, it's true. Other podcasts would 546 00:32:48,480 --> 00:32:51,000 Speaker 1: not have left that in. But that's because 547 00:32:51,000 --> 00:32:55,560 Speaker 1: they're not as brave. We're courageous podcasters. Okay, I'll 548 00:32:55,600 --> 00:32:58,920 Speaker 1: buy that. Um. This next theory I think is super 549 00:32:58,920 --> 00:33:03,920 Speaker 1: cool because anytime you tie in um, not just food, 550 00:33:04,000 --> 00:33:09,000 Speaker 1: but sort of a an appreciation for a creature comfort, 551 00:33:09,960 --> 00:33:14,200 Speaker 1: it really like turns turns me on and not you know, 552 00:33:15,400 --> 00:33:19,280 Speaker 1: in certain ways intellectually turns me on. And in this case, 553 00:33:19,320 --> 00:33:23,800 Speaker 1: we're talking about the fact that we used fire obviously 554 00:33:23,920 --> 00:33:30,360 Speaker 1: and then started cooking food, and people that cooked food said, wow, 555 00:33:30,840 --> 00:33:34,280 Speaker 1: this is really good, and this tastes a whole lot 556 00:33:34,360 --> 00:33:37,240 Speaker 1: better than that raw meat we've been eating. This charred 557 00:33:37,240 --> 00:33:40,760 Speaker 1: meat is delicious, and let's let's try and do more 558 00:33:40,800 --> 00:33:43,760 Speaker 1: of this around here.
Yeah, And so that would have 559 00:33:43,880 --> 00:33:49,320 Speaker 1: just been them, um, responding to like a taste preference, 560 00:33:49,720 --> 00:33:52,200 Speaker 1: and that's it. But it just so happened that that 561 00:33:52,280 --> 00:33:55,640 Speaker 1: taste preference would have had a really big benefit and 562 00:33:55,960 --> 00:33:59,320 Speaker 1: a big contribution to the development of intelligence. Because if 563 00:33:59,320 --> 00:34:02,920 Speaker 1: you cook meat um, you unlock a bunch of 564 00:34:03,000 --> 00:34:06,800 Speaker 1: nutrients and calories that are otherwise unavailable to you if 565 00:34:06,800 --> 00:34:10,040 Speaker 1: you just eat it raw. So over time, the people 566 00:34:10,120 --> 00:34:12,839 Speaker 1: who ate meat would have had more energy and more 567 00:34:12,920 --> 00:34:16,319 Speaker 1: calories to contribute to a growing brain, which could have 568 00:34:16,400 --> 00:34:19,439 Speaker 1: helped the process along if not sped it up. And 569 00:34:19,480 --> 00:34:22,480 Speaker 1: if you consider the fact that we've definitely seen that 570 00:34:22,840 --> 00:34:27,239 Speaker 1: that taste and smell has responded to evolutionary pressures in 571 00:34:27,280 --> 00:34:30,640 Speaker 1: that we at some point learned not to eat poop, 572 00:34:30,719 --> 00:34:33,920 Speaker 1: and we learned not to eat rotten food and stuff. Um, 573 00:34:33,960 --> 00:34:36,759 Speaker 1: And that's you know, taste and smell it can have 574 00:34:36,880 --> 00:34:38,520 Speaker 1: the you know, it looks like it can have the 575 00:34:38,520 --> 00:34:40,799 Speaker 1: opposite effect to where all of a sudden you have 576 00:34:40,880 --> 00:34:43,400 Speaker 1: a preference for the good. And that just happens to 577 00:34:43,440 --> 00:34:45,520 Speaker 1: work out in your favor.
Yeah, and this is another 578 00:34:45,560 --> 00:34:48,520 Speaker 1: example of one thing leading to another, where like you know, 579 00:34:48,640 --> 00:34:52,080 Speaker 1: mothers developed an awareness of landmarks and wayfinding, and then 580 00:34:52,360 --> 00:34:55,160 Speaker 1: that led to um being able to follow game, which 581 00:34:55,160 --> 00:34:57,919 Speaker 1: expanded our diet, which led to us eating meat, which 582 00:34:57,920 --> 00:35:02,560 Speaker 1: eventually led to applying controlled fire to that meat, which 583 00:35:02,640 --> 00:35:05,640 Speaker 1: led to more calories and nutrients available, which led to 584 00:35:05,800 --> 00:35:09,319 Speaker 1: bigger brain growth, which helped found uh, the growth of 585 00:35:09,320 --> 00:35:13,960 Speaker 1: intelligence in humans. Just one thing, one totally random, unconnected thing, 586 00:35:14,320 --> 00:35:20,080 Speaker 1: or even connected but seemingly unconnected, uh, just creating us today. 587 00:35:20,160 --> 00:35:23,160 Speaker 1: It's just so nuts to me. I love it. Yeah, 588 00:35:23,239 --> 00:35:25,640 Speaker 1: me too. And the fact that, like, think about this, 589 00:35:26,640 --> 00:35:29,880 Speaker 1: not only the preference for a charred meat, but the 590 00:35:29,960 --> 00:35:33,920 Speaker 1: preference for a specific charred meat because you know, different 591 00:35:33,960 --> 00:35:36,520 Speaker 1: stuff tastes different. It's not like everything tastes like chicken. 592 00:35:36,560 --> 00:35:38,720 Speaker 1: I know that's a joke, but all of a sudden 593 00:35:38,719 --> 00:35:42,480 Speaker 1: somebody's out there and and says, boy, that 594 00:35:42,600 --> 00:35:45,719 Speaker 1: one thing that we killed yesterday. You guys, I don't 595 00:35:45,719 --> 00:35:49,120 Speaker 1: know about you, but that was really really delicious.
And 596 00:35:49,239 --> 00:35:52,879 Speaker 1: we know where that is, we saw that thing three 597 00:35:52,960 --> 00:35:56,480 Speaker 1: days ago, about fifty miles away. Everyone said, what's a 598 00:35:56,520 --> 00:35:59,000 Speaker 1: mile? And he said, well, that doesn't matter right now, but 599 00:35:59,239 --> 00:36:00,880 Speaker 1: the point is that it was really far away. So all 600 00:36:00,880 --> 00:36:04,880 Speaker 1: of a sudden, other things are introduced, like cooperation, not 601 00:36:04,920 --> 00:36:07,879 Speaker 1: just wayfinding, but hey, let's let's all get together because 602 00:36:07,920 --> 00:36:10,160 Speaker 1: this is like a three day journey and this thing 603 00:36:10,239 --> 00:36:13,000 Speaker 1: is really big. That tastes so delicious, So it's gonna 604 00:36:13,040 --> 00:36:14,680 Speaker 1: take a few of us to bring this thing down 605 00:36:15,200 --> 00:36:18,840 Speaker 1: and to process this animal and get this meat ready 606 00:36:18,840 --> 00:36:22,560 Speaker 1: for cooking. So it just introduces like a cascade of things, 607 00:36:22,880 --> 00:36:25,319 Speaker 1: and it could have all just come from Hey, that 608 00:36:25,360 --> 00:36:28,440 Speaker 1: tastes really great. Yeah, and so that, you know, 609 00:36:28,480 --> 00:36:32,120 Speaker 1: all that hunting and coordinating, all that takes like 610 00:36:32,160 --> 00:36:34,879 Speaker 1: a lot of intelligence. And not only does it take 611 00:36:34,880 --> 00:36:39,000 Speaker 1: intelligence to to coordinate, it takes intelligence to explain what 612 00:36:39,040 --> 00:36:42,000 Speaker 1: you're talking about. And it takes intelligence to come up 613 00:36:42,040 --> 00:36:44,160 Speaker 1: with that plan in the first place, you know. So 614 00:36:44,520 --> 00:36:48,080 Speaker 1: all of those factors combining are just making humans more 615 00:36:48,120 --> 00:36:51,719 Speaker 1: and more intelligent with every every step.
And again, it's 616 00:36:51,760 --> 00:36:56,120 Speaker 1: not like it's just following this perfect linear progression. Yeah, 617 00:36:56,360 --> 00:37:00,080 Speaker 1: it's just like it's just kind of random. And the 618 00:37:00,080 --> 00:37:04,080 Speaker 1: the reason that we're intelligent today is because the attempts 619 00:37:04,120 --> 00:37:06,960 Speaker 1: that didn't work out got selected out, the fat got 620 00:37:06,960 --> 00:37:09,720 Speaker 1: trimmed along the way. It's kind of a ruthless 621 00:37:09,760 --> 00:37:12,759 Speaker 1: way to put it, but you know it makes sense. Uh. 622 00:37:12,760 --> 00:37:15,759 Speaker 1: And that sort of ties into this other theory of 623 00:37:16,400 --> 00:37:21,840 Speaker 1: um smaller prey, like when they were hunting um large 624 00:37:21,880 --> 00:37:25,799 Speaker 1: prey species that you know, they eventually uh, they were 625 00:37:25,880 --> 00:37:28,640 Speaker 1: hunting and tracking these these large animals and eventually 626 00:37:29,040 --> 00:37:32,160 Speaker 1: they were driven to extinction. So humans had to start 627 00:37:32,320 --> 00:37:36,239 Speaker 1: going after smaller things or I guess, uh, hominins had 628 00:37:36,280 --> 00:37:39,800 Speaker 1: to start going after smaller things, and the fossil record 629 00:37:39,840 --> 00:37:43,840 Speaker 1: indicates this. It sort of worked in lockstep with the 630 00:37:43,920 --> 00:37:47,040 Speaker 1: evolution of human intelligence. So all of a sudden, if 631 00:37:47,040 --> 00:37:50,160 Speaker 1: you're hunting smaller things, you probably have to be a 632 00:37:50,239 --> 00:37:52,080 Speaker 1: little bit smarter.
You have to be a little bit 633 00:37:52,120 --> 00:37:55,600 Speaker 1: more coordinated, you have to cooperate a little more, you 634 00:37:55,640 --> 00:37:59,480 Speaker 1: have to maybe invent new tools, and like obviously using 635 00:37:59,520 --> 00:38:02,160 Speaker 1: a big thing to smash a large thing isn't the 636 00:38:02,200 --> 00:38:05,400 Speaker 1: same as smashing a small thing. Uh. And just simply 637 00:38:05,440 --> 00:38:06,839 Speaker 1: the fact that they had to do a lot more 638 00:38:06,880 --> 00:38:09,400 Speaker 1: of it. You know, if you're eating a squirrel uh 639 00:38:09,480 --> 00:38:12,240 Speaker 1: as your diet, you're you're eating a lot of squirrel 640 00:38:12,280 --> 00:38:14,520 Speaker 1: every day, whereas if you eat a wooly mammoth, that's 641 00:38:14,520 --> 00:38:17,600 Speaker 1: your food for the month or whatever exactly. And that's 642 00:38:17,640 --> 00:38:19,680 Speaker 1: a really good example of what I was talking about earlier, 643 00:38:19,719 --> 00:38:23,279 Speaker 1: that cognitive niche where the more sophisticated we get, the 644 00:38:23,320 --> 00:38:26,520 Speaker 1: more problems we actually generate for ourselves, the more challenges, 645 00:38:26,640 --> 00:38:31,160 Speaker 1: the more intelligent we have to become. That's right. And 646 00:38:31,200 --> 00:38:33,640 Speaker 1: what about this last notion? And then I think this 647 00:38:33,719 --> 00:38:36,399 Speaker 1: is kind of where it all comes together, right, Yeah, 648 00:38:36,719 --> 00:38:39,000 Speaker 1: so you know, we have like a real urge and 649 00:38:39,040 --> 00:38:41,319 Speaker 1: a desire to to wrap everything up in a neat 650 00:38:41,360 --> 00:38:43,879 Speaker 1: little package, and we just haven't reached that point yet 651 00:38:43,880 --> 00:38:47,719 Speaker 1: with human intelligence.
But if you step back and look 652 00:38:47,760 --> 00:38:50,880 Speaker 1: at some of the theories, um, and see how they 653 00:38:50,920 --> 00:38:54,759 Speaker 1: all kind of fit together, it seems like most or 654 00:38:54,800 --> 00:38:57,480 Speaker 1: all of them, with the exception of stoned ape, probably 655 00:38:57,920 --> 00:39:00,760 Speaker 1: could be right. But they all have to work together 656 00:39:00,840 --> 00:39:03,960 Speaker 1: and work with one another, um, which is great because 657 00:39:04,000 --> 00:39:07,960 Speaker 1: that level of organization requires intelligence. That's right. But the 658 00:39:08,040 --> 00:39:11,759 Speaker 1: key to all of this, and I think, um, we 659 00:39:11,760 --> 00:39:14,200 Speaker 1: talked about the evolution of language on a whole show, 660 00:39:14,280 --> 00:39:18,120 Speaker 1: right? Yeah, uh, we still don't quite know exactly 661 00:39:18,160 --> 00:39:20,040 Speaker 1: how that evolved, but we have some ideas, like we 662 00:39:20,080 --> 00:39:22,919 Speaker 1: talked about with the uh, what'd you call it? Mother 663 00:39:23,040 --> 00:39:27,239 Speaker 1: ins? No, motherese. Motherese. But all of this 664 00:39:27,400 --> 00:39:30,440 Speaker 1: became possible because of language. All of this, like you 665 00:39:30,480 --> 00:39:34,800 Speaker 1: were saying, all of this coordination, all this cooperation, anything 666 00:39:34,840 --> 00:39:39,000 Speaker 1: that would eventually lead to writing down human history, all 667 00:39:39,040 --> 00:39:42,120 Speaker 1: of that had to have language. So it seems that 668 00:39:42,200 --> 00:39:45,920 Speaker 1: all of these sort of theories coalesce around the beginnings 669 00:39:45,960 --> 00:39:49,239 Speaker 1: of language, and eventually the written word is like the 670 00:39:49,280 --> 00:39:51,719 Speaker 1: key to it all. Yeah, totally. And one of the 671 00:39:51,719 --> 00:39:55,160 Speaker 1: other things.
Um, because we are so aware that we're 672 00:39:55,200 --> 00:39:58,839 Speaker 1: intellectually superior, not only to all the other animals, at 673 00:39:58,840 --> 00:40:01,040 Speaker 1: the very least, we're intellectually different from the other 674 00:40:01,080 --> 00:40:04,000 Speaker 1: smart ones. Um, we tend to think of ourselves as 675 00:40:04,040 --> 00:40:08,680 Speaker 1: the most intellectually evolved or the most successful humans ever, 676 00:40:09,120 --> 00:40:11,960 Speaker 1: and that's absolutely not the case. Um. I think Homo 677 00:40:12,040 --> 00:40:15,320 Speaker 1: erectus was around for one and a half million years 678 00:40:15,800 --> 00:40:18,560 Speaker 1: and modern humans have only been around for about three 679 00:40:18,640 --> 00:40:22,560 Speaker 1: hundred thousand years. So we're definitely not necessarily the pinnacle 680 00:40:22,800 --> 00:40:26,080 Speaker 1: of evolution just in the amount of time and success 681 00:40:26,120 --> 00:40:28,960 Speaker 1: we've had so far. But also, um, we have a 682 00:40:28,960 --> 00:40:31,080 Speaker 1: tendency to think like we're the top and there's 683 00:40:31,120 --> 00:40:34,640 Speaker 1: nothing coming. And that's not necessarily true either. Like if 684 00:40:34,640 --> 00:40:38,319 Speaker 1: you look at that acceleration in technology, like um, some 685 00:40:38,400 --> 00:40:41,440 Speaker 1: of our ancestors used the same tool for a million 686 00:40:41,480 --> 00:40:44,440 Speaker 1: and a half years without innovating upon it. They just 687 00:40:44,560 --> 00:40:46,800 Speaker 1: made that same tool over and over and over again.
688 00:40:46,880 --> 00:40:49,120 Speaker 1: And then somebody came along and figured 689 00:40:49,160 --> 00:40:51,439 Speaker 1: out a way to make it better, and that kicked 690 00:40:51,480 --> 00:40:54,879 Speaker 1: off more and more technology, and you can see it's 691 00:40:54,920 --> 00:40:58,080 Speaker 1: picking up faster and faster. But the fact that evolution 692 00:40:58,160 --> 00:41:01,239 Speaker 1: has jumped from the external world for us to our 693 00:41:01,280 --> 00:41:05,040 Speaker 1: brains and in turn to our culture, you can make 694 00:41:05,040 --> 00:41:07,960 Speaker 1: a really good case that we're not necessarily going to 695 00:41:08,000 --> 00:41:11,640 Speaker 1: physically evolve any longer. We're going to mentally evolve. So 696 00:41:11,719 --> 00:41:14,360 Speaker 1: it's not certain what humanity is going to 697 00:41:14,400 --> 00:41:17,480 Speaker 1: look like in the future, but it's probably going to happen. 698 00:41:17,520 --> 00:41:20,480 Speaker 1: The changes are going to happen a lot faster 699 00:41:20,920 --> 00:41:24,200 Speaker 1: than they have before, and we'll all just end 700 00:41:24,320 --> 00:41:29,920 Speaker 1: up brains in jars, right? Probably, or uploaded. That's right. 701 00:41:30,080 --> 00:41:35,800 Speaker 1: Oh boy, good luck with that, everybody. That had a 702 00:41:35,920 --> 00:41:39,080 Speaker 1: very "so long, suckers" ring to it. You got anything else? 703 00:41:39,640 --> 00:41:42,600 Speaker 1: I got nothing else. I love these, uh, these types 704 00:41:42,640 --> 00:41:46,000 Speaker 1: of episodes. Me too. Since Chuck and I agreed 705 00:41:46,040 --> 00:41:52,080 Speaker 1: we loved this episode, it's time for listener mail.
So 706 00:41:52,520 --> 00:41:55,120 Speaker 1: this is another Appalachian Trail email, probably the last one I'll 707 00:41:55,160 --> 00:41:59,760 Speaker 1: read, because, um, Sophie here, a k a Tough Cookie, 708 00:41:59,760 --> 00:42:02,560 Speaker 1: which is Sophie's trail name, is just a lovely human, 709 00:42:02,600 --> 00:42:05,240 Speaker 1: and we had a nice back and forth. Sophie 710 00:42:05,239 --> 00:42:10,680 Speaker 1: and Sophie's sister did a nobo through hike and 711 00:42:10,760 --> 00:42:14,080 Speaker 1: just had some kind of fun things to point out. Um. 712 00:42:14,120 --> 00:42:16,840 Speaker 1: One of the general rules of trail names is someone 713 00:42:16,960 --> 00:42:20,160 Speaker 1: else has to give it to you. So I think 714 00:42:20,239 --> 00:42:22,920 Speaker 1: that's kind of like, uh, if you're a pilot in 715 00:42:22,960 --> 00:42:26,480 Speaker 1: the military, you're like Maverick and Goose. I think 716 00:42:26,480 --> 00:42:29,359 Speaker 1: people think they name themselves cool names. But my brother 717 00:42:29,360 --> 00:42:30,759 Speaker 1: in law was like, no, no, no, no, you'd 718 00:42:30,800 --> 00:42:33,240 Speaker 1: get a name, and it's usually not something super cool 719 00:42:33,320 --> 00:42:36,560 Speaker 1: like Maverick. Yeah, if you name yourself, I'm sure that 720 00:42:36,600 --> 00:42:38,520 Speaker 1: people are going to be way harder on you with 721 00:42:38,560 --> 00:42:41,160 Speaker 1: the name they actually select for you. Yeah. I don't 722 00:42:41,200 --> 00:42:43,839 Speaker 1: even know if you're allowed to. I'm not sure. Uh. 723 00:42:43,880 --> 00:42:46,040 Speaker 1: Sophie says that my sister and I cheated a little 724 00:42:46,040 --> 00:42:48,399 Speaker 1: bit because we gave names to each other a few 725 00:42:48,480 --> 00:42:50,720 Speaker 1: days in. I don't think that's cheating. You're still naming 726 00:42:50,719 --> 00:42:54,120 Speaker 1: someone else.
Sure, that's called getting ahead of the curve. Uh. 727 00:42:54,160 --> 00:42:56,560 Speaker 1: We did have some unofficial trail names, though, that 728 00:42:56,880 --> 00:42:59,640 Speaker 1: other people would refer to the pair of us as. 729 00:43:00,080 --> 00:43:02,239 Speaker 1: My favorite was a sixty year old hiker from 730 00:43:02,239 --> 00:43:05,560 Speaker 1: Maine who, uh, referred to us as 731 00:43:05,600 --> 00:43:09,680 Speaker 1: the Kentucky Wonders, which is pretty fun. And one thing 732 00:43:09,680 --> 00:43:12,600 Speaker 1: I realized after reading all these AT emails is 733 00:43:12,640 --> 00:43:14,960 Speaker 1: that it's really kind of fun. Like people get 734 00:43:15,000 --> 00:43:17,239 Speaker 1: together and like they start off alone and all of 735 00:43:17,280 --> 00:43:19,360 Speaker 1: a sudden there's a group of like twelve people hiking 736 00:43:19,400 --> 00:43:21,600 Speaker 1: together for weeks at a time. That is the very 737 00:43:21,640 --> 00:43:25,799 Speaker 1: reason why I will never hike the AT. That sounds like 738 00:43:25,840 --> 00:43:29,279 Speaker 1: a nightmare to me. You would be the, uh, the 739 00:43:29,400 --> 00:43:32,319 Speaker 1: loner hiker totally. Maybe, like, don't turn your back on 740 00:43:32,360 --> 00:43:39,319 Speaker 1: that one. I think your trail name might be Jed Bundy. Um. 741 00:43:39,480 --> 00:43:41,799 Speaker 1: The trail through West Virginia is actually less than four 742 00:43:41,840 --> 00:43:44,640 Speaker 1: miles, and I heard this from other people too, um, 743 00:43:44,719 --> 00:43:47,440 Speaker 1: not eighteen. So I think we screwed that up. It's 744 00:43:47,480 --> 00:43:51,440 Speaker 1: an amazing feeling to go through so quickly, psychologically, 745 00:43:51,480 --> 00:43:54,920 Speaker 1: after completing Virginia, which is five hundred miles and a 746 00:43:55,000 --> 00:43:57,560 Speaker 1: quarter of the whole entire trail. Uh.
And there is 747 00:43:57,600 --> 00:44:00,120 Speaker 1: a four state challenge where some hikers will attempt to 748 00:44:00,160 --> 00:44:03,680 Speaker 1: do a forty five mile day to go through the 749 00:44:03,760 --> 00:44:07,279 Speaker 1: end of Virginia, West Virginia, Maryland, and into Pennsylvania in 750 00:44:07,440 --> 00:44:14,840 Speaker 1: twenty four hours. Her family hosts a trail magic spot because 751 00:44:14,840 --> 00:44:16,879 Speaker 1: they live near the trail in Tennessee. So they will 752 00:44:16,920 --> 00:44:19,440 Speaker 1: go out on the weekends and they pack up a 753 00:44:19,480 --> 00:44:21,840 Speaker 1: bunch of hiker food and they grill burgers and stuff 754 00:44:22,160 --> 00:44:24,400 Speaker 1: or make pancakes and just feed people on the trail 755 00:44:24,520 --> 00:44:27,600 Speaker 1: and then we'll go back home. I think you could 756 00:44:27,640 --> 00:44:29,919 Speaker 1: be down with that part, right? Sure, I'd eat a 757 00:44:29,920 --> 00:44:32,640 Speaker 1: free hamburger. Like, can I take it to go? I'd 758 00:44:32,680 --> 00:44:36,160 Speaker 1: be like why, I'd be like why is there mustard 759 00:44:36,160 --> 00:44:39,800 Speaker 1: on here but not ketchup? Uh. And then finally, 760 00:44:39,880 --> 00:44:42,160 Speaker 1: during the hike, we would treat ourselves to podcasts for 761 00:44:42,200 --> 00:44:44,840 Speaker 1: a couple of hours when hiking was getting monotonous and 762 00:44:44,840 --> 00:44:46,879 Speaker 1: we wanted to get out of our heads some, and your 763 00:44:46,960 --> 00:44:50,200 Speaker 1: voices were a frequent companion. Listening to Stuff 764 00:44:50,239 --> 00:44:52,880 Speaker 1: You Should Know Selects these days, I often have the weird 765 00:44:52,920 --> 00:44:56,120 Speaker 1: sensation of remembering exactly where I was hiking in the 766 00:44:56,120 --> 00:45:01,440 Speaker 1: woods when I listened to that episode.
Uh, come to 767 00:45:01,560 --> 00:45:04,880 Speaker 1: Kentucky sometime, check out the bourbon distilleries and the Red 768 00:45:04,960 --> 00:45:08,960 Speaker 1: River Gorge, and do a show here in Lexington. I know 769 00:45:09,040 --> 00:45:11,440 Speaker 1: you'd probably rather go to Louisville or Cincinnati, but Lexington 770 00:45:11,520 --> 00:45:14,239 Speaker 1: is definitely worth a visit. And Sophie sent along a 771 00:45:14,239 --> 00:45:17,440 Speaker 1: bunch of cool pictures of Sophie and her sister before 772 00:45:17,440 --> 00:45:19,839 Speaker 1: and after, and it just looked like a really great time. 773 00:45:19,920 --> 00:45:22,120 Speaker 1: That's awesome. Thanks a lot for that email, Sophie, that 774 00:45:22,200 --> 00:45:24,560 Speaker 1: was a great one. And agreed, Chuck, that one had 775 00:45:24,600 --> 00:45:27,440 Speaker 1: to be read for sure. Just stay away from Josh 776 00:45:27,440 --> 00:45:29,920 Speaker 1: if you see him in the woods. No, I'm harmless. 777 00:45:29,960 --> 00:45:32,080 Speaker 1: I just don't want to be spoken to. That's all. 778 00:45:32,680 --> 00:45:36,239 Speaker 1: I want to be left alone. It's too awkward. Otherwise, 779 00:45:36,800 --> 00:45:39,600 Speaker 1: you could just hike with big, giant, like, nineteen 780 00:45:39,640 --> 00:45:43,080 Speaker 1: seventies headphones as if you're listening to something. Right, with my 781 00:45:43,120 --> 00:45:46,520 Speaker 1: head down, sunglasses on, and a bag over my head. 782 00:45:47,160 --> 00:45:49,440 Speaker 1: I love it. If you want to be like Sophie 783 00:45:49,440 --> 00:45:51,520 Speaker 1: and get in touch with us, you can send us 784 00:45:51,560 --> 00:45:55,719 Speaker 1: an email and send it off to iHeartRadio. For more 785 00:45:55,719 --> 00:45:59,080 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple 786 00:45:59,120 --> 00:46:04,560 Speaker 1: Podcasts, or wherever you listen to your favorite shows.