1 00:00:06,680 --> 00:00:10,399 Speaker 1: Why are some animals so much smarter than others? And 2 00:00:10,440 --> 00:00:15,080 Speaker 1: in particular, what is it that makes humans so incredibly smart? Now, 3 00:00:15,160 --> 00:00:17,720 Speaker 1: I love animals as much as anyone else, and yes, 4 00:00:17,960 --> 00:00:21,640 Speaker 1: other animals do do things like use tools and communicate 5 00:00:21,680 --> 00:00:24,680 Speaker 1: with each other. But humans are the only species we 6 00:00:24,760 --> 00:00:29,000 Speaker 1: know of that runs complicated research programs to try to understand 7 00:00:29,040 --> 00:00:32,879 Speaker 1: at a deep level why some animal species are more 8 00:00:32,920 --> 00:00:36,320 Speaker 1: intelligent than others and how these differences came about. 9 00:00:37,040 --> 00:00:40,519 Speaker 1: We humans ask questions like, why is it that crows 10 00:00:40,600 --> 00:00:43,720 Speaker 1: have learned how to fashion sticks into tools that they 11 00:00:43,720 --> 00:00:46,280 Speaker 1: can use to grab tasty bugs out of the nooks 12 00:00:46,320 --> 00:00:49,280 Speaker 1: and crannies of trees, but I can't get my dumb 13 00:00:49,360 --> 00:00:51,199 Speaker 1: chickens to stop eating the paint on the side of 14 00:00:51,240 --> 00:00:53,720 Speaker 1: the chicken coop? And what kind of conditions would you 15 00:00:53,720 --> 00:00:56,040 Speaker 1: need to have on another planet in order to produce 16 00:00:56,160 --> 00:00:59,880 Speaker 1: highly intelligent aliens? These kinds of questions keep me up 17 00:01:00,520 --> 00:01:02,880 Speaker 1: and also seem to be keeping many of you up 18 00:01:02,920 --> 00:01:05,840 Speaker 1: at night as well. Today, we're going to tackle three 19 00:01:06,200 --> 00:01:11,039 Speaker 1: wonderful listener questions that we've received related to intelligence. Welcome 20 00:01:11,160 --> 00:01:13,760 Speaker 1: to Daniel and Kelly's Extraordinary Universe. 21 00:01:28,240 --> 00:01:31,080 Speaker 2: I'm Daniel, I'm a particle physicist, and I hope that 22 00:01:31,160 --> 00:01:34,400 Speaker 2: humans are not the smartest folks in the galaxy. 23 00:01:34,920 --> 00:01:38,640 Speaker 1: Hello, I'm Kelly Weinersmith. I study parasites and space, 24 00:01:38,760 --> 00:01:40,600 Speaker 1: and yeah, I guess I hope we're not the smartest. 25 00:01:40,640 --> 00:01:44,080 Speaker 1: As long as they're nice and smart, that would be great. 26 00:01:45,720 --> 00:01:47,840 Speaker 2: I don't even really need to put that caveat on it. 27 00:01:47,880 --> 00:01:50,000 Speaker 2: Like, if they're mean and smart, like, I'll take a 28 00:01:50,040 --> 00:01:52,400 Speaker 2: little bit of meanness for some secrets of the universe. 29 00:01:52,520 --> 00:01:56,040 Speaker 1: What if they think we're delicious and they're smart? I 30 00:01:56,080 --> 00:01:57,360 Speaker 1: think that's a bad combo. 31 00:01:57,440 --> 00:02:01,280 Speaker 2: I don't think I'm very tasty, so 32 00:02:01,360 --> 00:02:02,880 Speaker 2: I'm not worried that I'd be, like, 33 00:02:03,240 --> 00:02:05,400 Speaker 2: high on the list of humans to be consumed. 34 00:02:05,520 --> 00:02:07,840 Speaker 1: What would make you less tasty than other humans? 35 00:02:10,440 --> 00:02:11,560 Speaker 2: This is getting personal now. 36 00:02:11,760 --> 00:02:14,440 Speaker 1: Are you, like, sprinkled with stinky sauce or something?
37 00:02:16,440 --> 00:02:19,160 Speaker 2: I actually kind of am marbled through with ribbons 38 00:02:19,160 --> 00:02:21,239 Speaker 2: of fat, so maybe I would be quite tasty. I 39 00:02:21,280 --> 00:02:22,760 Speaker 2: don't know. I try to stay fit. 40 00:02:22,880 --> 00:02:25,520 Speaker 1: You know, marbling is a delicacy. 41 00:02:25,840 --> 00:02:27,560 Speaker 2: I think we just set a record for how quickly 42 00:02:27,639 --> 00:02:30,760 Speaker 2: we started talking about aliens and consumption of human flesh. 43 00:02:32,200 --> 00:02:33,920 Speaker 1: Let's see if by the end of the episode 44 00:02:33,960 --> 00:02:36,320 Speaker 1: we hit poop, and then we'll have the trifecta. But 45 00:02:36,560 --> 00:02:37,919 Speaker 1: we'll probably get there, all right. 46 00:02:37,960 --> 00:02:41,240 Speaker 2: So my question for you today, Kelly, is do you 47 00:02:41,480 --> 00:02:46,000 Speaker 2: think that we will ever crack the barrier to communicating 48 00:02:46,520 --> 00:02:52,560 Speaker 2: with our fellow intelligent earthlings? Like, somehow AI or 49 00:02:52,680 --> 00:02:56,960 Speaker 2: future linguistics can help us understand the chirpings of prairie 50 00:02:57,000 --> 00:02:59,120 Speaker 2: dogs and the whistling of whales. 51 00:02:59,520 --> 00:03:03,360 Speaker 1: Wow, all right. So I've recently read this really great 52 00:03:03,400 --> 00:03:05,880 Speaker 1: book called Do Aliens Speak Physics? 54 00:03:10,400 --> 00:03:12,200 Speaker 2: That is a great book. 55 00:03:12,200 --> 00:03:15,160 Speaker 1: It's a fantastic book, and it kind of convinced me 56 00:03:15,160 --> 00:03:18,320 Speaker 1: that unless you can have two way communication, it's really 57 00:03:18,360 --> 00:03:21,799 Speaker 1: hard to get on the same page regarding language. You can, 58 00:03:21,919 --> 00:03:24,839 Speaker 1: like, work with dolphins to try to figure stuff out. 59 00:03:24,919 --> 00:03:28,280 Speaker 1: Maybe we could get some very preliminary, like, you know, 60 00:03:28,400 --> 00:03:30,120 Speaker 1: I say hello to you, you say hello to me, 61 00:03:30,160 --> 00:03:31,959 Speaker 1: we're on the same page there. But like I guess 62 00:03:31,960 --> 00:03:36,080 Speaker 1: I'm not super optimistic. What kind of communication do you need? Like, so, 63 00:03:36,160 --> 00:03:40,440 Speaker 1: my dog is really cute but about as smart as 64 00:03:40,480 --> 00:03:43,480 Speaker 1: a ton of bricks, and so I can't communicate too 65 00:03:43,480 --> 00:03:45,760 Speaker 1: many complicated ideas to her. But you know, there are 66 00:03:45,760 --> 00:03:48,400 Speaker 1: some border collies that are brilliant. So how much communication 67 00:03:48,480 --> 00:03:49,280 Speaker 1: do you feel like you need? 68 00:03:50,880 --> 00:03:52,600 Speaker 2: I just kind of want to know what's going on 69 00:03:52,720 --> 00:03:55,680 Speaker 2: in there. Like the whales are definitely talking to each other. 70 00:03:55,720 --> 00:03:58,920 Speaker 2: What are they saying? The honeybees are wiggling to each other. 71 00:03:58,960 --> 00:04:01,760 Speaker 2: That's definitely information. For me, I just want to translate 72 00:04:01,800 --> 00:04:04,640 Speaker 2: that information somehow, to know what they think is interesting 73 00:04:04,720 --> 00:04:07,160 Speaker 2: to talk about.
And like you, I know that my 74 00:04:07,280 --> 00:04:09,680 Speaker 2: dog understands a lot of stuff, like he picks up 75 00:04:09,720 --> 00:04:12,120 Speaker 2: on pretty subtle patterns, so he knows when he's getting 76 00:04:12,160 --> 00:04:15,120 Speaker 2: a walk. You know, shoes on plus the leash means 77 00:04:15,120 --> 00:04:17,520 Speaker 2: a walk, shoes on plus the backpack means Daniel's 78 00:04:17,520 --> 00:04:19,920 Speaker 2: going to work, this kind of stuff. And he notices 79 00:04:20,000 --> 00:04:23,080 Speaker 2: patterns through the day, like he's definitely paying attention to 80 00:04:23,120 --> 00:04:26,040 Speaker 2: patterns and understanding stuff. But I wish we could talk. 81 00:04:26,080 --> 00:04:27,520 Speaker 2: I wish we could communicate more. I 82 00:04:27,560 --> 00:04:29,479 Speaker 2: wish I could tell him like, hey, we're getting in 83 00:04:29,480 --> 00:04:31,200 Speaker 2: the car and we're going to this thing, or we're 84 00:04:31,240 --> 00:04:33,599 Speaker 2: not going to the vet. You know, this information I 85 00:04:33,600 --> 00:04:35,920 Speaker 2: wish we could share with him. Like, my dog, when 86 00:04:35,920 --> 00:04:37,839 Speaker 2: we leave, he has no idea: are we going on 87 00:04:37,880 --> 00:04:39,960 Speaker 2: a three week trip? Are we going to be back 88 00:04:40,000 --> 00:04:41,960 Speaker 2: in an hour? He's cool, and if I could talk 89 00:04:42,000 --> 00:04:43,440 Speaker 2: to him, I could share that with him. 90 00:04:43,560 --> 00:04:45,159 Speaker 1: So, first of all, I think it's very nice that 91 00:04:45,240 --> 00:04:47,880 Speaker 1: you're worried about how your dog's going to feel about 92 00:04:47,880 --> 00:04:50,960 Speaker 1: how long you're gone. I think that's very sweet. So 93 00:04:51,120 --> 00:04:53,960 Speaker 1: bees do this waggle dance, and you can tell by 94 00:04:53,960 --> 00:04:56,359 Speaker 1: the way that they're dancing, sort of, like, the direction 95 00:04:56,560 --> 00:04:59,640 Speaker 1: and the distance that they're telling their nest mates where 96 00:04:59,640 --> 00:05:02,120 Speaker 1: the great source of pollen or nectar is, and so 97 00:05:02,360 --> 00:05:05,400 Speaker 1: we can decipher that. But you know, I don't think 98 00:05:05,440 --> 00:05:07,080 Speaker 1: that I could like dress up in a bee suit 99 00:05:07,160 --> 00:05:09,560 Speaker 1: and shake my butt the right way and tell them like, hey, 100 00:05:09,560 --> 00:05:12,160 Speaker 1: I just planted some flowers in the southern corner of 101 00:05:12,200 --> 00:05:15,279 Speaker 1: the property. I mean, I think we can decode a 102 00:05:15,279 --> 00:05:17,160 Speaker 1: lot of what animals say, but we have not yet 103 00:05:17,200 --> 00:05:19,680 Speaker 1: figured out how to communicate back with them. 104 00:05:19,920 --> 00:05:22,600 Speaker 2: Yeah. Well, part of that depends on what's going on 105 00:05:22,720 --> 00:05:26,120 Speaker 2: inside those animals' minds and what intelligence really means and 106 00:05:26,160 --> 00:05:28,720 Speaker 2: where it comes from. So let's dig into the episode today. 107 00:05:28,760 --> 00:05:31,440 Speaker 2: I'm dying to hear what you have to tell us. 108 00:05:31,720 --> 00:05:34,080 Speaker 1: I'm super excited about today's episode.
But now I've got 109 00:05:34,080 --> 00:05:36,920 Speaker 1: myself wondering, like, could you make a little electronic bee 110 00:05:37,279 --> 00:05:39,520 Speaker 1: that was like bee sized and like do the 111 00:05:39,600 --> 00:05:41,880 Speaker 1: waggle dance and maybe get the bees to go to, 112 00:05:42,040 --> 00:05:44,800 Speaker 1: like, your source of nectar. Anyway, surely there's got to 113 00:05:44,800 --> 00:05:47,560 Speaker 1: be funding for that. That sounds awesome. I find questions 114 00:05:47,600 --> 00:05:50,599 Speaker 1: about animal intelligence to be absolutely fascinating, and this is 115 00:05:50,600 --> 00:05:52,960 Speaker 1: actually a topic that we get a lot of questions 116 00:05:52,960 --> 00:05:55,200 Speaker 1: from y'all about. And if you want to send us 117 00:05:55,200 --> 00:05:57,880 Speaker 1: your questions, you can send them to questions at 118 00:05:57,960 --> 00:06:01,360 Speaker 1: danielandkelly dot org. So today we're going to tackle 119 00:06:01,600 --> 00:06:04,640 Speaker 1: three of the questions that we've gotten from listeners about 120 00:06:04,680 --> 00:06:07,440 Speaker 1: animal intelligence and they sort of build on one another, 121 00:06:07,520 --> 00:06:09,040 Speaker 1: and so we're gonna, you know, do a bit of 122 00:06:09,080 --> 00:06:11,000 Speaker 1: an overview on animal intelligence today. 123 00:06:11,440 --> 00:06:13,840 Speaker 2: This is super cool. It's kind of a new kind of episode, 124 00:06:13,880 --> 00:06:17,320 Speaker 2: like a listener questions theme episode, almost like these listeners 125 00:06:17,320 --> 00:06:19,799 Speaker 2: are working together to send us questions that click together 126 00:06:19,839 --> 00:06:20,360 Speaker 2: so nicely. 127 00:06:20,600 --> 00:06:23,000 Speaker 1: That's right, yeah, and these three click together very nicely. 128 00:06:23,040 --> 00:06:24,760 Speaker 1: You know, we could have done something similar with our 129 00:06:24,800 --> 00:06:28,280 Speaker 1: consciousness episode, but I think we got twenty questions about 130 00:06:28,279 --> 00:06:30,279 Speaker 1: consciousness before we were like, all right, we get it, 131 00:06:30,360 --> 00:06:33,160 Speaker 1: we're going to get an expert. Hopefully we answered everyone's 132 00:06:33,240 --> 00:06:36,320 Speaker 1: questions with that one. But today we are going to 133 00:06:36,360 --> 00:06:39,800 Speaker 1: start with a question from Tom. So let's go ahead 134 00:06:39,800 --> 00:06:40,880 Speaker 1: and hear Tom's question. 135 00:06:41,720 --> 00:06:45,000 Speaker 4: I'm reasonably confident that humans are the only Earth species 136 00:06:45,040 --> 00:06:48,560 Speaker 4: that has ever built rockets and such. So why is 137 00:06:48,560 --> 00:06:52,039 Speaker 4: this level of intelligence so rare? Life on Earth is 138 00:06:52,080 --> 00:06:55,560 Speaker 4: incredibly diverse and the Earth has been teeming with it 139 00:06:55,640 --> 00:06:59,200 Speaker 4: for billions of years. How does a species even develop 140 00:06:59,279 --> 00:07:04,760 Speaker 4: rocket level intelligence? Is intelligence a biologically selectable property that 141 00:07:04,839 --> 00:07:08,839 Speaker 4: improves the chances for a population to survive? Or is 142 00:07:08,920 --> 00:07:12,360 Speaker 4: rocket level intelligence just dumb luck?
After all, the fact 143 00:07:12,400 --> 00:07:15,720 Speaker 4: that rocket level intelligence emerged on Earth has proved to 144 00:07:15,760 --> 00:07:19,560 Speaker 4: be quite unlucky for many coexisting species and may yet 145 00:07:19,600 --> 00:07:21,920 Speaker 4: still be unlucky for humans. Thanks. 146 00:07:22,880 --> 00:07:26,400 Speaker 1: Wow. Okay, so that is a great question, but it's 147 00:07:26,400 --> 00:07:30,280 Speaker 1: a very specific question. So why is rocket level 148 00:07:30,400 --> 00:07:34,120 Speaker 1: intelligence so rare? So we have an n of one. 149 00:07:34,320 --> 00:07:37,800 Speaker 1: We know of one species that is able to build rockets, 150 00:07:37,840 --> 00:07:40,320 Speaker 1: and with such a small sample size it can be 151 00:07:40,400 --> 00:07:42,800 Speaker 1: really hard to know for sure. A bit of a 152 00:07:42,840 --> 00:07:46,840 Speaker 1: spoiler here: we don't know. In general, science does not 153 00:07:47,040 --> 00:07:49,600 Speaker 1: know exactly what it was that put humans on the 154 00:07:49,640 --> 00:07:51,960 Speaker 1: trajectory that we went on to be able to build 155 00:07:52,040 --> 00:07:54,600 Speaker 1: rockets and all the other amazing things that we can do. 156 00:07:54,960 --> 00:07:57,360 Speaker 1: We will talk about some of the hypotheses today that 157 00:07:57,400 --> 00:07:59,880 Speaker 1: we think were important, but first let's back up and 158 00:08:00,080 --> 00:08:04,680 Speaker 1: ask the question, why can't all species build rockets? Why 159 00:08:04,840 --> 00:08:06,440 Speaker 1: is this kind of intelligence so rare? 160 00:08:06,840 --> 00:08:09,560 Speaker 2: I mean, ants and termites can build impressive stuff, right, 161 00:08:09,640 --> 00:08:13,240 Speaker 2: they definitely work together, and we see other species doing 162 00:08:13,240 --> 00:08:15,840 Speaker 2: cool stuff. So why aren't they building rockets and getting 163 00:08:15,840 --> 00:08:17,480 Speaker 2: off planet? What does that really require? 164 00:08:17,960 --> 00:08:20,840 Speaker 1: So again, the rocket thing is very specific, and part 165 00:08:20,840 --> 00:08:22,760 Speaker 1: of it is, you know, ants just might not have 166 00:08:22,800 --> 00:08:25,320 Speaker 1: the right appendages to build a rocket. You know, maybe 167 00:08:25,320 --> 00:08:28,040 Speaker 1: they've got rocket plans and if they just had hands, 168 00:08:28,080 --> 00:08:29,920 Speaker 1: they'd be building rockets, but probably not. 169 00:08:30,760 --> 00:08:32,320 Speaker 2: That's the kind of thing we could learn if we 170 00:08:32,400 --> 00:08:35,040 Speaker 2: learned to speak ant and bee, you know. Maybe they're 171 00:08:35,360 --> 00:08:39,280 Speaker 2: desperate to fly into space and frustrated about their limitations. 172 00:08:39,760 --> 00:08:42,120 Speaker 1: We should be bringing them with us, just you know, 173 00:08:42,360 --> 00:08:42,760 Speaker 1: they might 174 00:08:42,679 --> 00:08:44,960 Speaker 2: appreciate that, or we should just have them on the 175 00:08:44,960 --> 00:08:46,839 Speaker 2: podcast and ask them directly. 176 00:08:46,920 --> 00:08:50,720 Speaker 1: Maybe in five or ten years. So if you ask yourself 177 00:08:50,760 --> 00:08:53,840 Speaker 1: why can't all species build rockets, you're essentially asking, well, 178 00:08:53,840 --> 00:08:58,160 Speaker 1: why aren't all animals intelligent? What holds them back?
And 179 00:08:58,440 --> 00:09:00,400 Speaker 1: we think the answer has to do with how much 180 00:09:00,559 --> 00:09:04,000 Speaker 1: energy it takes to be super smart. So our adult 181 00:09:04,040 --> 00:09:07,320 Speaker 1: brains weigh about three pounds, so for a person who 182 00:09:07,320 --> 00:09:10,080 Speaker 1: weighs one hundred and eighty pounds, that's about two percent 183 00:09:10,120 --> 00:09:14,480 Speaker 1: of your body weight. But by measuring oxygen consumption rates 184 00:09:14,480 --> 00:09:16,840 Speaker 1: in different parts of our body, we estimate that our 185 00:09:16,920 --> 00:09:20,200 Speaker 1: brain uses about twenty percent of all of the energy 186 00:09:20,200 --> 00:09:23,640 Speaker 1: that we burn every single day. So this very small 187 00:09:23,760 --> 00:09:27,160 Speaker 1: organ is using an outsized amount of energy. So brains 188 00:09:27,200 --> 00:09:30,560 Speaker 1: are just energetically expensive. So if you are a species 189 00:09:30,600 --> 00:09:34,320 Speaker 1: that doesn't need a super big, smart brain, then there's 190 00:09:34,360 --> 00:09:37,560 Speaker 1: probably not great selection pressure for you to be expending 191 00:09:37,559 --> 00:09:40,120 Speaker 1: a bunch of energy on an energetically expensive organ that 192 00:09:40,160 --> 00:09:40,960 Speaker 1: you are not using. 193 00:09:41,440 --> 00:09:44,080 Speaker 2: So you're saying that this thing is expensive, right, that 194 00:09:44,160 --> 00:09:46,839 Speaker 2: it costs us something, and so if it doesn't 195 00:09:46,880 --> 00:09:49,520 Speaker 2: pay for itself, it's definitely not going to survive, because 196 00:09:49,520 --> 00:09:51,440 Speaker 2: if it hung around and used up a bunch of energy 197 00:09:51,559 --> 00:09:54,960 Speaker 2: and didn't provide tangible benefits, then folks with big brains 198 00:09:55,240 --> 00:09:58,400 Speaker 2: would die more often during famines and cold winters and 199 00:09:58,400 --> 00:10:00,520 Speaker 2: stuff like that. That's the idea. 200 00:10:00,679 --> 00:10:02,640 Speaker 1: Yeah, so this is more of a species level argument. 201 00:10:02,679 --> 00:10:04,880 Speaker 1: But yeah, so species with big brains that didn't need 202 00:10:04,920 --> 00:10:08,240 Speaker 1: them would have to spend more time foraging or get better, 203 00:10:08,360 --> 00:10:11,120 Speaker 1: higher quality food because of their big brains. And if 204 00:10:11,120 --> 00:10:12,640 Speaker 1: that's not happening, it's not worth having. 205 00:10:12,880 --> 00:10:14,960 Speaker 2: I guess you could also imagine a population, right, of 206 00:10:14,960 --> 00:10:17,400 Speaker 2: a single species with smaller brains and bigger brains. And 207 00:10:17,440 --> 00:10:19,760 Speaker 2: if the bigger brains aren't providing a benefit and they 208 00:10:19,760 --> 00:10:23,360 Speaker 2: are providing a cost, then those big brain folks among 209 00:10:23,440 --> 00:10:25,920 Speaker 2: us are not going to survive. There is a population 210 00:10:26,040 --> 00:10:27,439 Speaker 2: level argument there also, right. 211 00:10:27,440 --> 00:10:31,120 Speaker 1: Yeah, there's a population level argument. There are often more variables 212 00:10:31,160 --> 00:10:33,720 Speaker 1: at play when you start thinking about a specific species.
So, 213 00:10:33,840 --> 00:10:36,480 Speaker 1: for example, with human brains, if they were to get 214 00:10:36,559 --> 00:10:39,240 Speaker 1: too big, you would start to have trouble getting those 215 00:10:39,320 --> 00:10:42,160 Speaker 1: giant heads through the birth canal. And so there are limits 216 00:10:42,160 --> 00:10:44,480 Speaker 1: on how big brains can be. And in general, there's 217 00:10:44,520 --> 00:10:47,240 Speaker 1: not a ton of variability between the sizes of brains 218 00:10:47,240 --> 00:10:50,160 Speaker 1: of adults. And now we can get into this whole 219 00:10:50,240 --> 00:10:53,280 Speaker 1: different debate about is it really about how big the 220 00:10:53,320 --> 00:10:56,480 Speaker 1: brain is? So, like, blue whales have much bigger brains 221 00:10:56,480 --> 00:10:58,839 Speaker 1: than we do, but they haven't created rockets to get 222 00:10:58,840 --> 00:11:01,000 Speaker 1: them into space. Maybe that's because it's just too 223 00:11:01,040 --> 00:11:03,840 Speaker 1: expensive to send a giant blue whale to space. But yeah, 224 00:11:03,840 --> 00:11:05,480 Speaker 1: so trying to figure out exactly what feature of the 225 00:11:05,480 --> 00:11:07,400 Speaker 1: brain we should measure is difficult. But we know that 226 00:11:07,440 --> 00:11:11,040 Speaker 1: brains are expensive energetically, right? 227 00:11:10,960 --> 00:11:12,880 Speaker 2: I was actually reading a study about the size of 228 00:11:12,880 --> 00:11:15,600 Speaker 2: the human brain that suggested that if you packed in 229 00:11:15,640 --> 00:11:18,680 Speaker 2: the neurons more densely, you would add more noise. So 230 00:11:18,760 --> 00:11:21,959 Speaker 2: making the neurons smaller and denser wouldn't actually improve your 231 00:11:22,000 --> 00:11:24,520 Speaker 2: cognition because you'd end up adding noise. And so, like, 232 00:11:24,800 --> 00:11:27,160 Speaker 2: human brains are sort of at this sweet spot for 233 00:11:27,360 --> 00:11:30,360 Speaker 2: how much you can squeeze them down to get cognition. 234 00:11:30,679 --> 00:11:32,199 Speaker 1: I think maybe one of these days we should have 235 00:11:32,240 --> 00:11:36,360 Speaker 1: a whole separate episode on exactly how you should measure 236 00:11:36,400 --> 00:11:39,160 Speaker 1: brains to correlate them with intelligence, because I read a 237 00:11:39,160 --> 00:11:42,200 Speaker 1: paper that was arguing that actually, the reason we're so 238 00:11:42,280 --> 00:11:46,080 Speaker 1: great is because we have denser neurons, and that it's 239 00:11:46,160 --> 00:11:50,000 Speaker 1: not brain size, it's neuron density that really unlocks intelligence. 240 00:11:50,160 --> 00:11:53,160 Speaker 1: I think in general, the field has not figured out 241 00:11:53,200 --> 00:11:54,720 Speaker 1: exactly what we need to be measuring. 242 00:11:55,480 --> 00:11:57,959 Speaker 2: That's such a nice way of saying that these folks 243 00:11:57,960 --> 00:12:02,280 Speaker 2: don't know what they're doing. Well, you gotta start somewhere. The field is immature. 244 00:12:02,160 --> 00:12:05,160 Speaker 1: Yes, brains are complicated, man. Yes they are. 245 00:12:05,440 --> 00:12:07,760 Speaker 2: No, you're totally right, and great respect to folks who 246 00:12:07,800 --> 00:12:10,320 Speaker 2: are studying the brain, because obviously it's important.
And just 247 00:12:10,360 --> 00:12:13,080 Speaker 2: because we don't understand everything doesn't mean that early studies 248 00:12:13,080 --> 00:12:15,240 Speaker 2: are not valuable. They're the ones that lay the foundation 249 00:12:15,360 --> 00:12:18,760 Speaker 2: for future progress. Yes, absolutely, and so we shouldn't mock them. 250 00:12:18,920 --> 00:12:20,760 Speaker 2: We're grateful for their pioneering work. 251 00:12:20,880 --> 00:12:24,040 Speaker 1: Yes, I studied a brain infecting parasite of fish and 252 00:12:24,080 --> 00:12:28,520 Speaker 1: I started getting into neuroanatomy and neurochemistry and I was like, no, 253 00:12:29,080 --> 00:12:32,679 Speaker 1: I'm sticking with parasitology. Brains are complicated. So anyway, but 254 00:12:32,720 --> 00:12:33,840 Speaker 1: here we are jumping in anyway. 255 00:12:34,000 --> 00:12:36,480 Speaker 2: I think also Tom's question has this fascinating flavor to it, 256 00:12:36,520 --> 00:12:39,600 Speaker 2: which is like, you can make the evolutionary argument, and 257 00:12:39,600 --> 00:12:41,360 Speaker 2: I think you're going to get into this later, for 258 00:12:41,480 --> 00:12:45,520 Speaker 2: why intelligence might be beneficial, right, it might even overcome 259 00:12:45,559 --> 00:12:48,920 Speaker 2: the expense of the brain. But it's hard to understand 260 00:12:49,000 --> 00:12:51,200 Speaker 2: why evolution would give us a brain that's capable of 261 00:12:51,200 --> 00:12:54,679 Speaker 2: doing things that you didn't need to survive on the savannah, right? 262 00:12:54,720 --> 00:12:56,840 Speaker 2: You didn't need to be able to think about eleven 263 00:12:56,880 --> 00:12:59,719 Speaker 2: dimensional mathematics and build rockets and all sorts of other 264 00:13:00,080 --> 00:13:04,000 Speaker 2: crazy stuff in order to survive evolutionarily. So why is 265 00:13:04,040 --> 00:13:07,880 Speaker 2: it that evolution provided this tool which has this crazy 266 00:13:07,920 --> 00:13:10,400 Speaker 2: capability clearly outstripping our needs? 267 00:13:10,559 --> 00:13:12,880 Speaker 1: Yeah, well, you could ask a similar question like why 268 00:13:12,880 --> 00:13:15,840 Speaker 1: do peacocks have such big, showy tails? And I think 269 00:13:15,920 --> 00:13:18,600 Speaker 1: that you know, natural selection doesn't just select for things 270 00:13:18,640 --> 00:13:21,680 Speaker 1: that help us run faster to catch our food or 271 00:13:21,720 --> 00:13:25,720 Speaker 1: help us garden more efficiently. It also selects for traits 272 00:13:25,720 --> 00:13:28,439 Speaker 1: that make us more attractive mates in one way or another. 273 00:13:28,480 --> 00:13:31,600 Speaker 1: So sometimes it makes you, you know, bigger and more 274 00:13:31,679 --> 00:13:34,120 Speaker 1: muscly if that's what the females want. But you know, 275 00:13:34,160 --> 00:13:36,840 Speaker 1: there's also variability in what females of a species want, 276 00:13:36,920 --> 00:13:40,080 Speaker 1: and some of us want super smart partners, or partners 277 00:13:40,080 --> 00:13:42,800 Speaker 1: with musical abilities. And I think humans are kind of 278 00:13:42,960 --> 00:13:45,480 Speaker 1: weird in this respect, that we really are interested in 279 00:13:45,480 --> 00:13:49,800 Speaker 1: intellectual characteristics in addition to just physical characteristics.
And I 280 00:13:49,800 --> 00:13:52,360 Speaker 1: don't know exactly how that played into our evolution and 281 00:13:52,400 --> 00:13:54,640 Speaker 1: made us unique. But you know, if you can do 282 00:13:54,760 --> 00:13:57,559 Speaker 1: nine dimensional math, there's got to be a group of 283 00:13:57,600 --> 00:13:58,480 Speaker 1: people who are into that. 284 00:14:00,480 --> 00:14:02,680 Speaker 2: Since both of us are married to nerds, I think 285 00:14:02,679 --> 00:14:06,520 Speaker 2: we may be an unrepresentative subset of humanity. But I 286 00:14:06,559 --> 00:14:08,600 Speaker 2: totally agree with you. Big brains are hot. 287 00:14:08,679 --> 00:14:11,440 Speaker 1: Yes, it's super hot, way more hot than muscles and 288 00:14:11,520 --> 00:14:15,400 Speaker 1: being in good shape, I think. But I could be biased. Okay, 289 00:14:15,840 --> 00:14:18,240 Speaker 1: so the question was essentially about how do we get 290 00:14:18,640 --> 00:14:21,880 Speaker 1: humans in particular, and the answer is, we don't know 291 00:14:21,880 --> 00:14:24,320 Speaker 1: how you get humans in particular, but we can 292 00:14:24,400 --> 00:14:27,960 Speaker 1: get some hints by looking at other kinds of animals 293 00:14:27,960 --> 00:14:31,560 Speaker 1: and trying to figure out what things selected for intelligence 294 00:14:31,680 --> 00:14:34,520 Speaker 1: in other animals, and that might give us some insights 295 00:14:34,560 --> 00:14:36,760 Speaker 1: into like maybe, you know, we maxed out a particular, 296 00:14:37,120 --> 00:14:39,280 Speaker 1: you know, trait, or there was a particularly strong selective 297 00:14:39,280 --> 00:14:42,400 Speaker 1: pressure for us. So we're going to start this conversation 298 00:14:43,200 --> 00:14:47,120 Speaker 1: by talking about how we define intelligence outside of the 299 00:14:47,160 --> 00:14:51,160 Speaker 1: ability to build rockets, and then some difficulties with measuring it, 300 00:14:51,200 --> 00:14:53,200 Speaker 1: and towards the end of the episode, when we're answering 301 00:14:53,240 --> 00:14:55,720 Speaker 1: other questions from other listeners, we'll talk about some of 302 00:14:55,760 --> 00:14:57,680 Speaker 1: the leading hypotheses. So we're going to get to them. 303 00:14:57,640 --> 00:15:00,000 Speaker 2: All right. I love when we start a hard 304 00:15:00,080 --> 00:15:03,880 Speaker 2: conversation with let's define what we're talking about, because it 305 00:15:03,920 --> 00:15:06,320 Speaker 2: makes me feel all philosophical and nerdy. But also it's 306 00:15:06,400 --> 00:15:08,840 Speaker 2: really important. If somebody doesn't do that, then I feel 307 00:15:08,840 --> 00:15:11,040 Speaker 2: like they're going to be sloppy with their arguments because 308 00:15:11,040 --> 00:15:12,480 Speaker 2: they're not being precise with their words. 309 00:15:12,920 --> 00:15:15,120 Speaker 1: Well, and I should say that while I was researching 310 00:15:15,120 --> 00:15:19,600 Speaker 1: this topic, I came across a lot of different definitions, 311 00:15:19,680 --> 00:15:21,800 Speaker 1: and I decided to try to go with something a 312 00:15:21,840 --> 00:15:24,160 Speaker 1: little bit more general that feels pretty good to all 313 00:15:24,240 --> 00:15:26,120 Speaker 1: of us. So here's what I came up with.
It's 314 00:15:26,200 --> 00:15:29,920 Speaker 1: just the ability to solve lots of different kinds of problems, 315 00:15:30,040 --> 00:15:33,200 Speaker 1: because that's what they all seemed to narrow down to. 316 00:15:34,080 --> 00:15:36,200 Speaker 1: But what do you think? How would you define intelligence? 317 00:15:36,560 --> 00:15:38,840 Speaker 2: I definitely think the ability to solve problems is a 318 00:15:38,920 --> 00:15:41,320 Speaker 2: component of it. It feels to me like also there 319 00:15:41,360 --> 00:15:46,200 Speaker 2: should be a learning aspect, right, discovering patterns, generalizing from them. 320 00:15:47,040 --> 00:15:49,720 Speaker 2: You know, in machine learning, that's a big factor. We 321 00:15:49,760 --> 00:15:52,640 Speaker 2: want to develop networks that are not just capable of 322 00:15:52,680 --> 00:15:56,120 Speaker 2: solving the problems we've taught them, but solving a general 323 00:15:56,200 --> 00:16:00,480 Speaker 2: class of problems, including examples they haven't seen before, bringing 324 00:16:00,520 --> 00:16:04,640 Speaker 2: together ideas, coming up with bigger picture solutions. So something 325 00:16:04,680 --> 00:16:08,520 Speaker 2: about generalizing from the examples that you've learned feels to 326 00:16:08,560 --> 00:16:10,240 Speaker 2: me like an important part of intelligence. 327 00:16:10,560 --> 00:16:13,200 Speaker 1: Yeah, and I think that having the ability to solve 328 00:16:13,240 --> 00:16:16,080 Speaker 1: lots of different kinds of problems does, to some extent, 329 00:16:16,120 --> 00:16:18,800 Speaker 1: imply the ability to learn, okay, yeah, or at least 330 00:16:18,840 --> 00:16:21,240 Speaker 1: to experiment with your environment to try new things, which 331 00:16:21,280 --> 00:16:24,000 Speaker 1: I think is like learning. So I think we're pretty 332 00:16:24,040 --> 00:16:26,080 Speaker 1: much on the same page. What do you think? 333 00:16:26,280 --> 00:16:28,680 Speaker 2: I think that's a good working definition. Okay, so then 334 00:16:28,920 --> 00:16:31,400 Speaker 2: how do we measure this? Are we like giving tests 335 00:16:31,440 --> 00:16:34,320 Speaker 2: to cats and dogs to see can you do eleven 336 00:16:34,320 --> 00:16:35,400 Speaker 2: dimensional string theory? 337 00:16:35,760 --> 00:16:38,360 Speaker 1: We're not asking them about eleven dimensional string theory. We 338 00:16:38,400 --> 00:16:40,840 Speaker 1: will talk about some experiments that we do, but first 339 00:16:40,880 --> 00:16:43,000 Speaker 1: I wanted to tell a couple of stories about how 340 00:16:43,040 --> 00:16:46,240 Speaker 1: we've thought we were measuring something but it turns out 341 00:16:46,240 --> 00:16:49,160 Speaker 1: we're measuring something different, just to highlight how difficult it is. 342 00:16:50,880 --> 00:16:52,280 Speaker 2: Here comes wet Blanket Kelly. 343 00:16:52,840 --> 00:16:54,320 Speaker 1: Come on, of course, we were going to get to 344 00:16:54,360 --> 00:17:00,760 Speaker 1: wet Blanket Kelly.
Okay. So Clever Hans was a horse in 345 00:17:00,880 --> 00:17:03,520 Speaker 1: Germany at the end of the nineteenth century, and this 346 00:17:03,600 --> 00:17:07,360 Speaker 1: horse became super famous because you could ask him questions 347 00:17:07,400 --> 00:17:13,480 Speaker 1: about addition, subtraction, multiplication, division, including questions that involved fractions, wow, 348 00:17:13,640 --> 00:17:16,800 Speaker 1: and he would stomp out answers. So he would like 349 00:17:16,960 --> 00:17:19,600 Speaker 1: stomp, and then when he was done answering the question, 350 00:17:19,720 --> 00:17:21,560 Speaker 1: so for the right numbers, like if the answer was five, 351 00:17:21,600 --> 00:17:23,760 Speaker 1: he'd stomp five times and then he'd sort of do 352 00:17:23,960 --> 00:17:26,399 Speaker 1: like a little circle thing with his hoof to be 353 00:17:26,400 --> 00:17:27,160 Speaker 1: like, okay, I'm done. 354 00:17:27,600 --> 00:17:28,159 Speaker 2: Wow. 355 00:17:28,280 --> 00:17:30,840 Speaker 1: And then you could also ask Clever Hans questions like, 356 00:17:31,280 --> 00:17:34,800 Speaker 1: what is that woman holding in her hand? And using 357 00:17:34,920 --> 00:17:37,280 Speaker 1: an alphabet that was on a board, each letter was 358 00:17:37,280 --> 00:17:41,320 Speaker 1: separated into columns and rows, he could stomp out what 359 00:17:41,480 --> 00:17:46,200 Speaker 1: letters he wanted to say, you know, umbrella or hot 360 00:17:46,240 --> 00:17:47,960 Speaker 1: dog or something like that. 361 00:17:47,960 --> 00:17:51,359 Speaker 2: That implies some ability to read, right, yeah. 362 00:17:51,280 --> 00:17:54,400 Speaker 1: Yeah, yeah right. Some experts came in and they were like, 363 00:17:54,560 --> 00:17:56,919 Speaker 1: oh my gosh, we totally agree because we can do 364 00:17:56,960 --> 00:17:58,920 Speaker 1: the trick with Clever Hans too. We can ask Clever 365 00:17:58,960 --> 00:18:01,600 Speaker 1: Hans the same question, it's not just his owner, and 366 00:18:01,640 --> 00:18:03,959 Speaker 1: he can answer it. So they decided that Clever Hans 367 00:18:04,320 --> 00:18:07,320 Speaker 1: had intelligence that was equivalent to about a thirteen 368 00:18:07,400 --> 00:18:12,320 Speaker 1: year old child. But then another set of experimenters came 369 00:18:12,320 --> 00:18:16,480 Speaker 1: in and they started walling Clever Hans off from the 370 00:18:16,560 --> 00:18:19,440 Speaker 1: person who was asking the questions. 371 00:18:19,880 --> 00:18:22,560 Speaker 2: Oh, yeah, that's right. 372 00:18:23,880 --> 00:18:28,600 Speaker 1: It turns out that Clever Hans could only answer questions 373 00:18:28,640 --> 00:18:32,200 Speaker 1: correctly if the experimenter knew the answer ahead of time, 374 00:18:32,840 --> 00:18:35,080 Speaker 1: and if Hans could see the experimenter. 375 00:18:35,720 --> 00:18:38,720 Speaker 2: Oh, man, this is just like those experiments with the 376 00:18:38,720 --> 00:18:40,880 Speaker 2: Ouija boards, right, uh right. 377 00:18:40,800 --> 00:18:43,320 Speaker 1: Like something else is sort of happening behind the scenes 378 00:18:43,359 --> 00:18:47,120 Speaker 1: that's unintended. So it turned out Clever Hans isn't smart 379 00:18:47,280 --> 00:18:49,960 Speaker 1: in terms of the ability to do, like, division, but 380 00:18:50,040 --> 00:18:54,240 Speaker 1: Clever Hans was very smart at reading human body language.
381 00:18:54,480 --> 00:18:56,679 Speaker 1: So even if you put in a new person that 382 00:18:56,720 --> 00:18:59,720 Speaker 1: Clever Hans had never encountered before, Clever Hans could tell 383 00:18:59,720 --> 00:19:02,920 Speaker 1: that they were like holding their breath until he got 384 00:19:02,920 --> 00:19:05,120 Speaker 1: to, like, the right number of stomps, and that then 385 00:19:05,200 --> 00:19:08,240 Speaker 1: something about their body would change, and then he'd know, okay, 386 00:19:08,240 --> 00:19:10,840 Speaker 1: that was it. And so Clever Hans was actually just 387 00:19:10,960 --> 00:19:12,200 Speaker 1: cueing in on body language. 388 00:19:12,280 --> 00:19:15,399 Speaker 2: Wow. So Clever Hans could really read human emotion and 389 00:19:15,520 --> 00:19:18,840 Speaker 2: like understand what humans wanted him to do and then 390 00:19:18,880 --> 00:19:19,359 Speaker 2: would do it. 391 00:19:19,600 --> 00:19:22,520 Speaker 1: Yes, right. But this points out one of the first 392 00:19:22,560 --> 00:19:25,840 Speaker 1: difficult things about studying animals, which is that sometimes it can 393 00:19:25,840 --> 00:19:28,720 Speaker 1: be really hard to figure out what you're measuring or 394 00:19:28,760 --> 00:19:30,639 Speaker 1: what it is that they're actually learning, and it's not 395 00:19:30,680 --> 00:19:32,840 Speaker 1: always what you think that they're learning. So you have 396 00:19:32,880 --> 00:19:35,639 Speaker 1: to be very careful about how you design experiments, and 397 00:19:35,680 --> 00:19:37,760 Speaker 1: you need to keep in mind that animals sense the 398 00:19:37,760 --> 00:19:40,760 Speaker 1: world in different ways. So you know, some animals echolocate, 399 00:19:40,840 --> 00:19:45,320 Speaker 1: some can see polarized light, some explore their universe using electricity, 400 00:19:45,400 --> 00:19:47,159 Speaker 1: and so you need to make sure that you're asking 401 00:19:47,280 --> 00:19:50,040 Speaker 1: questions that make sense with their sensory systems. 402 00:19:49,800 --> 00:19:52,440 Speaker 2: And not just for animals, right? Like humans also, it's 403 00:19:52,560 --> 00:19:54,800 Speaker 2: very hard to measure intelligence among humans. A lot of 404 00:19:54,800 --> 00:19:57,320 Speaker 2: these tests have cultural biases in them, for example. 405 00:19:57,359 --> 00:20:00,960 Speaker 1: Yep, yeah, absolutely, this is just really hard stuff. 406 00:20:00,960 --> 00:20:02,919 Speaker 1: And then how do you compare, you know, intelligence in 407 00:20:02,960 --> 00:20:06,200 Speaker 1: a pigeon, where maybe the amazing thing is their ability 408 00:20:06,200 --> 00:20:09,200 Speaker 1: to like orient and travel great distances and their great 409 00:20:09,200 --> 00:20:12,760 Speaker 1: spatial memory, and then compare that to, you know, like 410 00:20:13,440 --> 00:20:16,399 Speaker 1: a really smart crow that can whittle a stick so 411 00:20:16,400 --> 00:20:18,240 Speaker 1: that it can get at bugs in a tree? Or like, 412 00:20:18,240 --> 00:20:21,639 Speaker 1: how do you compare intelligence between those two different organisms? 413 00:20:21,320 --> 00:20:23,520 Speaker 2: Right, it seems like it's obviously not a single number. 414 00:20:23,520 --> 00:20:26,240 Speaker 2: It's multi dimensional, right? And anytime you try to 415 00:20:26,240 --> 00:20:29,000 Speaker 2: boil something complex down to a single number, you're 416 00:20:29,000 --> 00:20:30,440 Speaker 2: going to lose a lot of nuance.
417 00:20:30,640 --> 00:20:32,960 Speaker 1: Yes you are. And I want to tell one more 418 00:20:33,040 --> 00:20:35,840 Speaker 1: quick story about when we thought that we had taught 419 00:20:35,880 --> 00:20:39,520 Speaker 1: animals something but we had actually taught them something else. So 420 00:20:39,560 --> 00:20:43,080 Speaker 1: there's this thing called perceptual categories, where essentially you're trying 421 00:20:43,119 --> 00:20:46,440 Speaker 1: to figure out if animals can categorize things in ways 422 00:20:46,440 --> 00:20:50,520 Speaker 1: similar to how humans can. So people showed pigeons a 423 00:20:50,560 --> 00:20:53,000 Speaker 1: bunch of different pictures of things, and they trained the 424 00:20:53,040 --> 00:20:55,439 Speaker 1: pigeons to peck a key and get a reward whenever 425 00:20:55,480 --> 00:20:58,200 Speaker 1: a person was in the image, and they actually 426 00:20:58,280 --> 00:20:59,959 Speaker 1: got pretty good at this. Over time, they would peck 427 00:21:00,119 --> 00:21:01,800 Speaker 1: the images, even if it was an image they had 428 00:21:01,880 --> 00:21:04,200 Speaker 1: never seen before, if there was a person in it. 429 00:21:04,800 --> 00:21:07,240 Speaker 1: And eventually they were able to get these pigeons to 430 00:21:07,280 --> 00:21:10,479 Speaker 1: pick out pictures with bodies of water, pictures with trees, 431 00:21:10,600 --> 00:21:13,119 Speaker 1: pictures with fish. So it turns out they were actually 432 00:21:13,160 --> 00:21:15,560 Speaker 1: really good at coming up with these categories if you 433 00:21:15,600 --> 00:21:18,240 Speaker 1: trained them. And then we did it with capuchin monkeys 434 00:21:18,520 --> 00:21:21,080 Speaker 1: and we trained them on a bunch of different photos 435 00:21:21,080 --> 00:21:23,760 Speaker 1: of people, and they were doing pretty well. But the 436 00:21:23,960 --> 00:21:27,000 Speaker 1: experimenters did something really clever. They looked at the mistakes 437 00:21:27,040 --> 00:21:29,560 Speaker 1: the monkeys were making and tried to see if there 438 00:21:29,600 --> 00:21:33,199 Speaker 1: was anything informative in those mistakes. And one mistake the 439 00:21:33,200 --> 00:21:36,639 Speaker 1: monkeys made was they did not count as a picture 440 00:21:36,680 --> 00:21:40,600 Speaker 1: of a human anything that was torso and higher. So 441 00:21:40,720 --> 00:21:42,439 Speaker 1: like, you know, you think of your school pictures from 442 00:21:42,440 --> 00:21:44,280 Speaker 1: when you were a kid, and it's like your torso 443 00:21:44,320 --> 00:21:47,399 Speaker 1: and higher. The monkeys did not think that was a person. 444 00:21:49,520 --> 00:21:51,280 Speaker 2: They're counting legs only, or what? 445 00:21:51,760 --> 00:21:53,760 Speaker 1: We don't know exactly what they were counting, but they 446 00:21:53,800 --> 00:21:56,880 Speaker 1: hadn't generalized a person. There was like a very particular 447 00:21:57,240 --> 00:22:00,320 Speaker 1: set of features that all had to be there. So they hadn't learned exactly 448 00:22:00,400 --> 00:22:04,080 Speaker 1: the category we had in mind. And they counted as 449 00:22:04,119 --> 00:22:07,320 Speaker 1: a person an image of a jackal that was carrying 450 00:22:07,359 --> 00:22:11,480 Speaker 1: a dead flamingo.
And the reason they ended up thinking 451 00:22:11,560 --> 00:22:13,679 Speaker 1: that the monkeys were categorizing that as a person is 452 00:22:13,680 --> 00:22:17,119 Speaker 1: because that image had red in it, and the only 453 00:22:17,280 --> 00:22:20,760 Speaker 1: other images in the training set that had red were 454 00:22:20,800 --> 00:22:23,560 Speaker 1: three of the pictures of people, and so maybe they 455 00:22:23,600 --> 00:22:25,800 Speaker 1: hadn't actually learned what humans were, but they had learned 456 00:22:25,840 --> 00:22:27,639 Speaker 1: something about the image set, and we have problems with 457 00:22:27,680 --> 00:22:28,960 Speaker 1: this with AI training sets also. 458 00:22:29,000 --> 00:22:32,160 Speaker 2: Yeah, there's a famous example of trying to teach 459 00:22:32,160 --> 00:22:34,920 Speaker 2: an AI to tell the difference between wolves and dogs, 460 00:22:35,640 --> 00:22:37,520 Speaker 2: and it was doing really, really well, and then they 461 00:22:37,560 --> 00:22:40,640 Speaker 2: discovered that the pictures of wolves have snow in the background, 462 00:22:40,640 --> 00:22:43,159 Speaker 2: pictures of dogs don't. So it was like learning to 463 00:22:43,200 --> 00:22:45,359 Speaker 2: tell the difference between grass and snow, which is not 464 00:22:45,400 --> 00:22:46,240 Speaker 2: that hard. 465 00:22:46,240 --> 00:22:48,960 Speaker 1: Right, right. And this was an experiment that was designed 466 00:22:48,960 --> 00:22:51,240 Speaker 1: by humans who apparently had tried to think this stuff through. 467 00:22:51,280 --> 00:22:53,639 Speaker 1: It's just really hard. So now that we have a 468 00:22:53,720 --> 00:22:56,280 Speaker 1: sense of how difficult it is to ask these kinds 469 00:22:56,280 --> 00:22:59,400 Speaker 1: of questions, let's take a break, and when we come back, 470 00:22:59,440 --> 00:23:01,480 Speaker 1: we'll move on to the next question, which will bring 471 00:23:01,520 --> 00:23:05,000 Speaker 1: us to some of the hypotheses for what selects for intelligence. 472 00:23:22,200 --> 00:23:24,960 Speaker 2: Okay, we're back, and we are tackling some very difficult 473 00:23:24,960 --> 00:23:28,000 Speaker 2: but fascinating questions from listeners today about the nature of 474 00:23:28,080 --> 00:23:32,159 Speaker 2: intelligence and why bees have not yet gone off planet, and 475 00:23:32,200 --> 00:23:34,639 Speaker 2: we started with a really fun question from Tom about 476 00:23:34,680 --> 00:23:38,320 Speaker 2: how you end up with rocket level intelligence, and Kelly 477 00:23:38,320 --> 00:23:40,160 Speaker 2: shed some light on how hard it is to even 478 00:23:40,200 --> 00:23:43,800 Speaker 2: define and measure intelligence, and threw a wet blanket 479 00:23:43,800 --> 00:23:48,520 Speaker 2: over basically the whole history of that field. And 480 00:23:48,640 --> 00:23:50,920 Speaker 2: so, slightly unusually, we're not going to give a complete 481 00:23:50,920 --> 00:23:53,399 Speaker 2: and definitive answer to Tom's question right now. We're going 482 00:23:53,480 --> 00:23:56,200 Speaker 2: to keep digging into this by answering more questions from 483 00:23:56,240 --> 00:23:58,360 Speaker 2: listeners and then hope at the end to come back 484 00:23:58,359 --> 00:24:01,520 Speaker 2: and give you our best answer to all of your queries. 485 00:24:01,880 --> 00:24:04,600 Speaker 1: All right. So let's go ahead then and jump into 486 00:24:04,680 --> 00:24:05,960 Speaker 1: question two, from Mark.
487 00:24:05,800 --> 00:24:09,080 Speaker 3: Hi, Daniel and Kelly. My question has to do 488 00:24:09,119 --> 00:24:13,360 Speaker 3: with evolution and predation. Do you think it's possible for 489 00:24:13,720 --> 00:24:18,480 Speaker 3: complex intelligent life to evolve without predation? Or do we 490 00:24:18,520 --> 00:24:22,200 Speaker 3: think that predation is necessary for beings with human level 491 00:24:22,240 --> 00:24:27,320 Speaker 3: intelligence or higher intelligence to exist? If there are aliens 492 00:24:27,359 --> 00:24:32,360 Speaker 3: capable of interstellar travel that have never known predation, I imagine 493 00:24:32,280 --> 00:24:35,960 Speaker 3: they would find Earth to be a nightmare, with billions 494 00:24:36,000 --> 00:24:40,880 Speaker 3: of organisms being murdered and consumed by other organisms every day. 495 00:24:41,760 --> 00:24:43,400 Speaker 3: Interested to know your thoughts. Thanks. 496 00:24:43,400 --> 00:24:47,280 Speaker 1: Okay, wow, all right. So when I first listened 497 00:24:47,320 --> 00:24:50,520 Speaker 1: to this question, I was confused because I hadn't heard 498 00:24:50,760 --> 00:24:55,320 Speaker 1: this hypothesis that predation is important for producing intelligent life. 499 00:24:55,720 --> 00:24:57,520 Speaker 1: Had you heard this idea before, Daniel? 500 00:24:57,760 --> 00:25:00,280 Speaker 2: I hadn't thought about it in terms of predation. I thought 501 00:25:00,280 --> 00:25:04,080 Speaker 2: about it in terms of, like, society and culture, that maybe 502 00:25:04,119 --> 00:25:08,040 Speaker 2: an important component of intelligence is having a complex interaction 503 00:25:08,119 --> 00:25:10,840 Speaker 2: between the individuals in the species, more than just, like, 504 00:25:10,920 --> 00:25:13,399 Speaker 2: solitary folks. So I'm not sure how that connects to 505 00:25:13,480 --> 00:25:17,520 Speaker 2: predation or being preyed on, or preying on people, like 506 00:25:17,640 --> 00:25:20,679 Speaker 2: how taking down a mammoth requires like ten to fifteen people. 507 00:25:21,320 --> 00:25:22,480 Speaker 2: Is that the same sort of thing? 508 00:25:22,680 --> 00:25:25,680 Speaker 1: So that's the social intelligence hypothesis, which we're going to 509 00:25:25,720 --> 00:25:29,359 Speaker 1: get to next. But so the idea behind this predation 510 00:25:29,520 --> 00:25:32,600 Speaker 1: hypothesis is that you end up with an arms race 511 00:25:32,680 --> 00:25:35,080 Speaker 1: between predators and prey. So what happens here? So you've 512 00:25:35,119 --> 00:25:37,800 Speaker 1: got a predator that needs to be able to see 513 00:25:37,840 --> 00:25:39,800 Speaker 1: the prey so that it can find it in a 514 00:25:39,840 --> 00:25:41,600 Speaker 1: tree or something. You know, if it's hiding in a tree, 515 00:25:41,640 --> 00:25:43,879 Speaker 1: it kind of blends in, and so that selects for 516 00:25:43,960 --> 00:25:47,639 Speaker 1: them to have better vision or a better memory for 517 00:25:47,840 --> 00:25:51,320 Speaker 1: the kinds of behaviors that their prey typically engages in.
So, 518 00:25:51,359 --> 00:25:53,879 Speaker 1: for example, if a certain kind of prey that's extra 519 00:25:54,000 --> 00:25:57,720 Speaker 1: super delicious is only active at dusk and dawn, then 520 00:25:57,760 --> 00:26:00,640 Speaker 1: maybe you focus your predation efforts at dusk and dawn, 521 00:26:00,720 --> 00:26:03,240 Speaker 1: because you've remembered that that's when the prey that's super 522 00:26:03,280 --> 00:26:06,760 Speaker 1: tasty is active. And on the other hand, the prey 523 00:26:07,080 --> 00:26:11,280 Speaker 1: might remember that when I smell cat urine, that means 524 00:26:11,320 --> 00:26:13,159 Speaker 1: that there's a cat in the area, and so anywhere 525 00:26:13,200 --> 00:26:15,119 Speaker 1: I smell cat urine, I'm just not going to hang 526 00:26:15,160 --> 00:26:18,640 Speaker 1: out anymore. And so you end up with this competition 527 00:26:18,720 --> 00:26:21,280 Speaker 1: for sensory systems and the ability to sort of, like, 528 00:26:21,359 --> 00:26:24,840 Speaker 1: think through risks and benefits that over time results in 529 00:26:24,880 --> 00:26:26,160 Speaker 1: smarter and smarter brains. 530 00:26:26,520 --> 00:26:30,160 Speaker 2: So is the argument just that smarter predators are more 531 00:26:30,200 --> 00:26:33,720 Speaker 2: successful and smarter prey are more successful, that it's one 532 00:26:33,800 --> 00:26:37,440 Speaker 2: element of success in this sort of predator versus prey environment? 533 00:26:37,640 --> 00:26:38,480 Speaker 1: Yep, exactly. 534 00:26:38,720 --> 00:26:41,040 Speaker 2: But there's lots of predator prey situations, and a lot 535 00:26:41,080 --> 00:26:42,920 Speaker 2: of times one or both of them are kind of 536 00:26:42,960 --> 00:26:46,719 Speaker 2: dumb. Deer are not that smart, for example, though they've been 537 00:26:46,840 --> 00:26:50,080 Speaker 2: highly selected for, because predators like wolves have been gobbling up 538 00:26:50,080 --> 00:26:53,480 Speaker 2: deer for many years. Is this just one possible way 539 00:26:53,520 --> 00:26:56,080 Speaker 2: evolution can go, or is the argument that prey and 540 00:26:56,119 --> 00:26:58,119 Speaker 2: predation always lead to intelligence? 541 00:26:58,400 --> 00:27:01,840 Speaker 1: Yeah. So, first I'll note that it wouldn't surprise me 542 00:27:01,880 --> 00:27:03,960 Speaker 1: to find out that deer have gotten more stupid over 543 00:27:04,040 --> 00:27:07,800 Speaker 1: time, because there are not tons of hunters out there and 544 00:27:07,840 --> 00:27:10,879 Speaker 1: we've killed most of their predators on this continent, so 545 00:27:11,280 --> 00:27:14,120 Speaker 1: they are perhaps a little bit relieved from the pressure 546 00:27:14,119 --> 00:27:17,399 Speaker 1: of this arms race, and maybe that explains their intelligence. 547 00:27:17,520 --> 00:27:19,480 Speaker 1: But yeah, so I think to me, that's one of 548 00:27:19,520 --> 00:27:22,760 Speaker 1: the issues with this hypothesis: it's not, as 549 00:27:22,760 --> 00:27:26,080 Speaker 1: far as I can tell, really clear about which sets 550 00:27:26,080 --> 00:27:28,679 Speaker 1: of predators and prey, or which kinds of predators, should 551 00:27:28,720 --> 00:27:31,439 Speaker 1: end up, you know, maxing out being super smart, and 552 00:27:31,520 --> 00:27:33,600 Speaker 1: which prey should end up being super smart.
But it's 553 00:27:33,640 --> 00:27:36,040 Speaker 1: sort of more of a, like, well, this is why 554 00:27:36,400 --> 00:27:39,120 Speaker 1: you end up with complex thought at all, and then 555 00:27:39,240 --> 00:27:41,960 Speaker 1: it just sort of depends on how much complex thought 556 00:27:41,960 --> 00:27:44,200 Speaker 1: you need, depending on what kinds of races you find 557 00:27:44,200 --> 00:27:46,879 Speaker 1: yourself in and with what kinds of organisms. 558 00:27:47,000 --> 00:27:49,239 Speaker 2: Because it's not hard to think of other examples, you know, 559 00:27:49,320 --> 00:27:53,720 Speaker 2: like spiders and flies, right? Flies are very good at evading spiders, 560 00:27:53,720 --> 00:27:56,040 Speaker 2: but they're not using intelligence. They just like have a 561 00:27:56,240 --> 00:27:59,119 Speaker 2: crazy hair trigger and can jump out of the 562 00:27:59,160 --> 00:28:00,960 Speaker 2: way in a blink. 563 00:28:00,880 --> 00:28:03,320 Speaker 1: I mean, that's pretty impressive as well. Like, you know, those 564 00:28:03,400 --> 00:28:06,280 Speaker 1: crazy hair triggers could be because, you know, you've had 565 00:28:06,320 --> 00:28:10,439 Speaker 1: selection for hairs that are particularly sensitive to, you know, 566 00:28:10,720 --> 00:28:14,120 Speaker 1: spider webs, or maybe their visual systems are particularly good 567 00:28:14,160 --> 00:28:17,359 Speaker 1: at seeing the way light reflects off of a spider 568 00:28:17,359 --> 00:28:19,440 Speaker 1: web with some dew on it. But yes, I don't 569 00:28:19,440 --> 00:28:23,320 Speaker 1: think of flies as being particularly brilliant, although they outsmart 570 00:28:23,320 --> 00:28:25,600 Speaker 1: me when I try to smack them, so maybe they're 571 00:28:25,600 --> 00:28:26,720 Speaker 1: smarter than I give them credit for. 572 00:28:27,240 --> 00:28:30,720 Speaker 2: They're definitely amazing biological engineering. I don't know if it 573 00:28:30,760 --> 00:28:33,800 Speaker 2: counts as intelligence, but now we're back into that morass 574 00:28:33,840 --> 00:28:36,000 Speaker 2: of what is smart anyway. But I wouldn't put flies 575 00:28:36,080 --> 00:28:37,040 Speaker 2: near the top of the list. 576 00:28:37,840 --> 00:28:39,960 Speaker 1: I haven't read a lot of papers where flies were 577 00:28:39,960 --> 00:28:43,360 Speaker 1: like solving complex problems, although I do think that they 578 00:28:43,440 --> 00:28:45,680 Speaker 1: keep track of their social structures, because they fight with 579 00:28:45,760 --> 00:28:47,600 Speaker 1: each other, and I think who they fight with depends 580 00:28:47,600 --> 00:28:49,240 Speaker 1: on, you know, who they fought with in the past. 581 00:28:49,320 --> 00:28:50,920 Speaker 1: So maybe there's a lot going on there that we 582 00:28:50,960 --> 00:28:51,440 Speaker 1: can't see. 583 00:28:51,520 --> 00:28:53,680 Speaker 2: All right. But you must have some fun examples of 584 00:28:53,680 --> 00:28:56,440 Speaker 2: predator and prey who have developed some clever intelligence.
585 00:28:56,600 --> 00:29:00,400 Speaker 1: I am particularly excited about chimps and New Caledonian crows 586 00:29:00,440 --> 00:29:03,800 Speaker 1: who have managed to, like, pull stems off of trees 587 00:29:03,840 --> 00:29:06,959 Speaker 1: and in some cases even, like, remove leaves and stuff 588 00:29:07,000 --> 00:29:11,120 Speaker 1: to make particularly good tools for going inside of, like, trees, 589 00:29:11,160 --> 00:29:13,000 Speaker 1: so that you can get bugs that are inside. 590 00:29:13,240 --> 00:29:14,920 Speaker 2: That's tool use. That's clever. Yeah. 591 00:29:15,000 --> 00:29:15,200 Speaker 3: Yeah. 592 00:29:15,240 --> 00:29:17,640 Speaker 1: And you'll find stories about crows who will sometimes, like, 593 00:29:17,680 --> 00:29:20,600 Speaker 1: if they have a metal implement, they'll bend it so 594 00:29:20,600 --> 00:29:22,960 Speaker 1: that they can sort of hook something out. There have 595 00:29:22,960 --> 00:29:25,840 Speaker 1: been stories of crows that are able to, like, drop 596 00:29:26,000 --> 00:29:29,400 Speaker 1: rocks into containers of water so that the water can 597 00:29:29,440 --> 00:29:31,040 Speaker 1: come up high enough so that they can drink it. 598 00:29:31,200 --> 00:29:34,080 Speaker 1: Oh wow. Like, they do some pretty incredible things. Thinking 599 00:29:34,120 --> 00:29:36,880 Speaker 1: about prey, there are octopuses that do things like use 600 00:29:37,000 --> 00:29:40,239 Speaker 1: halves of coconut shells, and instead of using them like 601 00:29:40,280 --> 00:29:46,000 Speaker 1: in Monty Python to make the sound of a horse galloping. No, 602 00:29:46,280 --> 00:29:50,080 Speaker 1: I know, I don't understand, they pick up the coconut shells, 603 00:29:50,480 --> 00:29:52,840 Speaker 1: clean them out, get inside of them, and then carry 604 00:29:52,840 --> 00:29:54,480 Speaker 1: them with them from place to place so that they've 605 00:29:54,480 --> 00:29:56,480 Speaker 1: got a shelter to hide from anything that might want 606 00:29:56,480 --> 00:29:59,120 Speaker 1: to harass or try to eat them. Okay, so there 607 00:29:59,160 --> 00:30:02,560 Speaker 1: are some amazing examples of animals that show a lot 608 00:30:02,560 --> 00:30:05,680 Speaker 1: of intelligence to either avoid being eaten or get their food. 609 00:30:06,080 --> 00:30:08,000 Speaker 2: Those are both definitely smarter than flies. 610 00:30:08,360 --> 00:30:11,720 Speaker 1: Well, but again, you know, you're biased by what sounds 611 00:30:11,760 --> 00:30:14,600 Speaker 1: important to you and what you can understand. Well, but yeah, 612 00:30:14,640 --> 00:30:17,240 Speaker 1: I'll give you that. I hate flies. Dipterans are the worst. 613 00:30:19,160 --> 00:30:20,719 Speaker 1: My daughter knows I hate dipterans. 614 00:30:20,840 --> 00:30:21,920 Speaker 2: What are dipterans? 615 00:30:22,080 --> 00:30:23,680 Speaker 1: They're, like, flies and mosquitoes. 616 00:30:24,200 --> 00:30:27,200 Speaker 2: Oh god, mosquitoes, absolutely the worst. If I could delete 617 00:30:27,200 --> 00:30:30,320 Speaker 2: mosquitoes from the universe, I totally would do it. I 618 00:30:30,400 --> 00:30:33,240 Speaker 2: mostly love living creatures, and I appreciate that everything contributes 619 00:30:33,280 --> 00:30:36,520 Speaker 2: to the web of life. But mosquitoes, man, just delete those.
620 00:30:36,680 --> 00:30:39,040 Speaker 1: I know you were telling us in another episode recently 621 00:30:39,040 --> 00:30:41,240 Speaker 1: that you guys got some Chinese mosquitoes that are now 622 00:30:41,320 --> 00:30:43,440 Speaker 1: all over the place and that you really hate them, 623 00:30:43,480 --> 00:30:44,600 Speaker 1: and I feel for you. 624 00:30:45,160 --> 00:30:47,200 Speaker 2: Nothing against the Chinese, nothing, no, no, of. 625 00:30:47,080 --> 00:30:49,120 Speaker 1: Course, of course, yep, nope. Didn't mean to imply that. 626 00:30:49,640 --> 00:30:53,120 Speaker 1: But so another part of Mark's question was would aliens 627 00:30:53,200 --> 00:30:56,720 Speaker 1: find our planet appalling because of all of the killings. So, 628 00:30:56,880 --> 00:30:59,880 Speaker 1: if predation is important for intelligence, and predation is, like, 629 00:31:00,080 --> 00:31:03,080 Speaker 1: all over the place, could you end up with an 630 00:31:03,080 --> 00:31:06,959 Speaker 1: intelligent alien species where there wasn't any predation? Is there 631 00:31:07,040 --> 00:31:11,560 Speaker 1: some other selection pressure that produces intelligent aliens, and what 632 00:31:11,560 --> 00:31:14,680 Speaker 1: would aliens think of all of the killing? And you know, 633 00:31:14,920 --> 00:31:17,000 Speaker 1: I don't know, it might be appalling. There's an interesting 634 00:31:17,000 --> 00:31:21,000 Speaker 1: book called The Sparrow by Mary Doria Russell, and I 635 00:31:21,040 --> 00:31:24,280 Speaker 1: do not recommend this book to children. Maybe read the 636 00:31:24,280 --> 00:31:27,440 Speaker 1: Wikipedia summary before you read it. It covers some intense topics, 637 00:31:27,760 --> 00:31:30,560 Speaker 1: but it's an interesting look at how humans feel when 638 00:31:30,560 --> 00:31:32,960 Speaker 1: they go to an alien civilization and see how they 639 00:31:33,000 --> 00:31:36,800 Speaker 1: deal with their predator prey relationships. But would you tell 640 00:31:36,880 --> 00:31:39,120 Speaker 1: us about Arik Kershenbaum's book? 641 00:31:39,400 --> 00:31:42,120 Speaker 2: Yeah. He's a biologist at Cambridge and he wrote this 642 00:31:42,160 --> 00:31:45,960 Speaker 2: book called The Zoologist's Guide to the Galaxy: What Animals 643 00:31:46,000 --> 00:31:49,920 Speaker 2: on Earth Reveal About Aliens and Ourselves. And I'll be honest, 644 00:31:49,960 --> 00:31:51,760 Speaker 2: when I first saw this book, I thought, oh no, 645 00:31:52,120 --> 00:31:54,640 Speaker 2: he scooped me, because I was at the same time 646 00:31:54,720 --> 00:31:57,000 Speaker 2: working on a book about what we can learn about 647 00:31:57,000 --> 00:31:59,360 Speaker 2: aliens from examples here on Earth. But I was more 648 00:31:59,400 --> 00:32:01,880 Speaker 2: focused on the side of it. And his book is 649 00:32:01,920 --> 00:32:05,800 Speaker 2: really fascinating. It's essentially looking at examples of evolution on 650 00:32:05,840 --> 00:32:08,480 Speaker 2: Earth and trying to draw conclusions about what might be 651 00:32:08,600 --> 00:32:12,160 Speaker 2: universal and what might be local and unique to Earth. It's, 652 00:32:12,200 --> 00:32:14,640 Speaker 2: of course, impossible to know for sure, but you know, 653 00:32:14,640 --> 00:32:17,000 Speaker 2: when you see things pop up over and over again, 654 00:32:17,440 --> 00:32:19,960 Speaker 2: you can argue that they might be more common.
And 655 00:32:20,000 --> 00:32:22,360 Speaker 2: when you see something that only happened once on Earth, 656 00:32:22,640 --> 00:32:25,080 Speaker 2: you can argue that maybe it's rare and therefore might 657 00:32:25,160 --> 00:32:28,960 Speaker 2: not be everywhere in the universe. And something he talked 658 00:32:28,960 --> 00:32:31,000 Speaker 2: a lot about was predation. 659 00:32:31,000 --> 00:32:33,280 Speaker 1: And you provided me with a quote where he even 660 00:32:33,320 --> 00:32:37,120 Speaker 1: says no ecosystem can exist for long without someone trying 661 00:32:37,160 --> 00:32:38,880 Speaker 1: to take a bite out of somebody else. 662 00:32:39,640 --> 00:32:44,520 Speaker 2: Yeah, exactly. He thinks it's essentially inevitable, because you'll develop mobility, 663 00:32:44,920 --> 00:32:47,280 Speaker 2: and then you'll start to consume, and then you'll start 664 00:32:47,280 --> 00:32:49,800 Speaker 2: to consume other folks, and then other folks will try 665 00:32:49,800 --> 00:32:53,120 Speaker 2: to get away from you. And so when you get big, 666 00:32:53,200 --> 00:32:55,600 Speaker 2: you get noisy, and people develop ears and then they 667 00:32:55,680 --> 00:32:58,240 Speaker 2: use those to hunt, and so he thinks predation is 668 00:32:58,280 --> 00:33:03,040 Speaker 2: totally inevitable. I asked him directly, like, do you think aliens eat each other? 669 00:33:03,080 --> 00:33:06,280 Speaker 1: And he was like, oh, yeah, absolutely. Okay, so at 670 00:33:06,360 --> 00:33:09,080 Speaker 1: least that argues that if aliens visit Earth, they shouldn't 671 00:33:09,120 --> 00:33:12,560 Speaker 1: find us too appalling, although maybe they'd be shocked that 672 00:33:12,600 --> 00:33:16,040 Speaker 1: we eat lobsters, because gross. 673 00:33:17,120 --> 00:33:20,719 Speaker 2: Lobsters are like just huge ocean cockroaches. I don't get it, 674 00:33:20,760 --> 00:33:22,200 Speaker 2: like, they look really gross to me. 675 00:33:22,320 --> 00:33:26,040 Speaker 1: Oh yeah, I'll pass. Sorry, I'm not into it. So, 676 00:33:26,160 --> 00:33:29,400 Speaker 1: to be honest, when I was researching hypotheses for why 677 00:33:29,760 --> 00:33:32,960 Speaker 1: intelligence pops up from time to time, I didn't come 678 00:33:33,000 --> 00:33:37,520 Speaker 1: across this predation hypothesis often. The hypothesis that I came 679 00:33:37,560 --> 00:33:42,040 Speaker 1: across the most often is what's called the social intelligence hypothesis. 680 00:33:42,280 --> 00:33:44,760 Speaker 1: This is what Daniel was talking about earlier, which is 681 00:33:44,800 --> 00:33:48,320 Speaker 1: that group living selects for intelligence, because you need intelligence 682 00:33:48,400 --> 00:33:51,400 Speaker 1: to do things like track members of a group. 683 00:33:51,480 --> 00:33:54,360 Speaker 1: So remember, like, you know, when I was cooperating with Beth, 684 00:33:54,920 --> 00:33:56,640 Speaker 1: you know, I pulled some ticks off her back, and 685 00:33:56,680 --> 00:33:58,600 Speaker 1: the next day she pulled some ticks off my back. 686 00:33:58,680 --> 00:34:00,680 Speaker 1: So that's great, I'll keep working with her. But I 687 00:34:00,720 --> 00:34:03,360 Speaker 1: pulled some ticks off Frank's back, and that jerk never 688 00:34:03,400 --> 00:34:05,000 Speaker 1: returned the favor. So I'm not going to work 689 00:34:05,000 --> 00:34:08,279 Speaker 1: again with Frank.
And so it helps you, like, keep 690 00:34:08,360 --> 00:34:11,400 Speaker 1: track of individuals in your group and how you should 691 00:34:11,760 --> 00:34:13,200 Speaker 1: react to each of them differently. 692 00:34:13,800 --> 00:34:17,839 Speaker 2: Essentially, politics is complicated and it takes real intelligence, and 693 00:34:17,880 --> 00:34:21,040 Speaker 2: in a society, politics are important for survival. You have 694 00:34:21,080 --> 00:34:24,000 Speaker 2: to build coalitions, you have to worry about your enemies, 695 00:34:24,120 --> 00:34:26,640 Speaker 2: you have to worry about being betrayed by your allies. 696 00:34:27,080 --> 00:34:28,160 Speaker 2: It's complicated stuff. 697 00:34:28,239 --> 00:34:30,600 Speaker 1: Well, and you need to communicate with each other, so 698 00:34:30,640 --> 00:34:32,600 Speaker 1: that if you're, you know, a bee, for example, you 699 00:34:32,640 --> 00:34:35,080 Speaker 1: can waggle and tell everyone where the good food source is. 700 00:34:35,239 --> 00:34:37,719 Speaker 1: Or you need to, if you're wolves, coordinate so you 701 00:34:37,719 --> 00:34:39,840 Speaker 1: can take down the moose so that you can feed everyone. 702 00:34:40,440 --> 00:34:44,160 Speaker 1: So social behavior requires a lot of intelligence to work out. 703 00:34:44,480 --> 00:34:47,200 Speaker 2: Have you ever read the books about the Wolves of Yellowstone? 704 00:34:47,520 --> 00:34:48,200 Speaker 1: Oh, have you? 705 00:34:48,640 --> 00:34:51,680 Speaker 2: Yeah, there's a fascinating series of books written by a 706 00:34:51,719 --> 00:34:54,239 Speaker 2: guy who spent like forty years getting up at like 707 00:34:54,280 --> 00:34:56,920 Speaker 2: three in the morning and just watching the wolves, and 708 00:34:57,080 --> 00:34:59,160 Speaker 2: each one has a number, and he just got to know 709 00:34:59,239 --> 00:35:03,080 Speaker 2: them all and wrote these books about their politics. And 710 00:35:03,120 --> 00:35:07,120 Speaker 2: it's like a soap opera. You know, betrayals and arguments 711 00:35:07,160 --> 00:35:10,040 Speaker 2: and, you know, love affairs and all sorts of crazy 712 00:35:10,080 --> 00:35:12,279 Speaker 2: stuff, and wolves going off to start their own packs 713 00:35:12,280 --> 00:35:14,440 Speaker 2: and then coming back and fighting their parents, and like 714 00:35:14,800 --> 00:35:17,600 Speaker 2: it's dramatic stuff, and it leaves you with this sense 715 00:35:17,840 --> 00:35:20,799 Speaker 2: of like, wow, this is a society. You know, these 716 00:35:20,800 --> 00:35:23,640 Speaker 2: are intelligent creatures and they have all the same kind 717 00:35:23,680 --> 00:35:26,440 Speaker 2: of politics that we do. It really is a soap opera. 718 00:35:26,560 --> 00:35:27,800 Speaker 2: It's fascinating reading. 719 00:35:27,960 --> 00:35:31,640 Speaker 1: Robert Sapolsky wrote a book about baboons called A Primate's Memoir, 720 00:35:31,680 --> 00:35:33,919 Speaker 1: which is sort of the same. It really becomes clear 721 00:35:33,960 --> 00:35:36,359 Speaker 1: how they're a society that has a lot of similarities 722 00:35:36,360 --> 00:35:39,000 Speaker 1: to how humans work. And also he's a fantastic writer. 723 00:35:39,080 --> 00:35:40,400 Speaker 1: I really enjoyed that memoir.
724 00:35:40,600 --> 00:35:42,920 Speaker 2: So I really recommend these books. They're by Rick McIntyre, 725 00:35:42,960 --> 00:35:44,880 Speaker 2: and one of them is called, for example, The Rise of 726 00:35:44,960 --> 00:35:49,720 Speaker 2: Wolf 8. They're fantastic books. Really couldn't recommend them more highly. Anyway, 727 00:35:49,719 --> 00:35:52,480 Speaker 2: go ahead, tell us about other intelligent critters. 728 00:35:52,640 --> 00:35:55,640 Speaker 1: Yeah. So in the nineteen nineties it was observed that 729 00:35:55,680 --> 00:35:59,800 Speaker 1: there was this positive correlation with neocortex size. So 730 00:35:59,840 --> 00:36:01,879 Speaker 1: this is like the outer part of our brain that's 731 00:36:01,920 --> 00:36:04,239 Speaker 1: close to our skull, and it's thought to be sort 732 00:36:04,239 --> 00:36:09,000 Speaker 1: of like a more recent mammalian thing. And so monkeys 733 00:36:09,000 --> 00:36:13,160 Speaker 1: and apes that had bigger neocortexes were also thought 734 00:36:13,200 --> 00:36:14,880 Speaker 1: to be smarter. And this part of the brain is 735 00:36:14,960 --> 00:36:17,280 Speaker 1: used for things like language and sensory perception. 736 00:36:17,560 --> 00:36:19,800 Speaker 2: Can I stop you right there and be a little skeptical? 737 00:36:20,000 --> 00:36:22,040 Speaker 2: Anytime somebody says something like that, like this part of 738 00:36:22,040 --> 00:36:24,439 Speaker 2: the brain is used for XYZ, I'm like, we don't 739 00:36:24,560 --> 00:36:28,439 Speaker 2: understand the brain. Isn't this some, like, correlational study using 740 00:36:28,560 --> 00:36:30,879 Speaker 2: fMRI, and it's like, well, there's more blood flow here 741 00:36:30,960 --> 00:36:33,600 Speaker 2: when we show them this picture of a baboon, and 742 00:36:33,640 --> 00:36:36,719 Speaker 2: so therefore, right, it's all pretty fuzzy. I feel like 743 00:36:36,719 --> 00:36:39,400 Speaker 2: there should be a huge asterisk anytime somebody says this 744 00:36:39,480 --> 00:36:40,560 Speaker 2: part of the brain does X. 745 00:36:40,600 --> 00:36:43,520 Speaker 1: OMG, yes, yeah, no, I totally agree with you, 746 00:36:43,560 --> 00:36:45,399 Speaker 1: and so we'll move this point up a little bit. 747 00:36:45,440 --> 00:36:47,000 Speaker 1: One of the issues with this field is that to 748 00:36:47,040 --> 00:36:49,799 Speaker 1: try to support the hypothesis, they'll do things like look 749 00:36:49,800 --> 00:36:54,400 Speaker 1: at relative brain size and correlate that with things that 750 00:36:54,440 --> 00:36:56,719 Speaker 1: are related to social behavior. Like, if you are a 751 00:36:56,760 --> 00:36:59,479 Speaker 1: species that tends to live in groups of five rather 752 00:36:59,520 --> 00:37:02,239 Speaker 1: than, you know, by yourself, they'll look to see: do 753 00:37:02,280 --> 00:37:04,960 Speaker 1: you have a bigger brain if you live in groups 754 00:37:04,960 --> 00:37:06,840 Speaker 1: of five, or if you're a species that lives in 755 00:37:06,880 --> 00:37:09,560 Speaker 1: a group of just one? Does that make sense? And 756 00:37:09,600 --> 00:37:13,120 Speaker 1: so there's a lot of these correlations, but there's a 757 00:37:13,160 --> 00:37:16,279 Speaker 1: lot of argument about what you should be measuring in 758 00:37:16,320 --> 00:37:18,840 Speaker 1: a brain to begin with, like, what is the relevant 759 00:37:18,840 --> 00:37:20,440 Speaker 1: part of the brain you should be measuring.
Is it 760 00:37:20,480 --> 00:37:23,880 Speaker 1: the whole brain? Is it the neocortex? Is it actually 761 00:37:23,920 --> 00:37:26,759 Speaker 1: the density of neurons in a very particular part of 762 00:37:26,800 --> 00:37:29,640 Speaker 1: the brain that matters? And additionally, you know, there are 763 00:37:29,719 --> 00:37:34,040 Speaker 1: animals that can do similarly smart tasks that have very 764 00:37:34,040 --> 00:37:36,080 Speaker 1: different kinds of brains. You know, how do you compare, 765 00:37:36,160 --> 00:37:39,520 Speaker 1: for example, a primate brain and a bird brain? And 766 00:37:39,600 --> 00:37:41,560 Speaker 1: so one of the big critiques in this field is 767 00:37:41,600 --> 00:37:44,640 Speaker 1: that they do rely a lot on different measures of 768 00:37:44,680 --> 00:37:48,239 Speaker 1: brains that have not necessarily been locked in as the 769 00:37:48,400 --> 00:37:51,279 Speaker 1: correct thing to be measuring when you're interested in intelligence. 770 00:37:51,800 --> 00:37:54,239 Speaker 2: Right. And we mentioned these as critiques of the field 771 00:37:54,320 --> 00:37:56,719 Speaker 2: not because we think these folks don't know what they're doing. Yeah, 772 00:37:56,760 --> 00:37:59,120 Speaker 2: they're following a glorious tradition in science, which is like 773 00:37:59,440 --> 00:38:02,480 Speaker 2: tackling an impossibly hard question by doing the only things 774 00:38:02,520 --> 00:38:04,799 Speaker 2: they know how to do. And often this leads to 775 00:38:04,920 --> 00:38:07,120 Speaker 2: breakthroughs you didn't expect. It's better than just sitting on 776 00:38:07,120 --> 00:38:09,839 Speaker 2: the couch and saying, well, it's impossible, let's not do it. 777 00:38:10,239 --> 00:38:13,239 Speaker 2: But another glorious tradition in science is being open about 778 00:38:13,280 --> 00:38:15,480 Speaker 2: the critiques, of what we really do know and what 779 00:38:15,560 --> 00:38:18,640 Speaker 2: we don't know, and being careful about drawing too broad 780 00:38:18,680 --> 00:38:19,280 Speaker 2: a conclusion. 781 00:38:19,480 --> 00:38:22,920 Speaker 1: Yeah, exactly. There have been a lot of correlations, where 782 00:38:23,040 --> 00:38:27,680 Speaker 1: they have found that complicated social behaviors in insects and 783 00:38:27,719 --> 00:38:31,239 Speaker 1: birds and mammals are correlated with either bigger brains or 784 00:38:31,280 --> 00:38:35,040 Speaker 1: bigger brain parts. And that's at least suggestive. And again, 785 00:38:35,080 --> 00:38:37,759 Speaker 1: it's really hard to compare between two different species. What 786 00:38:37,840 --> 00:38:40,239 Speaker 1: is the right measurement for how intelligent they are? How 787 00:38:40,239 --> 00:38:42,640 Speaker 1: do you rank them? You know, as you're sort of 788 00:38:43,280 --> 00:38:46,200 Speaker 1: stumbling around in a dark room with no light on, 789 00:38:46,640 --> 00:38:48,880 Speaker 1: you've got to start somewhere. And I feel like they 790 00:38:48,960 --> 00:38:51,960 Speaker 1: have made some exciting progress and they're having these conversations 791 00:38:51,960 --> 00:38:54,600 Speaker 1: about what is the right thing to be measuring, and, 792 00:38:54,800 --> 00:38:56,160 Speaker 1: you know, where do we go from here?
And if we 793 00:38:56,239 --> 00:38:59,040 Speaker 1: found something tantalizing here, how do we follow up and 794 00:38:59,080 --> 00:39:01,319 Speaker 1: get, like, a better answer based on this sort of 795 00:39:01,320 --> 00:39:05,160 Speaker 1: tantalizing initial result? That's pretty much where we are right now. 796 00:39:05,200 --> 00:39:07,640 Speaker 1: There's this interesting correlation, but we're still trying to figure 797 00:39:07,640 --> 00:39:10,359 Speaker 1: out what to measure and how to measure it. 798 00:39:10,880 --> 00:39:13,160 Speaker 2: So when you say correlation, you're saying there's a correlation 799 00:39:13,239 --> 00:39:18,000 Speaker 2: between brain size and, like, social activity, and that suggests 800 00:39:18,080 --> 00:39:21,520 Speaker 2: that bigger brains are smarter and social activity is connected to 801 00:39:21,600 --> 00:39:22,120 Speaker 2: being smarter. 802 00:39:22,680 --> 00:39:24,719 Speaker 1: Yeah, that if you've got a bigger brain, you are 803 00:39:24,840 --> 00:39:27,480 Speaker 1: more able to, for example, live in a bigger social 804 00:39:27,520 --> 00:39:30,480 Speaker 1: group, where you presumably have more, you know, sort of 805 00:39:30,520 --> 00:39:33,120 Speaker 1: peers or conspecifics that you need to keep track of 806 00:39:33,160 --> 00:39:35,600 Speaker 1: so that you can remember their prior behaviors and you 807 00:39:35,640 --> 00:39:39,640 Speaker 1: can tell the difference between them. And so more complicated 808 00:39:39,680 --> 00:39:44,280 Speaker 1: social systems tend to be correlated with bigger brains relative 809 00:39:44,320 --> 00:39:48,839 Speaker 1: to body size, or bigger brain parts. Sometimes this falls apart. 810 00:39:48,920 --> 00:39:52,480 Speaker 1: For example, little owls, which is Athene noctua. It's a 811 00:39:52,600 --> 00:39:55,239 Speaker 1: species of owl that is super cute. They are not 812 00:39:55,400 --> 00:39:58,680 Speaker 1: social, like, at all, but they got real big forebrains, 813 00:39:58,760 --> 00:40:01,279 Speaker 1: and so it's kind of a surprise that they're, like, not 814 00:40:01,440 --> 00:40:03,880 Speaker 1: super social, but they got big old brains. Why, we 815 00:40:03,920 --> 00:40:04,279 Speaker 1: don't know. 816 00:40:04,600 --> 00:40:07,160 Speaker 2: Maybe they're sitting there thinking about eleven dimensional string theory, 817 00:40:07,200 --> 00:40:08,200 Speaker 2: and maybe they know the answer. 818 00:40:08,200 --> 00:40:09,880 Speaker 1: Oh my god, we have to learn to talk to 819 00:40:09,920 --> 00:40:10,760 Speaker 1: the little owls. 820 00:40:12,680 --> 00:40:15,200 Speaker 2: Tell me, little owl, what is the nature of the universe? 821 00:40:15,440 --> 00:40:15,600 Speaker 4: Oh? 822 00:40:15,680 --> 00:40:17,320 Speaker 1: That should be a poem. Like, that sounds like the 823 00:40:17,360 --> 00:40:18,799 Speaker 1: kind of poem you sing to a kid when they're 824 00:40:18,800 --> 00:40:19,279 Speaker 1: going to bed. 825 00:40:22,040 --> 00:40:23,839 Speaker 2: All right, let's write that children's book. 826 00:40:23,920 --> 00:40:26,879 Speaker 1: All right. Okay. So now we're going to take a break, 827 00:40:26,880 --> 00:40:28,640 Speaker 1: and when we get back, we're going to talk about 828 00:40:28,680 --> 00:40:31,600 Speaker 1: one of the other hypotheses for the evolution of intelligence 829 00:40:31,880 --> 00:40:34,320 Speaker 1: as we answer our third listener question.
830 00:40:51,160 --> 00:40:54,560 Speaker 2: Okay, we are back, and we're talking about smartness across 831 00:40:54,600 --> 00:40:58,319 Speaker 2: the animal and alien kingdoms. Is intelligence inevitable? What can 832 00:40:58,360 --> 00:41:01,160 Speaker 2: we learn from life on Earth and the patterns of 833 00:41:01,200 --> 00:41:05,719 Speaker 2: intelligence, or deer and fly like dumbness, on Earth? Is 834 00:41:05,760 --> 00:41:08,279 Speaker 2: there something we can tell about why humans ended up 835 00:41:08,520 --> 00:41:10,440 Speaker 2: so dang smart? 836 00:41:11,000 --> 00:41:13,640 Speaker 5: We as humans have long looked to the stars with 837 00:41:13,719 --> 00:41:16,560 Speaker 5: wonder and awe, trying to figure out exactly what the 838 00:41:16,640 --> 00:41:19,880 Speaker 5: universe is, what is it made of, what is our 839 00:41:19,920 --> 00:41:22,799 Speaker 5: place in it? And ultimately, I guess, to try to 840 00:41:22,840 --> 00:41:26,080 Speaker 5: answer the age old question: are we alone in the universe? 841 00:41:27,000 --> 00:41:31,120 Speaker 5: So if one of our Mars rovers or Europa Clipper 842 00:41:31,280 --> 00:41:34,879 Speaker 5: found even microbial signs of life out there, it would 843 00:41:34,880 --> 00:41:38,080 Speaker 5: of course be a game changer, a seismic event, to 844 00:41:38,160 --> 00:41:41,880 Speaker 5: be sure. And while it would be scientifically exciting, I 845 00:41:41,920 --> 00:41:45,680 Speaker 5: think it could also be somewhat disappointing, because when most 846 00:41:45,680 --> 00:41:47,960 Speaker 5: of us look to the stars in hopes of finding 847 00:41:48,040 --> 00:41:51,200 Speaker 5: signs of life, I think we envision not just a 848 00:41:51,239 --> 00:41:57,040 Speaker 5: bacterial colony, but an actual, functioning civilization. So my question 849 00:41:57,160 --> 00:42:00,000 Speaker 5: to you is this: as you look at life here 850 00:42:00,040 --> 00:42:04,160 Speaker 5: on Earth, is there any indication at all that intelligence 851 00:42:04,680 --> 00:42:08,480 Speaker 5: is inevitable? I mean, given enough time for evolution to 852 00:42:08,520 --> 00:42:11,799 Speaker 5: do its thing, do you believe sentient beings would be 853 00:42:11,840 --> 00:42:15,440 Speaker 5: the ultimate result? And to put a finer point on that, 854 00:42:16,160 --> 00:42:19,200 Speaker 5: imagine if the asteroid that wiped out the dinosaurs had 855 00:42:19,280 --> 00:42:22,960 Speaker 5: narrowly missed the Earth and that extinction event never occurred. 856 00:42:23,600 --> 00:42:26,480 Speaker 5: Let's say mammals remained as nothing more than small, furry, 857 00:42:26,480 --> 00:42:29,520 Speaker 5: little rodent like creatures and did not lead to the 858 00:42:29,600 --> 00:42:33,400 Speaker 5: rise of the human species. Under those circumstances, do you 859 00:42:33,440 --> 00:42:36,200 Speaker 5: have any reason to believe that there might be an 860 00:42:36,200 --> 00:42:41,920 Speaker 5: intelligent civilization developed by reptilian creatures instead of warm blooded 861 00:42:41,920 --> 00:42:45,160 Speaker 5: things like us? Do you think dinosaurs could be running 862 00:42:45,160 --> 00:42:46,800 Speaker 5: the show these days? 863 00:42:47,600 --> 00:42:48,000 Speaker 5: Anyway, 864 00:42:48,520 --> 00:42:52,239 Speaker 5: thank you for your consideration in answering this, and I 865 00:42:52,400 --> 00:42:54,759 Speaker 5: look forward to hearing what you say.
Love the show, 866 00:42:54,880 --> 00:42:55,800 Speaker 5: keep up the good work. 867 00:42:56,120 --> 00:43:00,920 Speaker 1: So Howard wanted to know if intelligence is inevitable, and, 868 00:43:01,080 --> 00:43:04,240 Speaker 1: man, I don't know the answer, and it probably depends 869 00:43:04,239 --> 00:43:06,600 Speaker 1: on what the bar is for intelligence. But if you're 870 00:43:06,600 --> 00:43:10,160 Speaker 1: asking about, like, human level intelligence, there's over two million 871 00:43:10,239 --> 00:43:14,560 Speaker 1: described species on the planet, but only one human, and 872 00:43:14,640 --> 00:43:17,680 Speaker 1: so apparently you can have a lot of organisms without 873 00:43:17,680 --> 00:43:22,000 Speaker 1: getting a, you know, rocket level smart species. You only 874 00:43:22,000 --> 00:43:24,120 Speaker 1: get that once out of two million. So what do 875 00:43:24,200 --> 00:43:26,520 Speaker 1: you think, Daniel, would T. rex have been coming up 876 00:43:26,520 --> 00:43:29,759 Speaker 1: with Shakespearean style prose if that asteroid missed Earth? 877 00:43:30,120 --> 00:43:32,200 Speaker 2: Well, first I want to quibble with your claim that 878 00:43:32,239 --> 00:43:35,440 Speaker 2: there's only one intelligent species. Okay, right, I think that's 879 00:43:35,480 --> 00:43:38,960 Speaker 2: a little bit definitional. Like, we think, probably fifty thousand 880 00:43:39,040 --> 00:43:42,880 Speaker 2: years ago, there were multiple intelligent species. We just, like, killed, raped, 881 00:43:43,040 --> 00:43:46,120 Speaker 2: or ate the other ones, right? And so in some sense, 882 00:43:46,160 --> 00:43:48,880 Speaker 2: when you get smart enough, you're likely, I think, to 883 00:43:49,000 --> 00:43:51,759 Speaker 2: just kill off all the other intelligent competition until 884 00:43:51,800 --> 00:43:53,799 Speaker 2: it becomes part of your in group. So you might 885 00:43:53,840 --> 00:43:56,719 Speaker 2: inevitably just get to one. But it seems like the 886 00:43:56,760 --> 00:43:59,400 Speaker 2: evidence suggests that there were multiple strands at least of 887 00:43:59,480 --> 00:44:01,520 Speaker 2: human-like intelligence on Earth, weren't there? 888 00:44:02,000 --> 00:44:04,319 Speaker 1: So usually when a listener asks the question, they mean, 889 00:44:04,440 --> 00:44:07,280 Speaker 1: like, humans as smart as us, you know, creating 890 00:44:07,360 --> 00:44:11,719 Speaker 1: rockets, nine dimensional math. I don't know that even our 891 00:44:11,800 --> 00:44:14,960 Speaker 1: ancestors were that smart, you know, one hundred thousand years ago, 892 00:44:15,280 --> 00:44:18,040 Speaker 1: and so I'm not quite sure that there were, you know, 893 00:44:18,160 --> 00:44:20,520 Speaker 1: songs being sung by other species of humans. We 894 00:44:20,600 --> 00:44:22,960 Speaker 1: might never know, but you're right, they probably were very 895 00:44:23,000 --> 00:44:25,960 Speaker 1: smart relative to other organisms on the planet. 896 00:44:26,040 --> 00:44:28,560 Speaker 2: Yeah, you're right that we don't know biologically whether they 897 00:44:28,600 --> 00:44:32,879 Speaker 2: really had the same capacity, Neanderthals, for example, or 898 00:44:33,280 --> 00:44:35,880 Speaker 2: other variations. It's hard to know.
And maybe they had 899 00:44:35,920 --> 00:44:38,879 Speaker 2: a different kind of smartness, right? Maybe they would be like, hey, 900 00:44:38,880 --> 00:44:41,399 Speaker 2: eleven dimensional string theory, that's the wrong way to go, man, 901 00:44:41,480 --> 00:44:43,080 Speaker 2: you should be doing loop quantum gravity. 902 00:44:43,200 --> 00:44:44,319 Speaker 1: That's right, that's right. 903 00:44:46,080 --> 00:44:48,960 Speaker 2: But yeah, the really deep question is like, is intelligence 904 00:44:49,000 --> 00:44:51,840 Speaker 2: inevitable in some creature? Are you likely to have it 905 00:44:51,920 --> 00:44:54,880 Speaker 2: no matter what happens? If the asteroid hadn't hit, or 906 00:44:54,960 --> 00:44:56,960 Speaker 2: something else had gone differently on Earth, would you have 907 00:44:57,000 --> 00:45:02,440 Speaker 2: superintelligent spiders or whales dominating the planet, or reptiles or something? 908 00:45:02,840 --> 00:45:04,520 Speaker 2: And boy, I wish I knew the answer to that, because 909 00:45:04,560 --> 00:45:06,840 Speaker 2: that would tell us a lot about the nature of 910 00:45:06,840 --> 00:45:10,520 Speaker 2: intelligence out there in the universe. Right, whether it's likely, 911 00:45:10,560 --> 00:45:13,200 Speaker 2: when we land on alien planets, to just find a 912 00:45:13,239 --> 00:45:16,240 Speaker 2: bunch of fungi, or whether something's going to be smart 913 00:45:16,280 --> 00:45:18,200 Speaker 2: on those planets. And I think to answer that we 914 00:45:18,239 --> 00:45:21,760 Speaker 2: really need a better understanding of how intelligence arose on Earth. 915 00:45:21,800 --> 00:45:24,360 Speaker 2: And as you could tell from this whole episode, we 916 00:45:24,440 --> 00:45:25,800 Speaker 2: don't know the answer to that either. 917 00:45:26,400 --> 00:45:28,960 Speaker 1: That's true. And you mentioned super intelligent spiders. Have you 918 00:45:29,000 --> 00:45:30,360 Speaker 1: read the Children of Time series? 919 00:45:30,920 --> 00:45:34,160 Speaker 2: Of course I have. Yes, I love that book. Yeah, fantastic. 920 00:45:34,560 --> 00:45:37,560 Speaker 2: We had Adrian Tchaikovsky on the Daniel and Jorge podcast, 921 00:45:37,840 --> 00:45:39,879 Speaker 2: really fun conversations. You guys should check that out. 922 00:45:40,120 --> 00:45:44,080 Speaker 1: Awesome. So, as you've pointed out, the other theories that 923 00:45:44,080 --> 00:45:47,319 Speaker 1: we've talked about are not exactly ironclad. There is 924 00:45:47,600 --> 00:45:50,680 Speaker 1: one additional hypothesis for why you get intelligence that we 925 00:45:50,719 --> 00:45:53,240 Speaker 1: haven't talked about yet, which I saved for now because 926 00:45:53,239 --> 00:45:57,399 Speaker 1: it's possibly the one that's most relevant to space. It's 927 00:45:57,400 --> 00:45:59,080 Speaker 1: a bit of a stretch, but you know, I try 928 00:45:59,120 --> 00:46:02,120 Speaker 1: to have an answer for every question, no matter how 929 00:46:02,160 --> 00:46:04,400 Speaker 1: confused I am. So all right, here's my best shot. 930 00:46:05,239 --> 00:46:11,200 Speaker 1: So this hypothesis is called the cognitive buffer hypothesis, and 931 00:46:11,239 --> 00:46:16,280 Speaker 1: the idea is that how variable your environment is determines 932 00:46:16,320 --> 00:46:18,680 Speaker 1: whether or not you need to be intelligent to survive 933 00:46:18,719 --> 00:46:22,680 Speaker 1: in it.
So maybe more variable planets would be more 934 00:46:22,760 --> 00:46:25,279 Speaker 1: likely to produce intelligent aliens. But let's dig in 935 00:46:25,280 --> 00:46:27,080 Speaker 1: for a second, and we'll revisit that idea at the end. 936 00:46:27,480 --> 00:46:31,360 Speaker 1: So one of the ideas is that seasonality is important. 937 00:46:31,800 --> 00:46:34,480 Speaker 1: So if you live in an area where sometimes it's 938 00:46:34,560 --> 00:46:37,839 Speaker 1: really hot and sometimes it's really cold, maybe you need 939 00:46:37,880 --> 00:46:41,480 Speaker 1: to be smart to survive that transition between temperatures. So, 940 00:46:41,520 --> 00:46:46,560 Speaker 1: for example, birds have two pretty smart solutions to 941 00:46:46,640 --> 00:46:50,279 Speaker 1: this problem. Some of them migrate and have amazing long 942 00:46:50,400 --> 00:46:54,040 Speaker 1: term spatial memory relative to non migratory ones, so they 943 00:46:54,120 --> 00:46:57,360 Speaker 1: essentially figure out how to leave when it gets bad 944 00:46:57,440 --> 00:46:59,760 Speaker 1: and then how to come back later, and that requires 945 00:47:00,120 --> 00:47:04,040 Speaker 1: some incredible brain abilities. And then there are other birds 946 00:47:04,080 --> 00:47:07,560 Speaker 1: that don't leave, but they have amazing memories for where 947 00:47:07,600 --> 00:47:11,160 Speaker 1: they've cached food, so even when food is rare, they've 948 00:47:11,280 --> 00:47:13,600 Speaker 1: hidden it when times are good, and they remember where 949 00:47:13,640 --> 00:47:15,439 Speaker 1: they can go to find it when times are bad. 950 00:47:15,960 --> 00:47:17,960 Speaker 1: And so the idea here is that living in a 951 00:47:18,000 --> 00:47:22,000 Speaker 1: super variable environment requires you to become intelligent so that 952 00:47:22,040 --> 00:47:24,080 Speaker 1: you can survive. There was a study looking at twelve 953 00:47:24,120 --> 00:47:27,400 Speaker 1: hundred bird species that found that you get bigger brains 954 00:47:27,640 --> 00:47:30,360 Speaker 1: in birds that live in environments that are more variable, 955 00:47:30,719 --> 00:47:32,920 Speaker 1: suggesting that if things change more, you've got to 956 00:47:32,920 --> 00:47:34,920 Speaker 1: be smarter to make it. What do you think, Daniel, 957 00:47:34,960 --> 00:47:35,840 Speaker 1: do you buy it? 958 00:47:35,840 --> 00:47:38,000 Speaker 2: It makes some sense. It connects to the idea we 959 00:47:38,000 --> 00:47:41,960 Speaker 2: were talking about earlier, that intelligence is about generalization, like 960 00:47:42,040 --> 00:47:44,720 Speaker 2: being able to solve new problems you hadn't seen before. 961 00:47:45,120 --> 00:47:48,360 Speaker 2: So if your environment is changing around you, ice ages 962 00:47:48,400 --> 00:47:52,160 Speaker 2: are ending, climate is changing, something, you can't just rely 963 00:47:52,560 --> 00:47:55,040 Speaker 2: on the techniques handed down to you by ancestors. You 964 00:47:55,040 --> 00:47:58,280 Speaker 2: have to come up with new ideas to solve new problems, 965 00:47:58,320 --> 00:48:00,920 Speaker 2: and that requires intelligence. So it makes sense to me 966 00:48:01,080 --> 00:48:03,520 Speaker 2: that a climate with a lot of variation, or an 967 00:48:03,680 --> 00:48:06,720 Speaker 2: environment with a lot of variability to it, would require 968 00:48:06,760 --> 00:48:09,760 Speaker 2: more intelligence in order to survive.
On the other hand, 969 00:48:10,239 --> 00:48:13,240 Speaker 2: I don't know that I consider, like, geese that smart. 970 00:48:13,239 --> 00:48:15,400 Speaker 2: I mean, yes, they can do something amazing, which is 971 00:48:15,440 --> 00:48:18,799 Speaker 2: like navigate the planet with their magnetic sense in their eyeballs. 972 00:48:18,840 --> 00:48:21,239 Speaker 2: It's incredible. I couldn't do that. But I don't know 973 00:48:21,320 --> 00:48:24,440 Speaker 2: if that makes them smart. They seem like hyper optimized 974 00:48:24,560 --> 00:48:28,920 Speaker 2: to this one situation. Like if, for example, the planet 975 00:48:29,000 --> 00:48:31,920 Speaker 2: changed, the magnetic field flipped, or where they need to 976 00:48:31,960 --> 00:48:34,239 Speaker 2: migrate to changed, do you think geese would be able 977 00:48:34,280 --> 00:48:35,719 Speaker 2: to adapt to that situation? 978 00:48:36,239 --> 00:48:38,880 Speaker 1: So, the last time I was reading about birds and 979 00:48:38,920 --> 00:48:42,160 Speaker 1: their ability to sense the magnetic fields, my understanding is 980 00:48:42,160 --> 00:48:46,480 Speaker 1: that we don't actually understand that very well, and that 981 00:48:46,600 --> 00:48:49,640 Speaker 1: a lot of animals that we suspect rely on magnetic 982 00:48:49,719 --> 00:48:53,040 Speaker 1: fields also have other memory cues that they rely on 983 00:48:53,120 --> 00:48:54,120 Speaker 1: to get to where they're going. 984 00:48:54,320 --> 00:48:56,560 Speaker 2: Did you read that study about how whales might be 985 00:48:56,600 --> 00:49:00,120 Speaker 2: looking at the stars and navigating by the stars? Oh amazing, 986 00:49:00,239 --> 00:49:00,759 Speaker 2: Oh my god. 987 00:49:00,760 --> 00:49:02,640 Speaker 1: Okay, we should talk about that in a future show, 988 00:49:02,680 --> 00:49:04,040 Speaker 1: because that's amazing. I'd love to read that. 989 00:49:04,280 --> 00:49:04,480 Speaker 2: Yeah. 990 00:49:04,560 --> 00:49:07,399 Speaker 1: Yeah, I guess that goes back to the question, how 991 00:49:07,440 --> 00:49:09,160 Speaker 1: do you measure intelligence? And how do you make sure 992 00:49:09,160 --> 00:49:11,680 Speaker 1: that you're not measuring intelligence in a too human centric way? 993 00:49:11,760 --> 00:49:13,840 Speaker 1: Or maybe it doesn't matter, because maybe if you're specifically 994 00:49:13,840 --> 00:49:17,000 Speaker 1: interested in how you got human intelligence, then maybe 995 00:49:17,000 --> 00:49:20,600 Speaker 1: focusing on human intelligence is fine. But I get lost, 996 00:49:21,040 --> 00:49:23,680 Speaker 1: like, going to the grocery store sometimes. Like, I use 997 00:49:23,760 --> 00:49:25,960 Speaker 1: my GPS in the town that I've been living in 998 00:49:26,000 --> 00:49:28,000 Speaker 1: for half a decade all the time. 999 00:49:28,120 --> 00:49:31,000 Speaker 2: And so GPS, your goose positioning system. 1000 00:49:33,880 --> 00:49:36,719 Speaker 1: So I'm impressed by these long distance migrants no matter 1001 00:49:36,719 --> 00:49:38,719 Speaker 1: how they do it, because I would get lost. I'd 1002 00:49:38,800 --> 00:49:39,319 Speaker 1: be in a lake. 1003 00:49:39,680 --> 00:49:42,960 Speaker 2: Again, the same way, like, flies are impressive. They're very responsive. 1004 00:49:43,000 --> 00:49:45,239 Speaker 2: They're faster off the starting block than I am, or 1005 00:49:45,239 --> 00:49:47,680 Speaker 2: probably than Usain Bolt is. But I don't know that
I'd 1006 00:49:47,719 --> 00:49:50,080 Speaker 2: call it smart. But maybe you're right, maybe 1007 00:49:50,120 --> 00:49:53,120 Speaker 2: I'm being too humanistic there. I'm looking for the kind 1008 00:49:53,120 --> 00:49:55,200 Speaker 2: of intelligence that I know, the same way, like, I 1009 00:49:55,239 --> 00:49:57,480 Speaker 2: see in my dog a lot of the kind of 1010 00:49:57,520 --> 00:49:59,799 Speaker 2: intelligence I see in people. And that's what impresses me: 1011 00:50:00,640 --> 00:50:03,399 Speaker 2: he can see patterns and respond to them and understand the world 1012 00:50:03,480 --> 00:50:05,440 Speaker 2: around him and adapt to it. 1013 00:50:05,560 --> 00:50:07,319 Speaker 1: Yeah, and appear to be able to read, like, the 1014 00:50:07,320 --> 00:50:11,480 Speaker 1: emotions of people around them. Yeah, agreed. You've got this 1015 00:50:11,560 --> 00:50:15,120 Speaker 1: seasonality component, which is about, like, you know, escaping extreme 1016 00:50:15,160 --> 00:50:19,640 Speaker 1: temperatures and stuff. Another component of this idea is resource availability. So, 1017 00:50:19,680 --> 00:50:23,000 Speaker 1: for example, if you are a monkey species that eats fruits, 1018 00:50:23,880 --> 00:50:26,600 Speaker 1: fruits kind of ripen at different times, so they're available 1019 00:50:26,640 --> 00:50:28,919 Speaker 1: sometimes, not at other times. You need to remember where 1020 00:50:28,920 --> 00:50:31,799 Speaker 1: the fruit trees are. And so there's some evidence that 1021 00:50:32,120 --> 00:50:35,279 Speaker 1: monkeys that rely on fruit are smarter than monkeys that 1022 00:50:35,320 --> 00:50:38,200 Speaker 1: can eat leaves, because leaves are just always available, they're 1023 00:50:38,239 --> 00:50:41,239 Speaker 1: easy to find, they're just much more reliable. So the 1024 00:50:41,280 --> 00:50:44,799 Speaker 1: idea is that if seasons are changing your environment, or 1025 00:50:45,040 --> 00:50:47,560 Speaker 1: resources that you need are sort of popping in and 1026 00:50:47,600 --> 00:50:52,239 Speaker 1: out of existence, either of those things are important for intelligence. 1027 00:50:52,800 --> 00:50:55,359 Speaker 1: But you know, when you read about any of these hypotheses, 1028 00:50:55,440 --> 00:50:58,640 Speaker 1: like the recent papers that I was reading, everybody's got, 1029 00:50:58,680 --> 00:51:02,960 Speaker 1: like, their favorite hypothesis and points out limitations of the other hypotheses, 1030 00:51:03,440 --> 00:51:04,960 Speaker 1: and at the end of the day, it could be 1031 00:51:05,000 --> 00:51:07,000 Speaker 1: some combination of things. You know, it could be if 1032 00:51:07,040 --> 00:51:10,239 Speaker 1: you're in a hyper variable environment and you've got a 1033 00:51:10,239 --> 00:51:14,080 Speaker 1: big social group, that's when you get the super smart individuals. 1034 00:51:14,160 --> 00:51:16,479 Speaker 1: Or maybe it's, you know, the three things we've talked 1035 00:51:16,480 --> 00:51:19,200 Speaker 1: about today, those are all important, plus a fourth thing 1036 00:51:19,280 --> 00:51:21,680 Speaker 1: that nobody's thought about yet, and so at the end 1037 00:51:21,680 --> 00:51:23,360 Speaker 1: of the day, we don't really have the answer.
1038 00:51:23,560 --> 00:51:25,799 Speaker 2: So what I like to think about sometimes is, like, 1039 00:51:25,920 --> 00:51:28,960 Speaker 2: fantasy data. Like, if you had infinite resources, and you've 1040 00:51:29,000 --> 00:51:31,400 Speaker 2: asked me these kinds of questions before, you could go 1041 00:51:31,440 --> 00:51:33,880 Speaker 2: all the way around the galaxy, whatever. Is there a 1042 00:51:33,880 --> 00:51:37,600 Speaker 2: way we could answer this question somewhat definitively with infinite 1043 00:51:37,800 --> 00:51:41,719 Speaker 2: scientific powers? You know, do you think actually seeing aliens 1044 00:51:41,800 --> 00:51:45,240 Speaker 2: and surveying intelligence across the galaxy would answer this question, 1045 00:51:45,640 --> 00:51:47,320 Speaker 2: or would it just make it murky again? 1046 00:51:48,000 --> 00:51:52,200 Speaker 1: It wouldn't surprise me that on a different planet there could 1047 00:51:52,280 --> 00:51:55,759 Speaker 1: be some different thing that selects for intelligence relative to 1048 00:51:55,760 --> 00:51:57,960 Speaker 1: what we find here on Earth. You'd think that if 1049 00:51:57,960 --> 00:52:01,839 Speaker 1: you could find enough alien civilizations that had enough, like, 1050 00:52:02,239 --> 00:52:06,520 Speaker 1: equivalents of New Caledonian crows and Homo sapiens, and, you know, 1051 00:52:06,640 --> 00:52:10,040 Speaker 1: enough different kinds of intelligence and enough variability in intelligence, 1052 00:52:10,440 --> 00:52:13,759 Speaker 1: you'd be able to find some patterns. But in terms 1053 00:52:13,760 --> 00:52:16,719 Speaker 1: of what you could do here on Earth, I think 1054 00:52:16,760 --> 00:52:18,799 Speaker 1: we're doing what you can do here on Earth. Like, 1055 00:52:18,840 --> 00:52:20,960 Speaker 1: I think we're doing the best we can right now. 1056 00:52:21,320 --> 00:52:23,480 Speaker 2: No, I think the alien question is the fascinating one, 1057 00:52:23,520 --> 00:52:25,799 Speaker 2: and I think a lot of people feel like, of 1058 00:52:25,840 --> 00:52:28,480 Speaker 2: course, if we could visit those planets and 1059 00:52:28,520 --> 00:52:30,839 Speaker 2: see them, then we would know, and you know, we'd 1060 00:52:30,880 --> 00:52:33,399 Speaker 2: get some answers. Like, if we saw there's alien life, 1061 00:52:33,440 --> 00:52:35,880 Speaker 2: we could be like, okay, very clearly life is not 1062 00:52:35,920 --> 00:52:38,400 Speaker 2: that rare. But for intelligence, it's so much harder to 1063 00:52:38,440 --> 00:52:42,800 Speaker 2: define. Unless we find alien life and it's intelligent 1064 00:52:42,840 --> 00:52:45,560 Speaker 2: in exactly the same way we are, which would tell 1065 00:52:45,600 --> 00:52:48,680 Speaker 2: us, like, okay, this kind of intelligence is weirdly inevitable, 1066 00:52:48,920 --> 00:52:51,280 Speaker 2: I think instead we're going to find, like, a huge variety 1067 00:52:51,320 --> 00:52:53,920 Speaker 2: of different kinds of intelligences, and we're going to get 1068 00:52:53,920 --> 00:52:56,400 Speaker 2: stuck in the same question, just now on a bigger scale. 1069 00:52:56,560 --> 00:52:59,480 Speaker 2: And it might be that this question doesn't really have an answer. 1070 00:53:01,080 --> 00:53:05,680 Speaker 1: Yeah, I mean it could be that there's no one answer.
1071 00:53:05,840 --> 00:53:08,239 Speaker 1: I do feel like if you had thirty planets and 1072 00:53:08,280 --> 00:53:10,879 Speaker 1: intelligence arose on all of them, you'd probably be able 1073 00:53:10,880 --> 00:53:15,239 Speaker 1: to say, like, okay, in every case where there's seasons, 1074 00:53:15,920 --> 00:53:19,400 Speaker 1: you find some animals that are intelligent in this way. 1075 00:53:19,480 --> 00:53:22,120 Speaker 1: Maybe it's like they're better at remembering where the good 1076 00:53:22,120 --> 00:53:25,319 Speaker 1: food was or something. I imagine that we'd find some 1077 00:53:25,719 --> 00:53:30,319 Speaker 1: common threads. But would we ever answer all of our questions? No, 1078 00:53:30,360 --> 00:53:33,040 Speaker 1: probably not. Probably we'd get a lot more questions from 1079 00:53:33,040 --> 00:53:35,439 Speaker 1: those twenty or thirty planets, but that would be exciting too. 1080 00:53:36,520 --> 00:53:38,799 Speaker 2: I mean, we can't even agree on this podcast about 1081 00:53:38,800 --> 00:53:42,680 Speaker 2: whether flies are intelligent. So I feel like what we're 1082 00:53:42,719 --> 00:53:46,400 Speaker 2: really asking is, when we find aliens, are they going 1083 00:53:46,440 --> 00:53:48,680 Speaker 2: to have culture the way we have? Are they going 1084 00:53:48,760 --> 00:53:51,640 Speaker 2: to develop technology? Are they going to be scientific? Are 1085 00:53:51,640 --> 00:53:53,279 Speaker 2: they going to be exploring the universe? Are they going 1086 00:53:53,360 --> 00:53:55,560 Speaker 2: to be like us? So in the end, I think, 1087 00:53:55,719 --> 00:53:59,080 Speaker 2: really what these questions, deep down, are asking is, is 1088 00:53:59,120 --> 00:54:01,160 Speaker 2: our experience universal? So I think it really is a 1089 00:54:01,239 --> 00:54:03,600 Speaker 2: human like thing. We could find aliens that are really 1090 00:54:03,640 --> 00:54:06,000 Speaker 2: super duper smart, but so different from us that we 1091 00:54:06,000 --> 00:54:08,320 Speaker 2: don't call it intelligence. We call it something else, schm-intelligence 1092 00:54:08,320 --> 00:54:11,800 Speaker 2: or something. I think, really, this is probing about 1093 00:54:11,880 --> 00:54:12,879 Speaker 2: human intelligence. 1094 00:54:13,160 --> 00:54:15,040 Speaker 1: Yeah. So first I want to reiterate that I do 1095 00:54:15,080 --> 00:54:16,839 Speaker 1: feel like you should be in charge of naming new 1096 00:54:16,880 --> 00:54:20,840 Speaker 1: things in just about any field. Schma in front of it, 1097 00:54:21,880 --> 00:54:24,200 Speaker 1: and I think that's great. Actually my first blog was 1098 00:54:24,239 --> 00:54:27,920 Speaker 1: Fungilius schmundolous, because I think schma is a great thing 1099 00:54:27,920 --> 00:54:29,279 Speaker 1: to put in front of other words. But you know, 1100 00:54:29,320 --> 00:54:31,640 Speaker 1: we didn't even talk about all of the hypotheses today. 1101 00:54:31,640 --> 00:54:34,680 Speaker 1: So you mentioned culture. There's a cultural intelligence hypothesis that 1102 00:54:34,719 --> 00:54:38,520 Speaker 1: focuses in particular on learning, and like learning in a 1103 00:54:38,560 --> 00:54:40,920 Speaker 1: cultural context, and how that might be really important for 1104 00:54:40,960 --> 00:54:43,439 Speaker 1: things like humans. And so I agree with what you said.
1105 00:54:43,480 --> 00:54:45,759 Speaker 1: There are even more hypotheses we didn't get a chance to 1106 00:54:45,760 --> 00:54:48,480 Speaker 1: talk about today. So let's take a step back and 1107 00:54:48,520 --> 00:54:51,520 Speaker 1: see if we've answered all of the questions that we 1108 00:54:51,520 --> 00:54:54,480 Speaker 1: were asked today. And so, you know, Tom wanted to 1109 00:54:54,520 --> 00:54:58,799 Speaker 1: know what makes rocket level intelligence, and I threw up 1110 00:54:58,840 --> 00:55:03,160 Speaker 1: my arms and shrugged, I don't know. But Tom wanted 1111 00:55:03,200 --> 00:55:06,120 Speaker 1: to know if there's things that select for intelligence. And 1112 00:55:06,160 --> 00:55:08,400 Speaker 1: we've talked about a couple things that we think maybe 1113 00:55:08,400 --> 00:55:11,960 Speaker 1: select for intelligence. Yeah, and he pointed out that intelligence 1114 00:55:12,160 --> 00:55:14,279 Speaker 1: might yet be unlucky for us, and you know, maybe 1115 00:55:14,280 --> 00:55:17,440 Speaker 1: we will all blow up because of our nuclear weapons. 1116 00:55:17,520 --> 00:55:17,960 Speaker 1: I hope not. 1117 00:55:19,239 --> 00:55:20,400 Speaker 2: I don't know why I'm laughing at that. 1118 00:55:20,760 --> 00:55:23,479 Speaker 1: Yeah, well, sometimes you laugh or you cry. Laughing is better. 1119 00:55:23,800 --> 00:55:26,040 Speaker 1: And then we addressed Mark's question about whether or not 1120 00:55:26,080 --> 00:55:29,799 Speaker 1: predation is necessary. Some people think it's important, but you know, 1121 00:55:29,800 --> 00:55:31,839 Speaker 1: we don't know if it's the only thing. Then we 1122 00:55:31,840 --> 00:55:35,799 Speaker 1: looked at Howard's question about whether or not intelligence is inevitable, 1123 00:55:35,840 --> 00:55:38,760 Speaker 1: and I yet again threw up my arms and shrugged. 1124 00:55:39,040 --> 00:55:40,560 Speaker 1: But you know, we had fun imagining it. 1125 00:55:40,840 --> 00:55:43,200 Speaker 2: And I think that these are really fascinating questions. I 1126 00:55:43,280 --> 00:55:46,480 Speaker 2: love that folks ask them, even if we don't have answers. 1127 00:55:46,520 --> 00:55:48,920 Speaker 2: I think it's exciting to think about what efforts people 1128 00:55:48,960 --> 00:55:52,280 Speaker 2: are making in tackling these really big, really deep questions, 1129 00:55:52,719 --> 00:55:55,040 Speaker 2: because, you know, those first steps are the hardest ones. 1130 00:55:55,440 --> 00:55:57,800 Speaker 2: Like, when you look back at what the Greeks did or 1131 00:55:57,840 --> 00:56:00,840 Speaker 2: what ancient civilizations have done to take the first steps 1132 00:56:00,840 --> 00:56:04,120 Speaker 2: towards science, they seem obvious in hindsight maybe, but it's 1133 00:56:04,239 --> 00:56:07,160 Speaker 2: difficult to take those first steps when you're up against 1134 00:56:07,200 --> 00:56:10,400 Speaker 2: the abyss of our ignorance about these deep questions. 1135 00:56:10,560 --> 00:56:11,960 Speaker 1: Yeah, and at the end of the day, I think 1136 00:56:11,960 --> 00:56:14,880 Speaker 1: it's amazing that there is a species on our planet 1137 00:56:14,960 --> 00:56:17,800 Speaker 1: that's asking these questions and is trying to muddle through. 1138 00:56:17,920 --> 00:56:21,440 Speaker 1: And you know, I, when I looked at the neuroanatomy textbook, 1139 00:56:21,480 --> 00:56:24,080 Speaker 1: threw up my hands and said, I'm studying something else.
1140 00:56:24,480 --> 00:56:27,319 Speaker 1: Thank goodness there are people who are like, I'm going 1141 00:56:27,360 --> 00:56:30,040 Speaker 1: to spend my whole life banging my head against this 1142 00:56:30,160 --> 00:56:32,520 Speaker 1: problem and make a little bit of headway and move 1143 00:56:32,600 --> 00:56:35,520 Speaker 1: us all ahead. And so lately I feel like there's 1144 00:56:35,520 --> 00:56:37,440 Speaker 1: a lot of reasons to be down on our species. 1145 00:56:37,440 --> 00:56:39,480 Speaker 1: But it is pretty cool that we do this kind 1146 00:56:39,520 --> 00:56:41,480 Speaker 1: of stuff, and I find the science that we do 1147 00:56:41,560 --> 00:56:42,520 Speaker 1: super inspirational. 1148 00:56:42,719 --> 00:56:45,000 Speaker 2: Yes, although I think we can agree that the smartest 1149 00:56:45,000 --> 00:56:46,960 Speaker 2: people on the planet are the folks who listen to 1150 00:56:46,960 --> 00:56:47,600 Speaker 2: this podcast. 1151 00:56:47,680 --> 00:56:50,680 Speaker 1: Right, agreed, and that is the perfect note to end on. 1152 00:56:52,960 --> 00:56:55,319 Speaker 2: And so before we sign off, we were curious: did 1153 00:56:55,360 --> 00:56:58,399 Speaker 2: we answer Tom and Mark and Howard's questions? We sent 1154 00:56:58,560 --> 00:57:01,360 Speaker 2: this to them, and here's what they had to say. 1155 00:57:02,239 --> 00:57:05,759 Speaker 6: Thanks Daniel and Kelly for answering this question. You know, 1156 00:57:05,760 --> 00:57:07,560 Speaker 6: one of the reasons I love your podcast so much 1157 00:57:07,640 --> 00:57:12,400 Speaker 6: is because no matter how complete your answers are, you 1158 00:57:12,440 --> 00:57:16,440 Speaker 6: always offer a more beautiful question. And I 1159 00:57:16,440 --> 00:57:19,400 Speaker 6: guess one of the reasons I was focused on rockets 1160 00:57:19,520 --> 00:57:24,520 Speaker 6: is because aliens are more likely to kind of discover 1161 00:57:24,600 --> 00:57:27,880 Speaker 6: us through their technology than we are to discover aliens first, and I imagine 1162 00:57:27,880 --> 00:57:33,640 Speaker 6: that rockets are pretty important in getting off planet. I'm 1163 00:57:33,640 --> 00:57:38,040 Speaker 6: not aware of any other way to do that. But yeah, 1164 00:57:38,080 --> 00:57:42,920 Speaker 6: I mean, I'm very satisfied with your discussion, and I 1165 00:57:43,080 --> 00:57:48,120 Speaker 6: look forward to finding out what this field is discovering 1166 00:57:48,120 --> 00:57:54,240 Speaker 6: about intelligence and different kinds of intelligences and what we 1167 00:57:54,360 --> 00:57:57,400 Speaker 6: might expect to find out there in the universe. 1168 00:57:58,160 --> 00:58:02,520 Speaker 3: Thank you. Hi, Daniel and Kelly. Yes, you answered my question. 1169 00:58:03,360 --> 00:58:06,040 Speaker 3: And while I was hoping the answer would be that 1170 00:58:06,080 --> 00:58:09,560 Speaker 3: there could be some world out there where all the 1171 00:58:09,640 --> 00:58:15,320 Speaker 3: creatures hold hands or tentacles or whatever and sing Kumbaya 1172 00:58:15,320 --> 00:58:18,120 Speaker 3: and are cool to each other, it sounds like the 1173 00:58:18,160 --> 00:58:22,760 Speaker 3: real answer is, wherever there's life, whether it's intelligent or not, 1174 00:58:23,600 --> 00:58:27,000 Speaker 3: something is going to, sooner or later, get the idea to 1175 00:58:27,000 --> 00:58:31,200 Speaker 3: eat something else.
So thanks for taking my question and 1176 00:58:31,200 --> 00:58:33,919 Speaker 3: thanks for the book suggestion. I will definitely check it out, 1177 00:58:34,120 --> 00:58:36,720 Speaker 3: and of course thanks for the wonderful show. 1178 00:58:37,720 --> 00:58:40,560 Speaker 5: Thank you Daniel and Kelly. This is Howard and I 1179 00:58:40,600 --> 00:58:42,960 Speaker 5: want to thank you for taking on my question. I 1180 00:58:43,000 --> 00:58:46,200 Speaker 5: know it can be challenging when the question has no 1181 00:58:46,360 --> 00:58:50,160 Speaker 5: definitive answer, but I kind of knew that going in. 1182 00:58:50,240 --> 00:58:55,080 Speaker 5: But what I really appreciated was listening to your thought processes, 1183 00:58:55,160 --> 00:58:57,480 Speaker 5: how you break it down, how you would approach it, 1184 00:58:57,560 --> 00:59:02,640 Speaker 5: what similarities are there that might apply 1185 00:59:02,520 --> 00:59:07,200 Speaker 5: to it. And that was invigorating. Thank you so much. I appreciate it. 1186 00:59:07,280 --> 00:59:12,880 Speaker 2: A quick non-science announcement. May tenth, 1187 00:59:13,000 --> 00:59:16,800 Speaker 2: twenty twenty five is the Stamp Out Hunger Drive, organized 1188 00:59:16,840 --> 00:59:19,960 Speaker 2: in tandem with the US Postal Service. It's very easy 1189 00:59:20,000 --> 00:59:22,600 Speaker 2: to help fill up your local food banks. Just leave 1190 00:59:22,800 --> 00:59:26,400 Speaker 2: non-perishable food at your mailbox and your carrier will 1191 00:59:26,440 --> 00:59:28,560 Speaker 2: collect it for you. That's all you have to do. 1192 00:59:29,000 --> 00:59:32,280 Speaker 2: Leave non-perishable foods by your mailbox on May tenth, 1193 00:59:32,360 --> 00:59:34,760 Speaker 2: twenty twenty five, to help stamp out hunger. 1194 00:59:41,480 --> 00:59:45,320 Speaker 1: Daniel and Kelly's Extraordinary Universe is produced by iHeartRadio. We 1195 00:59:45,360 --> 00:59:47,800 Speaker 1: would love to hear from you. We really would. 1196 00:59:47,960 --> 00:59:50,680 Speaker 2: We want to know what questions you have about this 1197 00:59:50,880 --> 00:59:52,560 Speaker 2: Extraordinary Universe. 1198 00:59:52,760 --> 00:59:55,720 Speaker 1: We want to know your thoughts on recent shows, suggestions for 1199 00:59:55,800 --> 00:59:58,880 Speaker 1: future shows. If you contact us, we will get back 1200 00:59:58,920 --> 00:59:59,240 Speaker 1: to you. 1201 00:59:59,320 --> 01:00:03,040 Speaker 2: We really mean it: we answer every message. Email us at 1202 01:00:03,120 --> 01:00:05,920 Speaker 2: questions at danielandkelly 1203 01:00:05,160 --> 01:00:07,200 Speaker 1: dot org, or you can find us on social media. 1204 01:00:07,280 --> 01:00:11,080 Speaker 1: We have accounts on X, Instagram, and Bluesky, and on 1205 01:00:11,160 --> 01:00:13,120 Speaker 1: all of those platforms you can find us at D 1206 01:00:13,560 --> 01:00:15,040 Speaker 1: and K Universe. 1207 01:00:15,200 --> 01:00:16,720 Speaker 2: Don't be shy, write to us.