1 00:00:03,000 --> 00:00:04,960 Speaker 1: Welcome to Stuff to Blow Your Mind, a production of I 2 00:00:05,000 --> 00:00:13,880 Speaker 1: Heart Radio's How Stuff Works. Hey, you, welcome to Stuff 3 00:00:13,920 --> 00:00:16,120 Speaker 1: to Blow Your Mind. My name is Robert Lamb and 4 00:00:16,160 --> 00:00:20,000 Speaker 1: I'm Joe McCormick. Joe, what kind of image enters 5 00:00:20,040 --> 00:00:24,520 Speaker 1: your head when I say the words brain soup? Brain soup? 6 00:00:24,640 --> 00:00:26,400 Speaker 1: This is a Green Day song, isn't it? Is it? 7 00:00:26,960 --> 00:00:30,880 Speaker 1: I'm having trouble making soup? Is it? I don't know. 8 00:00:30,920 --> 00:00:33,239 Speaker 1: I'm not, I'm not, I'm not a sound Dookie, right, 9 00:00:33,440 --> 00:00:36,720 Speaker 1: I only know, like, Green Day songs. I think I'm 10 00:00:36,720 --> 00:00:40,360 Speaker 1: getting this wrong. Uh, no. I also think of, there's 11 00:00:40,360 --> 00:00:43,159 Speaker 1: a horrible scene in Indiana Jones and the Temple of 12 00:00:43,200 --> 00:00:46,960 Speaker 1: Doom where they're eating all kinds of weird foods 13 00:00:47,040 --> 00:00:49,600 Speaker 1: that are parts of animals that you shouldn't eat, but 14 00:00:49,720 --> 00:00:52,839 Speaker 1: why not? But one of those parts is they eat 15 00:00:52,840 --> 00:00:54,960 Speaker 1: a monkey brain. I don't think that's supposed to be soup. 16 00:00:56,440 --> 00:00:58,600 Speaker 1: By the way, one of my favorite things on the 17 00:00:58,640 --> 00:01:04,000 Speaker 1: Internet, this is a total tangent, is, uh, franchise-specific 18 00:01:04,200 --> 00:01:10,080 Speaker 1: wiki entries for inanimate objects. Allow me to explain. So 19 00:01:10,160 --> 00:01:13,000 Speaker 1: there is, like, an Indiana Jones wiki that's just a 20 00:01:13,080 --> 00:01:15,720 Speaker 1: website out there.
It's got entries on it for all 21 00:01:15,800 --> 00:01:18,760 Speaker 1: the characters, minor characters, in Indiana Jones movies, all the 22 00:01:18,800 --> 00:01:21,119 Speaker 1: adventures he went on. But then there are also entries, 23 00:01:21,200 --> 00:01:24,760 Speaker 1: because it's a wiki, for just objects that appear in 24 00:01:24,840 --> 00:01:28,080 Speaker 1: the Indiana Jones movies. So this will include, like, individual 25 00:01:28,120 --> 00:01:31,880 Speaker 1: pages for each course of the meal in Indiana Jones 26 00:01:31,880 --> 00:01:33,560 Speaker 1: and the Temple of Doom. And one of them, the 27 00:01:33,600 --> 00:01:35,440 Speaker 1: one that's got all the snakes in it, is called 28 00:01:35,520 --> 00:01:39,479 Speaker 1: Coiled Wriggleys. So wait, is it just about 29 00:01:39,480 --> 00:01:42,160 Speaker 1: a fictional food, or is it attempting to draw lines 30 00:01:42,200 --> 00:01:47,000 Speaker 1: between these outrageous and, you know, like, shock-oriented foods 31 00:01:47,040 --> 00:01:49,640 Speaker 1: and some actual traditions? Because if they're not 32 00:01:49,680 --> 00:01:51,520 Speaker 1: doing it, that would actually be kind of a fun 33 00:01:51,560 --> 00:01:54,400 Speaker 1: thing to do, to sort of take the shock that 34 00:01:54,480 --> 00:01:58,000 Speaker 1: they're trying to utilize in that scene and say, 35 00:01:58,040 --> 00:02:01,400 Speaker 1: well, hold on. Actually, here's some actual culinary traditions that 36 00:02:01,440 --> 00:02:04,720 Speaker 1: are not really that shocking. Uh, no, it's not at 37 00:02:04,720 --> 00:02:07,960 Speaker 1: all that informative or mind-opening. Just allow 38 00:02:08,000 --> 00:02:10,480 Speaker 1: me to quote directly from the page for Coiled Wriggleys.
39 00:02:10,919 --> 00:02:13,600 Speaker 1: Coiled Wriggleys, also known as Snake Surprise, was a 40 00:02:13,639 --> 00:02:16,080 Speaker 1: dish served at the Guardian of Tradition dinner given at 41 00:02:16,120 --> 00:02:19,400 Speaker 1: Pankot Palace in nineteen thirty five. As the second course, 42 00:02:19,720 --> 00:02:23,120 Speaker 1: it was live baby eels stuffed inside a moist boa constrictor. 43 00:02:23,240 --> 00:02:25,240 Speaker 1: One of the guests at the dinner, a merchant, was 44 00:02:25,360 --> 00:02:28,360 Speaker 1: very pleased when the dish was served. Another guest enjoyed 45 00:02:28,360 --> 00:02:32,600 Speaker 1: the eels with great gusto. I have not seen that 46 00:02:32,639 --> 00:02:34,600 Speaker 1: film in a very long time. I really don't know 47 00:02:34,639 --> 00:02:38,280 Speaker 1: how it would hold up. Some elements certainly do not. 48 00:02:39,480 --> 00:02:41,000 Speaker 1: You know, I should point out that, like, you know, 49 00:02:41,160 --> 00:02:44,000 Speaker 1: talking about brain soup, actual brain soup does exist. You 50 00:02:44,040 --> 00:02:47,120 Speaker 1: do have, you know, various pork brain soups, etcetera. And 51 00:02:47,360 --> 00:02:49,560 Speaker 1: even though I'm a, I'm a pescatarian these days, 52 00:02:50,639 --> 00:02:53,360 Speaker 1: it's not that I wouldn't order pork brain soup at a restaurant. I 53 00:02:53,400 --> 00:02:56,080 Speaker 1: looked at some photos and it looks perfectly delicious and 54 00:02:56,120 --> 00:02:58,600 Speaker 1: not at all weird. Uh, so I just want to 55 00:02:58,680 --> 00:03:02,320 Speaker 1: drive that home regarding, uh, the consumption of brain 56 00:03:03,040 --> 00:03:07,000 Speaker 1: based soups before I bring up something that comes to 57 00:03:07,040 --> 00:03:09,000 Speaker 1: my mind when I hear brain soup, and 58 00:03:09,360 --> 00:03:11,600 Speaker 1: what came into my mind when I heard it recently.
59 00:03:11,960 --> 00:03:16,480 Speaker 1: I instantly think back to the nineteen eighty two film It Came 60 00:03:16,520 --> 00:03:19,959 Speaker 1: from Hollywood. Oh, yes. So this is a film, 61 00:03:20,000 --> 00:03:22,040 Speaker 1: I think you can watch most of it 62 00:03:22,040 --> 00:03:25,440 Speaker 1: on YouTube these days, but it has Dan Aykroyd, John Candy, 63 00:03:25,639 --> 00:03:29,520 Speaker 1: Cheech and Chong, Gilda Radner, that whole crew, and sort 64 00:03:29,520 --> 00:03:32,240 Speaker 1: of before Mystery Science Theater, there was this movie. It 65 00:03:32,360 --> 00:03:35,600 Speaker 1: was just a clip show of, like, bad sci-fi 66 00:03:35,760 --> 00:03:39,400 Speaker 1: films and stuff, with the hosts of the movie making 67 00:03:39,480 --> 00:03:42,200 Speaker 1: jokes about them. Yeah, and so a lot of trailer clips, 68 00:03:42,240 --> 00:03:44,320 Speaker 1: a lot of weird clips. And there's a whole section 69 00:03:44,360 --> 00:03:47,600 Speaker 1: that I loved as a kid about brains, 70 00:03:47,720 --> 00:03:51,720 Speaker 1: you know, all these weird fifties and sixties brain monster films, 71 00:03:51,720 --> 00:03:54,160 Speaker 1: which I think are probably expressing a certain amount 72 00:03:54,160 --> 00:03:57,720 Speaker 1: of anxiety regarding, like, post-war advances in neuroscience and 73 00:03:57,760 --> 00:04:01,360 Speaker 1: the sort of existential conundrums that are raised, you know, like, 74 00:04:01,400 --> 00:04:03,080 Speaker 1: am I just a brain? What does it mean to 75 00:04:03,120 --> 00:04:05,400 Speaker 1: be just a brain? Will a brain jump out of 76 00:04:05,440 --> 00:04:08,280 Speaker 1: someone's head and attack me with its tentacles?
There was 77 00:04:08,320 --> 00:04:12,000 Speaker 1: a lot of revolutionary neuroscience going on in the nineteen fifties, 78 00:04:12,560 --> 00:04:14,920 Speaker 1: and yeah, I can see exactly why people would be 79 00:04:14,960 --> 00:04:16,920 Speaker 1: concerned about this kind of thing, though some of them 80 00:04:16,920 --> 00:04:18,919 Speaker 1: are just more like, I think there's one called, like, 81 00:04:18,960 --> 00:04:23,200 Speaker 1: The Brain from Planet Arous or something, that's, yeah, just 82 00:04:23,279 --> 00:04:26,560 Speaker 1: giant brains swarming at people. So I don't know exactly 83 00:04:26,600 --> 00:04:29,919 Speaker 1: how much that plays on anxieties about neurology, but I 84 00:04:29,920 --> 00:04:31,839 Speaker 1: mean, what I really get a 85 00:04:31,880 --> 00:04:34,880 Speaker 1: sense of it from is when I see clips from Fiend Without 86 00:04:34,920 --> 00:04:38,719 Speaker 1: a Face, because there we have the fabulous stop-motion 87 00:04:39,160 --> 00:04:43,280 Speaker 1: brains and kind of, like, spinal column tentacles, yeah, all 88 00:04:43,320 --> 00:04:46,000 Speaker 1: over the furniture. We just sprayed for brains last week. 89 00:04:48,360 --> 00:04:51,760 Speaker 1: But anyway, this particular segment on brains was introduced by 90 00:04:51,839 --> 00:04:56,479 Speaker 1: Dan Aykroyd, uh, and he's in this scene, 91 00:04:56,480 --> 00:04:58,839 Speaker 1: it's like he's in a butcher's shop and he's playing 92 00:04:58,880 --> 00:05:01,960 Speaker 1: a maniac. Perhaps he's supposed to be, like, a mad 93 00:05:02,040 --> 00:05:06,320 Speaker 1: brain scientist, or a brain butcher, or, like, a brain 94 00:05:06,520 --> 00:05:09,480 Speaker 1: addict cannibal.
It's kind of hard to say, it's 95 00:05:09,480 --> 00:05:12,719 Speaker 1: not entirely clear, but it's silly and it's gross, 96 00:05:12,880 --> 00:05:15,279 Speaker 1: and I was tempted to play a clip 97 00:05:15,320 --> 00:05:18,840 Speaker 1: from it here, but it's just so over the top 98 00:05:19,400 --> 00:05:21,400 Speaker 1: that I don't think it would translate well 99 00:05:21,440 --> 00:05:25,920 Speaker 1: to an audio-only environment. You are correctly capturing here 100 00:05:26,040 --> 00:05:29,159 Speaker 1: the strangeness of the premise for the sketch in the movie. 101 00:05:29,480 --> 00:05:33,719 Speaker 1: I don't just blithely throw around the phrase cocaine fueled, 102 00:05:33,960 --> 00:05:36,960 Speaker 1: but there are some cases where you 103 00:05:37,040 --> 00:05:40,000 Speaker 1: get a suspicion. But also in that scene, I remember 104 00:05:40,120 --> 00:05:42,160 Speaker 1: Dan Aykroyd is doing a voice that sounds a lot 105 00:05:42,240 --> 00:05:44,800 Speaker 1: like a Monty Python voice, and he sounds like the 106 00:05:44,920 --> 00:05:48,200 Speaker 1: leader of the Knights Who Say Ni. Yes, he does. 107 00:05:48,279 --> 00:05:51,880 Speaker 1: It's that kind of, like, ridiculous cartoonish voice. Oh yeah, 108 00:05:52,040 --> 00:05:55,640 Speaker 1: the whole segment is fabulous. All right, but all 109 00:05:55,720 --> 00:05:58,640 Speaker 1: these sort of visceral reactions to the 110 00:05:58,720 --> 00:06:02,000 Speaker 1: idea of brain soup aside, we are not going to 111 00:06:02,040 --> 00:06:05,520 Speaker 1: be talking about eating brains today on the show. We're 112 00:06:05,520 --> 00:06:09,280 Speaker 1: going to talk about the very real practice, the scientific 113 00:06:09,279 --> 00:06:13,880 Speaker 1: practice, of making brain soup to better understand the neural 114 00:06:14,000 --> 00:06:19,520 Speaker 1: power of animal brains.
Well, I am very intrigued already. Now, Robert, 115 00:06:19,600 --> 00:06:22,679 Speaker 1: I know this subject is not coming out of nowhere. 116 00:06:22,760 --> 00:06:25,640 Speaker 1: This is inspired by something that you just saw when 117 00:06:25,720 --> 00:06:27,600 Speaker 1: you were at the World Science Festival in New York 118 00:06:27,600 --> 00:06:29,799 Speaker 1: City last week. That's right. Yeah, I just got back, 119 00:06:29,960 --> 00:06:32,280 Speaker 1: and, uh, yeah, this is held the last week 120 00:06:32,320 --> 00:06:34,640 Speaker 1: of May every year, and as usual I got to 121 00:06:34,680 --> 00:06:37,039 Speaker 1: hear some of the world's leading scientists and thinkers discuss 122 00:06:37,040 --> 00:06:39,359 Speaker 1: a number of exciting topics. And one of the talks 123 00:06:39,360 --> 00:06:43,719 Speaker 1: I attended was Rethinking Thinking: How Intelligent Are Other Animals? 124 00:06:44,040 --> 00:06:46,640 Speaker 1: This is always a great subject and something we've revisited 125 00:06:47,040 --> 00:06:49,280 Speaker 1: quite a few times on the podcast before. One time 126 00:06:49,320 --> 00:06:52,360 Speaker 1: we talked with Frans de Waal here on the podcast in 127 00:06:52,400 --> 00:06:55,240 Speaker 1: an interview about his book. His book, I remember, 128 00:06:55,279 --> 00:06:57,400 Speaker 1: had a kind of awkward title. It 129 00:06:57,480 --> 00:06:59,800 Speaker 1: was something like Are We Smart Enough to Know How 130 00:07:00,040 --> 00:07:02,800 Speaker 1: Smart Animals Are? Doesn't really roll off the tongue, but 131 00:07:02,839 --> 00:07:05,400 Speaker 1: it's actually a really good book. Uh, and this 132 00:07:05,480 --> 00:07:08,039 Speaker 1: has come up with reference to the intelligence of birds 133 00:07:08,160 --> 00:07:10,960 Speaker 1: in quite a few cases, actually. Yeah.
And so 134 00:07:11,000 --> 00:07:14,400 Speaker 1: this particular talk was loaded with interesting participants, but the 135 00:07:14,640 --> 00:07:17,920 Speaker 1: participant that got me thinking about brain soup was Suzana 136 00:07:18,120 --> 00:07:22,880 Speaker 1: Herculano-Houzel, PhD, a biologist and neuroscientist at Vanderbilt 137 00:07:22,960 --> 00:07:26,720 Speaker 1: University in Nashville, Tennessee, where she is an associate professor in 138 00:07:26,760 --> 00:07:30,960 Speaker 1: the departments of psychology and biological sciences, and her research 139 00:07:31,040 --> 00:07:34,920 Speaker 1: focuses on what different brains are made of. Now, Robert, 140 00:07:34,920 --> 00:07:37,560 Speaker 1: I envy you getting to see a panel that involved her, 141 00:07:37,560 --> 00:07:40,040 Speaker 1: because I've actually watched a TED talk that she did 142 00:07:40,440 --> 00:07:44,040 Speaker 1: back from, I think it was twenty. She's a fantastic 143 00:07:44,040 --> 00:07:47,120 Speaker 1: public speaker. So yeah, she seems like a really good 144 00:07:47,360 --> 00:07:50,320 Speaker 1: public science communicator. Yeah, like when I say, you know, 145 00:07:50,440 --> 00:07:53,320 Speaker 1: that you have a panel of leading scientists there 146 00:07:53,320 --> 00:07:56,720 Speaker 1: on the stage, like, don't imagine something stuffy, because 147 00:07:56,760 --> 00:08:00,000 Speaker 1: you often have some just really wonderful science communicators up 148 00:08:00,080 --> 00:08:03,000 Speaker 1: there that, you know, are experts in their field, but 149 00:08:03,040 --> 00:08:05,160 Speaker 1: are also able to talk about that in a way 150 00:08:05,640 --> 00:08:08,880 Speaker 1: that people at, you know, various levels of expertise can 151 00:08:09,040 --> 00:08:11,240 Speaker 1: jump in on. And that's certainly the case with 152 00:08:11,280 --> 00:08:14,600 Speaker 1: Suzana Herculano-Houzel.
Now, I do want to mention, 153 00:08:14,640 --> 00:08:17,880 Speaker 1: you're hearing this last name; sometimes you will hear 154 00:08:17,920 --> 00:08:22,560 Speaker 1: people pronounce it "Herculano-hose-el." Um, if you're 155 00:08:22,560 --> 00:08:24,560 Speaker 1: trying to look it up, it's H E R C 156 00:08:24,880 --> 00:08:27,560 Speaker 1: U L A N O dash H O U Z 157 00:08:27,720 --> 00:08:30,760 Speaker 1: E L. Trying to do the name justice here 158 00:08:30,760 --> 00:08:34,240 Speaker 1: with our bungling mouths. So, as she tells it, 159 00:08:34,280 --> 00:08:38,440 Speaker 1: Herculano-Houzel was working as a science educator in 160 00:08:38,760 --> 00:08:42,200 Speaker 1: Rio de Janeiro, Brazil, and, uh, she kept running 161 00:08:42,200 --> 00:08:45,240 Speaker 1: into an old myth that I'm sure many of you 162 00:08:45,320 --> 00:08:47,640 Speaker 1: have run into before as well. Now, this is not 163 00:08:47,760 --> 00:08:50,320 Speaker 1: how we use the word myth in the non 164 00:08:50,360 --> 00:08:56,360 Speaker 1: derogatory sense, as just a foundational story, maybe involving supernatural elements. Yeah, 165 00:08:56,400 --> 00:08:59,280 Speaker 1: this is the derogatory myth. And that myth is we 166 00:08:59,400 --> 00:09:03,120 Speaker 1: only use ten percent of our brain. And this has 167 00:09:03,160 --> 00:09:07,800 Speaker 1: been used, really overused, in sci-fi movies especially over 168 00:09:07,840 --> 00:09:12,200 Speaker 1: the years, as recently as twenty fourteen's Lucy and twenty eleven's 169 00:09:12,320 --> 00:09:15,920 Speaker 1: Limitless, um, which, by the way, decided to double it, 170 00:09:16,840 --> 00:09:19,240 Speaker 1: but still stuck to this idea that there's 171 00:09:19,240 --> 00:09:23,040 Speaker 1: only a small percentage of the human brain that is available. Uh.
172 00:09:23,040 --> 00:09:25,280 Speaker 1: And of course all these, you know, films, and 173 00:09:25,280 --> 00:09:27,840 Speaker 1: pretty much anytime you see this trotted out, it's 174 00:09:27,880 --> 00:09:30,120 Speaker 1: with the idea that there are ways to open 175 00:09:30,240 --> 00:09:33,480 Speaker 1: up the rest of the brain. There are, like, 176 00:09:33,600 --> 00:09:36,800 Speaker 1: these dormant brain portions that can be utilized if only 177 00:09:36,880 --> 00:09:38,480 Speaker 1: you have the key. And the key of course is 178 00:09:39,679 --> 00:09:41,560 Speaker 1: right, and it's supplied by, you know, some sort of 179 00:09:41,559 --> 00:09:43,560 Speaker 1: mad science if you're in a fiction. Or if you're 180 00:09:43,640 --> 00:09:46,800 Speaker 1: encountering it in some sort of self-help scenario, then 181 00:09:47,000 --> 00:09:48,920 Speaker 1: they're going to sell you the key. Or yeah, if 182 00:09:48,920 --> 00:09:52,840 Speaker 1: you're encountering it in, uh, say, nutritional supplement marketing. 183 00:09:52,840 --> 00:09:54,920 Speaker 1: I mean, this is a big thing out there. Yeah, 184 00:09:55,000 --> 00:09:57,360 Speaker 1: but we say it's a myth because, and we'll break 185 00:09:57,400 --> 00:09:59,880 Speaker 1: this down a little bit here, but basically, there's 186 00:10:00,000 --> 00:10:03,920 Speaker 1: nothing to these numbers. Now, we all use all of 187 00:10:03,920 --> 00:10:06,920 Speaker 1: our brains. Well, when you start to 188 00:10:07,040 --> 00:10:09,920 Speaker 1: look at the claim, it becomes less and less clear 189 00:10:09,920 --> 00:10:12,800 Speaker 1: what it even means. Does it mean we only use 190 00:10:12,840 --> 00:10:15,680 Speaker 1: ten percent of our brain at a time? Do we 191 00:10:15,720 --> 00:10:18,920 Speaker 1: only use ten percent of it ever in our whole lives?
192 00:10:18,960 --> 00:10:22,200 Speaker 1: Like, what even counts as, quote, using the brain in 193 00:10:22,240 --> 00:10:24,640 Speaker 1: the first place? Yeah, I feel like we've talked 194 00:10:25,960 --> 00:10:28,000 Speaker 1: about neuroscience on the show enough, you know, talking 195 00:10:28,040 --> 00:10:30,960 Speaker 1: about various fMRI studies and, like, what 196 00:10:31,080 --> 00:10:32,760 Speaker 1: parts of the brain are lighting up, what parts of 197 00:10:32,800 --> 00:10:35,080 Speaker 1: the brain and what networks of the brain are associated 198 00:10:35,120 --> 00:10:40,000 Speaker 1: with different thoughts and behaviors, etcetera, that it's clear 199 00:10:40,040 --> 00:10:42,720 Speaker 1: that there is definitely a lot of stuff 200 00:10:42,760 --> 00:10:45,400 Speaker 1: in play, and then virtually, really, everything is 201 00:10:45,480 --> 00:10:48,640 Speaker 1: in play. Um, I guess the confusion could 202 00:10:48,679 --> 00:10:50,800 Speaker 1: be if you're looking at various images of 203 00:10:50,800 --> 00:10:52,160 Speaker 1: fMRIs and you're thinking, oh, well, that looks about 204 00:10:52,200 --> 00:10:54,720 Speaker 1: like ten percent. That looks about like ten percent. That's 205 00:10:54,720 --> 00:10:57,040 Speaker 1: a really good point. When people talk about 206 00:10:57,160 --> 00:11:01,240 Speaker 1: fMRI studies, sometimes they will loosely and casually try 207 00:11:01,280 --> 00:11:04,040 Speaker 1: to explain the findings of an fMRI study by saying, hey, 208 00:11:04,120 --> 00:11:06,720 Speaker 1: when somebody was doing this task with their brain, you know, 209 00:11:06,720 --> 00:11:08,760 Speaker 1: they were trying to tie a knot, or they were, 210 00:11:09,000 --> 00:11:12,160 Speaker 1: you know, whittling a horsey out of soap.
211 00:11:12,760 --> 00:11:15,319 Speaker 1: When they're doing this task, this part of their brain 212 00:11:15,480 --> 00:11:18,240 Speaker 1: lights up. That's the phrase, "this lights up," which implies 213 00:11:18,280 --> 00:11:21,160 Speaker 1: that the entire rest of the brain is dark until 214 00:11:21,280 --> 00:11:23,680 Speaker 1: it just lights up, as if it's, like, totally dormant, 215 00:11:23,920 --> 00:11:26,920 Speaker 1: and that's not true. I mean, brain imaging studies, like, 216 00:11:27,080 --> 00:11:30,079 Speaker 1: you know, PET scans and fMRI, they show 217 00:11:30,240 --> 00:11:34,480 Speaker 1: relative activity. So they're charting activity and seeing what 218 00:11:34,600 --> 00:11:39,120 Speaker 1: areas get surges of activity relative to what they 219 00:11:39,160 --> 00:11:42,079 Speaker 1: were doing at other times. Right, the areas that 220 00:11:42,120 --> 00:11:44,800 Speaker 1: are not lit up are not lifeless. Right. But I mean, 221 00:11:44,880 --> 00:11:47,280 Speaker 1: even back to this, I want to know again what 222 00:11:47,280 --> 00:11:50,640 Speaker 1: it would actually mean to use your whole brain, or 223 00:11:50,679 --> 00:11:54,559 Speaker 1: to use certain percentages of your brain. I was reading 224 00:11:54,559 --> 00:11:56,920 Speaker 1: around about this and I found one pretty interesting thing. 225 00:11:57,000 --> 00:12:00,320 Speaker 1: So the neuroscientist Gabrielle-Ann Torre, I think she's at 226 00:12:00,360 --> 00:12:04,400 Speaker 1: Boston University now.
She wrote an interesting post about this 227 00:12:04,520 --> 00:12:06,959 Speaker 1: myth that I found on a website called Knowing Neurons, 228 00:12:07,000 --> 00:12:10,719 Speaker 1: and in one section of her article, she discusses difficulties 229 00:12:10,760 --> 00:12:14,280 Speaker 1: determining what it would actually mean to use, quote, a 230 00:12:14,360 --> 00:12:16,960 Speaker 1: hundred percent of your brain. Like, there's no single way 231 00:12:16,960 --> 00:12:19,680 Speaker 1: to measure what portion of the brain is being used 232 00:12:19,720 --> 00:12:22,720 Speaker 1: at any given time. The default mode network, of course, 233 00:12:22,840 --> 00:12:25,480 Speaker 1: is in operation throughout the brain pretty much all the time. 234 00:12:26,080 --> 00:12:29,079 Speaker 1: But does that not count as the brain being used? 235 00:12:29,120 --> 00:12:31,240 Speaker 1: I mean, even when you're at rest, the default mode 236 00:12:31,320 --> 00:12:34,640 Speaker 1: network is whirring away all throughout the brain, quite unfortunately 237 00:12:34,640 --> 00:12:36,840 Speaker 1: in some cases. In some cases? I don't know if 238 00:12:36,880 --> 00:12:38,640 Speaker 1: you'd want to be without your default mode network. Well, now, I 239 00:12:38,640 --> 00:12:40,520 Speaker 1: wouldn't want to give it up, but it is 240 00:12:40,600 --> 00:12:42,600 Speaker 1: kind of a thorn in my side at times. Oh, it 241 00:12:42,679 --> 00:12:45,880 Speaker 1: certainly can be. But then, interestingly, I really like this part. 242 00:12:45,920 --> 00:12:49,440 Speaker 1: So she tries to take this claim seriously, like, this 243 00:12:49,480 --> 00:12:53,160 Speaker 1: ten percent of the brain is being used claim, and 244 00:12:53,720 --> 00:12:55,640 Speaker 1: she's like, what would that actually mean?
Well, she offers 245 00:12:55,679 --> 00:12:58,440 Speaker 1: one interpretation of what it would mean to use a 246 00:12:58,520 --> 00:13:00,640 Speaker 1: hundred percent of your brain, and it would mean a 247 00:13:00,679 --> 00:13:04,120 Speaker 1: grand mal seizure. So it would be like Limitless, except 248 00:13:04,120 --> 00:13:06,920 Speaker 1: Bradley Cooper takes a magic pill and it just makes 249 00:13:06,960 --> 00:13:10,960 Speaker 1: his brain, like, almost explode. Yeah, exactly. So, 250 00:13:11,000 --> 00:13:13,840 Speaker 1: to read from Torre's words, quote, seizures are defined by 251 00:13:13,880 --> 00:13:16,760 Speaker 1: excessive and synchronous neural activity. If we wanted to use 252 00:13:16,800 --> 00:13:19,920 Speaker 1: a hundred percent of our brains, to stimulate each of 253 00:13:19,960 --> 00:13:23,160 Speaker 1: the brain's one hundred billion neurons (this is funny, we 254 00:13:23,200 --> 00:13:25,640 Speaker 1: should come back to this in a second) to maximum 255 00:13:25,679 --> 00:13:30,800 Speaker 1: capacity firing would result in a likely fatal physical experience. 256 00:13:30,840 --> 00:13:34,160 Speaker 1: To hope for synchronous excitatory activity across the cortex 257 00:13:34,240 --> 00:13:38,080 Speaker 1: is in many ways synonymous to a grand mal seizure. Uh, 258 00:13:38,120 --> 00:13:40,240 Speaker 1: this is the most severe type of seizure and leads 259 00:13:40,240 --> 00:13:43,720 Speaker 1: to loss of consciousness and severe muscle contractions, not the 260 00:13:43,800 --> 00:13:47,679 Speaker 1: unlocking of superhuman abilities. So I think, obviously, we don't 261 00:13:47,920 --> 00:13:51,960 Speaker 1: just want universal synchronous patterns of excitation of every neuron 262 00:13:52,000 --> 00:13:56,720 Speaker 1: in the cortex. That's stupid. That's not something to wish for, right? 263 00:13:56,800 --> 00:14:00,839 Speaker 1: It's like, you don't want the brain working that hard universally.
264 00:14:00,880 --> 00:14:02,360 Speaker 1: It's like, you don't want 265 00:14:02,360 --> 00:14:05,640 Speaker 1: your Xbox to be clearly just heating up, 266 00:14:05,720 --> 00:14:07,920 Speaker 1: like the fan is about to explode, like, 267 00:14:08,040 --> 00:14:10,280 Speaker 1: you know, something's wrong with it. Yeah, my CPU is 268 00:14:10,320 --> 00:14:16,360 Speaker 1: at a hundred constantly. And she also mentions the default mode network. 269 00:14:16,440 --> 00:14:20,200 Speaker 1: You know, it's this diffuse, interconnected set of activity nodes 270 00:14:20,280 --> 00:14:23,360 Speaker 1: that show activation when the brain is otherwise at rest. 271 00:14:23,520 --> 00:14:27,000 Speaker 1: You know, it shows the brain is pretty much never dormant. 272 00:14:27,000 --> 00:14:30,000 Speaker 1: There's activity throughout most of the brain most of the time. 273 00:14:30,480 --> 00:14:34,520 Speaker 1: So where did this clearly false claim come from? The 274 00:14:34,560 --> 00:14:36,840 Speaker 1: answer actually isn't known for sure, but one 275 00:14:36,880 --> 00:14:41,280 Speaker 1: commonly cited early example I found is a quote falsely attributed to 276 00:14:41,320 --> 00:14:44,520 Speaker 1: the American psychologist William James, the author of a great 277 00:14:44,560 --> 00:14:47,320 Speaker 1: classic text, The Varieties of Religious Experience, which is still 278 00:14:47,320 --> 00:14:49,760 Speaker 1: interesting to read parts of today. Yeah, we've cited him 279 00:14:49,800 --> 00:14:52,040 Speaker 1: a few different times on the show.
Yeah, but apparently 280 00:14:52,160 --> 00:14:56,040 Speaker 1: James did write something generically kind of like this, but 281 00:14:56,080 --> 00:14:58,800 Speaker 1: without a number cited, uh, in a piece called The 282 00:14:58,920 --> 00:15:02,160 Speaker 1: Energies of Men, where he wrote, quote, we are making 283 00:15:02,280 --> 00:15:04,760 Speaker 1: use of only a small part of our possible mental 284 00:15:04,760 --> 00:15:08,480 Speaker 1: and physical resources, which is a very different kind of statement. Right. Yeah, 285 00:15:08,480 --> 00:15:10,400 Speaker 1: he's not putting a fine number on it. 286 00:15:10,400 --> 00:15:14,960 Speaker 1: I mean, that statement alone is not 287 00:15:15,000 --> 00:15:18,240 Speaker 1: implying that portions of the brain are inactive, just that, 288 00:15:18,600 --> 00:15:21,480 Speaker 1: you know, we're not taking full advantage of 289 00:15:21,480 --> 00:15:25,200 Speaker 1: our neurological potential. And I want to mention a bit 290 00:15:25,240 --> 00:15:27,720 Speaker 1: more about that in a second. Uh, so, as to 291 00:15:27,880 --> 00:15:30,280 Speaker 1: the origin of the phrase, the ten percent figure is 292 00:15:30,320 --> 00:15:33,280 Speaker 1: also cited in the nineteen thirties, as has been 293 00:15:33,280 --> 00:15:35,920 Speaker 1: pointed out by quite a few investigators, by the author 294 00:15:36,040 --> 00:15:40,160 Speaker 1: of an introduction to Dale Carnegie's classic self-help book 295 00:15:40,200 --> 00:15:43,760 Speaker 1: How to Win Friends and Influence People, which, possibly little 296 00:15:43,800 --> 00:15:47,680 Speaker 1: known fact, was Charles Manson's favorite book. Really? Yeah, a classic 297 00:15:47,760 --> 00:15:50,960 Speaker 1: of business leaders and serial killers alike. But this explains 298 00:15:50,960 --> 00:15:53,160 Speaker 1: it a bit though.
Right, we have here an introduction 299 00:15:53,240 --> 00:15:57,360 Speaker 1: to a widely read popular book. Yeah, and the author 300 00:15:57,400 --> 00:15:59,720 Speaker 1: of the introduction says, you know, we 301 00:15:59,800 --> 00:16:02,320 Speaker 1: only use ten percent of our brains. The vast amount 302 00:16:02,360 --> 00:16:05,720 Speaker 1: of our mental potential is untapped. And I mentioned this 303 00:16:05,760 --> 00:16:07,680 Speaker 1: a minute ago, but I just want to hammer 304 00:16:07,720 --> 00:16:10,360 Speaker 1: it home again. I feel like I've seen this nonsensical ten 305 00:16:10,400 --> 00:16:14,680 Speaker 1: percent fact invoked by people selling some of the various, 306 00:16:14,720 --> 00:16:19,160 Speaker 1: like, brain booster pills and supposed nootropics supplements. Uh, 307 00:16:19,520 --> 00:16:22,120 Speaker 1: I would advise people to be cautious about that kind 308 00:16:22,160 --> 00:16:24,560 Speaker 1: of stuff. And I'm not ruling out the possibility that 309 00:16:24,600 --> 00:16:28,000 Speaker 1: there are some nutritional supplements or drugs that might in 310 00:16:28,080 --> 00:16:31,400 Speaker 1: some cases have a small effect on mental performance. But 311 00:16:31,440 --> 00:16:33,720 Speaker 1: if somebody is trying to sell you pills that will 312 00:16:33,840 --> 00:16:37,560 Speaker 1: unlock the untapped godlike potential of the, you know, the 313 00:16:37,600 --> 00:16:40,640 Speaker 1: paid-access part of your brain, the brain's premium content, 314 00:16:41,040 --> 00:16:43,840 Speaker 1: as a general rule, be suspicious of this. I would.
315 00:16:44,080 --> 00:16:46,600 Speaker 1: I would advise, in general, don't trust them and don't 316 00:16:46,600 --> 00:16:49,560 Speaker 1: give them your money, because they're certainly tapping into something 317 00:16:50,160 --> 00:16:51,680 Speaker 1: I think we all know to be true, and we 318 00:16:51,720 --> 00:16:54,040 Speaker 1: all certainly want to be true: the idea that there 319 00:16:54,120 --> 00:16:58,080 Speaker 1: is room for improvement. Exactly, that we can learn new things, 320 00:16:58,160 --> 00:17:00,200 Speaker 1: that we can take on new patterns. And 321 00:17:00,240 --> 00:17:01,920 Speaker 1: all of these things are true. You don't need a 322 00:17:01,920 --> 00:17:04,199 Speaker 1: phony number to back that up. Exactly right. I mean, 323 00:17:04,240 --> 00:17:05,879 Speaker 1: I think a large part of the basis for this 324 00:17:05,920 --> 00:17:09,560 Speaker 1: false fact about ten percent lies not in how much 325 00:17:09,640 --> 00:17:12,400 Speaker 1: of our brains we use as a ratio, 326 00:17:12,880 --> 00:17:15,640 Speaker 1: but in the way that we use our brains. And 327 00:17:15,800 --> 00:17:18,280 Speaker 1: the most salient example that comes to my mind is 328 00:17:18,680 --> 00:17:22,159 Speaker 1: the psychologist Daniel Kahneman's really useful metaphors of system 329 00:17:22,200 --> 00:17:25,280 Speaker 1: one versus system two thinking. You know, so system one 330 00:17:25,280 --> 00:17:27,760 Speaker 1: we've discussed on the podcast before, but system one is 331 00:17:27,800 --> 00:17:31,240 Speaker 1: a suite of human brain functions that allow us to 332 00:17:31,280 --> 00:17:36,400 Speaker 1: process information in a fast, easy, automatic, and approximate way. 333 00:17:36,440 --> 00:17:40,480 Speaker 1: It's intuition. So, you know, think rules of thumb, stereotypes, 334 00:17:40,680 --> 00:17:45,920 Speaker 1: intuitive and reactive thinking, eyeballing it.
Meanwhile, system two is 335 00:17:45,960 --> 00:17:49,679 Speaker 1: described as the slow, deliberate, effortful thinking, the kind of 336 00:17:49,720 --> 00:17:52,560 Speaker 1: reasoning you use when you make a list of pros 337 00:17:52,600 --> 00:17:55,280 Speaker 1: and cons, or you solve a math problem, or you 338 00:17:55,400 --> 00:17:59,439 Speaker 1: carefully plan a route of movement somewhere, or you effortfully 339 00:17:59,480 --> 00:18:02,560 Speaker 1: trace out the logic of an argument. We're all capable 340 00:18:02,600 --> 00:18:05,439 Speaker 1: of both kinds of thinking, but we also rely on 341 00:18:05,600 --> 00:18:08,639 Speaker 1: system one most of the time for most things, and 342 00:18:08,680 --> 00:18:12,520 Speaker 1: that's because of efficiency. Like, you can't process all information 343 00:18:12,560 --> 00:18:15,639 Speaker 1: in a slow, effortful way, you don't have time. And 344 00:18:15,720 --> 00:18:18,119 Speaker 1: system one is not all bad. In some tasks, I 345 00:18:18,160 --> 00:18:20,960 Speaker 1: think it might actually be superior. Like consider the way 346 00:18:21,000 --> 00:18:26,360 Speaker 1: that overthinking some athletic tasks like shooting a basketball actually 347 00:18:26,400 --> 00:18:28,840 Speaker 1: makes you worse at them. You know, like a lot 348 00:18:28,840 --> 00:18:31,320 Speaker 1: of players find that they sink more baskets if they 349 00:18:31,359 --> 00:18:33,720 Speaker 1: just try to relax and let the shot happen rather 350 00:18:33,760 --> 00:18:37,160 Speaker 1: than focusing on the distance and the angle and everything.
351 00:18:37,760 --> 00:18:39,960 Speaker 1: And I think this actually even comes through in some 352 00:18:40,000 --> 00:18:43,040 Speaker 1: more abstract tasks, like sometimes I think people can be 353 00:18:43,040 --> 00:18:46,200 Speaker 1: better writers if they just kind of enter 354 00:18:46,240 --> 00:18:50,320 Speaker 1: a flow state and not sit there overthinking, deliberating 355 00:18:50,359 --> 00:18:53,960 Speaker 1: over the sentences they're constructing. This is very interesting, and 356 00:18:54,200 --> 00:18:56,160 Speaker 1: hopefully this won't be too much of a tangent here, 357 00:18:56,200 --> 00:18:58,800 Speaker 1: but we also have to be careful 358 00:18:58,800 --> 00:19:01,600 Speaker 1: about thinking of system one and system two as being 359 00:19:01,680 --> 00:19:05,760 Speaker 1: just complete Jekyll and Hydes. Because actually at the World 360 00:19:05,760 --> 00:19:08,320 Speaker 1: Science Festival, another panel I attended had to do with 361 00:19:08,440 --> 00:19:12,000 Speaker 1: risky behavior and risk taking, especially in extreme sports, 362 00:19:12,080 --> 00:19:15,680 Speaker 1: such as, say, free solo climbing. And free 363 00:19:15,680 --> 00:19:17,879 Speaker 1: solo climbing is a great example because you 364 00:19:17,960 --> 00:19:21,760 Speaker 1: have certain individuals who definitely display more of 365 00:19:21,800 --> 00:19:25,560 Speaker 1: a system two approach, like they are methodical.
They're 366 00:19:25,600 --> 00:19:29,080 Speaker 1: not climbing that 367 00:19:29,080 --> 00:19:32,639 Speaker 1: sheer cliff without the aid of ropes, you know, 368 00:19:33,280 --> 00:19:36,320 Speaker 1: without having practiced it many times, without having climbed it 369 00:19:36,359 --> 00:19:39,840 Speaker 1: many times with ropes and thinking through several moves ahead, 370 00:19:39,960 --> 00:19:42,200 Speaker 1: so you see where you're going to go. On the other hand, 371 00:19:42,240 --> 00:19:45,359 Speaker 1: you have individuals who do have more of an impulsive 372 00:19:46,000 --> 00:19:50,760 Speaker 1: approach to risky behaviors, risk-taking activities. But like with 373 00:19:50,800 --> 00:19:53,240 Speaker 1: the basketball, you can say, well, perhaps you're 374 00:19:53,280 --> 00:19:56,800 Speaker 1: engaging in system two to practice so that you can 375 00:19:56,840 --> 00:20:00,520 Speaker 1: reach the point where you can engage with the challenge 376 00:20:00,800 --> 00:20:03,880 Speaker 1: with a system one mindset. I think that's exactly right. So, yeah, 377 00:20:03,880 --> 00:20:06,720 Speaker 1: system one is not necessarily bad. It's not all bad. 378 00:20:07,040 --> 00:20:10,240 Speaker 1: Sometimes it's bad. And I think we're all aware of 379 00:20:10,320 --> 00:20:16,040 Speaker 1: cases where we have used fast, intuitive, reactive, likely inaccurate 380 00:20:16,119 --> 00:20:19,320 Speaker 1: thinking when we know that we probably could have and 381 00:20:19,359 --> 00:20:22,840 Speaker 1: should have stopped to think things out slowly and deliberately.
382 00:20:22,960 --> 00:20:26,119 Speaker 1: Like you can probably immediately think of examples where you 383 00:20:26,160 --> 00:20:29,320 Speaker 1: really wish you had switched over to system two thinking, 384 00:20:29,600 --> 00:20:32,280 Speaker 1: but you made a fast intuitive decision and you came 385 00:20:32,320 --> 00:20:34,679 Speaker 1: to regret it. And I think it's this kind of 386 00:20:34,760 --> 00:20:39,440 Speaker 1: frequently squandered mental potential that becomes the nugget of truth 387 00:20:39,520 --> 00:20:42,400 Speaker 1: in the ten percent factoid. We use our whole brains, 388 00:20:42,440 --> 00:20:45,520 Speaker 1: but we don't always use our brains as effectively as 389 00:20:45,560 --> 00:20:49,520 Speaker 1: we know we're capable of in every scenario, because it's hard, 390 00:20:49,760 --> 00:20:52,760 Speaker 1: because we're in a hurry. Does that make sense? Absolutely? Yeah. 391 00:20:52,800 --> 00:20:55,600 Speaker 1: We live life. We have to engage both approaches 392 00:20:55,760 --> 00:20:58,240 Speaker 1: to our decision making. Yeah. All right. On that note, 393 00:20:58,240 --> 00:20:59,639 Speaker 1: we're going to take a quick break, but when we 394 00:20:59,680 --> 00:21:02,440 Speaker 1: come back we're going to return to the idea of brain 395 00:21:02,520 --> 00:21:05,399 Speaker 1: soup and to the work of Suzana Herculano-Houzel. 396 00:21:06,160 --> 00:21:11,080 Speaker 1: Thanks. Alright, we're back. So we got sidetracked on the 397 00:21:11,080 --> 00:21:14,040 Speaker 1: idea of whether or not people actually use more 398 00:21:14,080 --> 00:21:16,639 Speaker 1: than ten percent of their brains. We do, in pretty 399 00:21:16,680 --> 00:21:19,760 Speaker 1: much any way you interpret that. But that came up 400 00:21:19,800 --> 00:21:23,360 Speaker 1: because it was the basis of a journey of research 401 00:21:23,480 --> 00:21:27,080 Speaker 1: that the neuroscientist Suzana Herculano-Houzel went on.
402 00:21:27,280 --> 00:21:29,800 Speaker 1: And so let's pick back up with her story from 403 00:21:29,800 --> 00:21:31,879 Speaker 1: the event you saw in New York City. All right, 404 00:21:31,920 --> 00:21:34,600 Speaker 1: so she was talking about another number that she kept 405 00:21:34,600 --> 00:21:37,480 Speaker 1: coming across that she found curious, that she found suspicious, 406 00:21:38,040 --> 00:21:41,359 Speaker 1: and that is the number one hundred billion. Oh now, this 407 00:21:41,480 --> 00:21:43,359 Speaker 1: just came up a minute ago when we were talking 408 00:21:43,359 --> 00:21:46,040 Speaker 1: about an approximation. Now I don't want to indict the 409 00:21:46,040 --> 00:21:50,360 Speaker 1: author that cited it, because it's a reasonable approximation, but well, yeah, 410 00:21:50,400 --> 00:21:52,920 Speaker 1: and we'll get into that. But yeah, 411 00:21:52,960 --> 00:21:55,560 Speaker 1: she kept running across this idea that the human brain 412 00:21:55,640 --> 00:21:58,880 Speaker 1: contains a hundred billion neurons and ten times as many 413 00:21:58,880 --> 00:22:02,240 Speaker 1: glial cells. But the thing is, unlike that ten percent 414 00:22:02,320 --> 00:22:04,959 Speaker 1: of the brain thing, this was not merely the domain 415 00:22:05,040 --> 00:22:08,440 Speaker 1: of pop culture and science fiction. No, this figure 416 00:22:08,560 --> 00:22:12,199 Speaker 1: was frequently cited by neuroscientists, by psychologists, and sometimes 417 00:22:12,240 --> 00:22:15,080 Speaker 1: applied as a comparison to the number of 418 00:22:15,160 --> 00:22:17,879 Speaker 1: stars in the Milky Way. I imagine a number of 419 00:22:17,920 --> 00:22:20,439 Speaker 1: you have heard this one before. What, a hundred billion 420 00:22:20,440 --> 00:22:22,560 Speaker 1: stars in the Milky Way.
Yeah. So the idea of 421 00:22:22,640 --> 00:22:24,639 Speaker 1: being like, hey, you have a hundred billion 422 00:22:24,680 --> 00:22:26,760 Speaker 1: neurons in your head. That's as many stars as there 423 00:22:26,800 --> 00:22:29,120 Speaker 1: are in the Milky Way. Which, I mean, on one level, 424 00:22:29,160 --> 00:22:31,640 Speaker 1: I think that's a useful metaphor, because you're basically making 425 00:22:31,640 --> 00:22:35,880 Speaker 1: the statement that inner space is as 426 00:22:35,920 --> 00:22:38,800 Speaker 1: complex and vast as we take outer space to be. 427 00:22:39,200 --> 00:22:41,159 Speaker 1: But on the other hand, when you start looking at 428 00:22:41,200 --> 00:22:44,720 Speaker 1: those actual numbers, there are some problems on both sides, 429 00:22:45,359 --> 00:22:48,159 Speaker 1: because for starters, the hundred billion star estimate in the 430 00:22:48,160 --> 00:22:51,200 Speaker 1: Milky Way is not a solid number. By some estimations, 431 00:22:51,240 --> 00:22:53,640 Speaker 1: the Milky Way has the mass of a hundred billion 432 00:22:53,720 --> 00:22:57,320 Speaker 1: solar masses, but other estimates say four hundred billion or 433 00:22:57,320 --> 00:23:00,760 Speaker 1: even seven hundred billion solar masses is more accurate. Yeah, 434 00:23:00,760 --> 00:23:03,880 Speaker 1: this is a fascinating question on its own. So it's 435 00:23:03,880 --> 00:23:05,960 Speaker 1: difficult to know the number of stars in the Milky 436 00:23:06,000 --> 00:23:09,520 Speaker 1: Way galaxy because obviously we can't just count them; 437 00:23:09,520 --> 00:23:11,879 Speaker 1: you can't just look up there and count them. The best 438 00:23:11,920 --> 00:23:14,480 Speaker 1: we can do is estimate on the basis of the 439 00:23:14,520 --> 00:23:17,159 Speaker 1: mass and luminosity of the galaxy as a whole.
But 440 00:23:17,520 --> 00:23:20,280 Speaker 1: this presents difficulties too, because there are plenty of things 441 00:23:20,400 --> 00:23:24,280 Speaker 1: in the galaxy other than stars. Right, the vast majority 442 00:23:24,320 --> 00:23:27,560 Speaker 1: of the mass of our galaxy is not even normal matter. 443 00:23:27,640 --> 00:23:29,919 Speaker 1: It seems to be dark matter, which is still an 444 00:23:30,000 --> 00:23:33,960 Speaker 1: unsolved mystery in astrophysics. And dark matter, of course, is 445 00:23:34,000 --> 00:23:37,919 Speaker 1: this hypothetical stuff that we detect by measuring its mass, 446 00:23:37,960 --> 00:23:41,080 Speaker 1: in other words, the gravitational effect it has on stuff 447 00:23:41,119 --> 00:23:44,400 Speaker 1: around it, but it doesn't appear to interact with electromagnetic 448 00:23:44,480 --> 00:23:48,200 Speaker 1: radiation like light, making it unlike any other ordinary matter 449 00:23:48,280 --> 00:23:50,520 Speaker 1: that we know of. So at this point we don't 450 00:23:50,520 --> 00:23:52,399 Speaker 1: know what that stuff is. And that's most of the 451 00:23:52,440 --> 00:23:55,000 Speaker 1: mass out there. And then past that, we know that 452 00:23:55,119 --> 00:23:57,159 Speaker 1: a lot of the ordinary matter in the galaxy is 453 00:23:57,160 --> 00:24:01,480 Speaker 1: not just stars. Some of it is unincorporated debris, 454 00:24:01,680 --> 00:24:05,800 Speaker 1: cosmic gas and dust, just hydrogen clouds out there. Of 455 00:24:05,840 --> 00:24:08,119 Speaker 1: course that's not the majority of the luminous matter, but 456 00:24:08,119 --> 00:24:11,240 Speaker 1: it's enough to complicate the problem of estimating star counts.
457 00:24:11,560 --> 00:24:13,520 Speaker 1: And then on top of that, you've got ice and 458 00:24:13,600 --> 00:24:17,080 Speaker 1: comets and planets and black holes and a supermassive black 459 00:24:17,080 --> 00:24:19,320 Speaker 1: hole at the center of the galaxy. And on top 460 00:24:19,359 --> 00:24:22,720 Speaker 1: of that also, stars are of dramatically different masses, so 461 00:24:22,760 --> 00:24:26,080 Speaker 1: you have to figure out the right average stellar mass 462 00:24:26,119 --> 00:24:29,640 Speaker 1: to divide by. So it's almost like, well, this isn't 463 00:24:29,680 --> 00:24:31,680 Speaker 1: a perfect metaphor, but I was trying to think of one. 464 00:24:31,840 --> 00:24:35,439 Speaker 1: It's almost like catching an unknown species of fish and 465 00:24:35,480 --> 00:24:37,920 Speaker 1: then weighing it and then trying to guess how many 466 00:24:38,040 --> 00:24:41,000 Speaker 1: bones it has. You know, like, you can't count them 467 00:24:41,000 --> 00:24:43,960 Speaker 1: just from looking from the outside. And the bones 468 00:24:44,000 --> 00:24:46,920 Speaker 1: aren't all the same size and weight. And there's 469 00:24:46,920 --> 00:24:49,480 Speaker 1: a lot of other stuff in there too that's not bones. 470 00:24:49,520 --> 00:24:52,040 Speaker 1: So you can sort of estimate based on a 471 00:24:52,119 --> 00:24:55,439 Speaker 1: number of likely assumptions, but you can't get an accurate count, 472 00:24:55,720 --> 00:24:58,679 Speaker 1: which leads us back to the human brain and this 473 00:24:58,880 --> 00:25:01,760 Speaker 1: idea of this number that was thrown around, again, not 474 00:25:02,359 --> 00:25:05,280 Speaker 1: just in science fiction films, but in peer-reviewed papers 475 00:25:05,280 --> 00:25:08,119 Speaker 1: and in textbooks as well.
It was 476 00:25:08,200 --> 00:25:11,040 Speaker 1: just out there in sort of the scientific 477 00:25:11,080 --> 00:25:14,080 Speaker 1: zeitgeist, the idea that there were a hundred billion neurons 478 00:25:14,119 --> 00:25:16,199 Speaker 1: in the human brain. And we didn't know where this 479 00:25:16,280 --> 00:25:21,159 Speaker 1: number came from, right. And so Suzana Herculano-Houzel, she 480 00:25:21,280 --> 00:25:23,639 Speaker 1: was wondering too, and so she started looking into it, 481 00:25:23,640 --> 00:25:26,200 Speaker 1: and she looked and looked, and she found no 482 00:25:26,280 --> 00:25:29,320 Speaker 1: pre-existing count, no scientific basis for the number at all. 483 00:25:29,359 --> 00:25:32,080 Speaker 1: The number was just floating around in the scientific world. 484 00:25:32,800 --> 00:25:36,240 Speaker 1: So the obvious solution is, well, somebody's got to 485 00:25:36,600 --> 00:25:39,840 Speaker 1: look it up. Somebody's got to count these neurons, and 486 00:25:39,880 --> 00:25:42,679 Speaker 1: so she set out to do just that via a 487 00:25:42,720 --> 00:25:45,840 Speaker 1: novel tool that she developed in the lab in two 488 00:25:45,920 --> 00:25:49,760 Speaker 1: thousand five, and that is turning brains into soup, the 489 00:25:49,840 --> 00:25:54,280 Speaker 1: soupification of the animal brain in question. 490 00:25:54,960 --> 00:25:58,239 Speaker 1: Did she say what the soup tastes like? No. I 491 00:25:58,240 --> 00:26:02,919 Speaker 1: believe she said that the soup looked like unfiltered apple juice, 492 00:26:03,400 --> 00:26:06,080 Speaker 1: and apparently turned some of the students off of apple juice, 493 00:26:06,119 --> 00:26:08,160 Speaker 1: perhaps forever. I mean, I've seen it, it looks 494 00:26:08,200 --> 00:26:11,280 Speaker 1: like murky swamp water.
Yeah, so I doubt that anybody 495 00:26:11,320 --> 00:26:14,600 Speaker 1: tasted it, because this is sort of like priceless research material. 496 00:26:15,280 --> 00:26:18,480 Speaker 1: But I don't know. Then again, you wonder if 497 00:26:18,520 --> 00:26:21,240 Speaker 1: somebody got curious at some point. I don't know. 498 00:26:21,359 --> 00:26:24,240 Speaker 1: She's a wonderful communicator, 499 00:26:24,520 --> 00:26:27,000 Speaker 1: and she has a wonderful sense of humor. I imagine 500 00:26:27,000 --> 00:26:30,280 Speaker 1: if somebody had tasted it, even accidentally, she 501 00:26:30,320 --> 00:26:32,240 Speaker 1: would have mentioned it in one of the talks or 502 00:26:32,520 --> 00:26:34,520 Speaker 1: write-ups or interviews that we looked at. You 503 00:26:34,600 --> 00:26:36,439 Speaker 1: know what, I've got a guess as to what it 504 00:26:36,440 --> 00:26:39,800 Speaker 1: tastes like. I bet it tastes like chicken stock. 505 00:26:40,040 --> 00:26:43,040 Speaker 1: Probably a good guess, 506 00:26:43,080 --> 00:26:47,200 Speaker 1: but we'll get into that. So, basically, yeah, 507 00:26:47,240 --> 00:26:49,680 Speaker 1: she set out to make brains into soup. 508 00:26:50,200 --> 00:26:53,960 Speaker 1: Basically, she came across previous lab efforts from 509 00:26:54,000 --> 00:26:56,639 Speaker 1: the nineteen seventies to turn the brain into soup to 510 00:26:56,640 --> 00:27:00,520 Speaker 1: measure DNA concentrations. And one of the keys here 511 00:27:00,600 --> 00:27:03,919 Speaker 1: is that soup is a homogeneous solution.
So if you 512 00:27:03,960 --> 00:27:08,560 Speaker 1: reduce something like the brain to soup, then 513 00:27:08,640 --> 00:27:10,639 Speaker 1: you have a better ability to get a 514 00:27:10,680 --> 00:27:14,199 Speaker 1: sample of that soup, do a count within that sample, and 515 00:27:14,240 --> 00:27:18,399 Speaker 1: then apply that, you know, to the full volume of 516 00:27:18,440 --> 00:27:20,920 Speaker 1: the soup. So anyway, she said, well, I could 517 00:27:21,000 --> 00:27:24,400 Speaker 1: use this method then to figure out how many 518 00:27:24,400 --> 00:27:27,400 Speaker 1: neurons are in the brain. You know, liquefy the cell membranes but leave 519 00:27:27,440 --> 00:27:30,840 Speaker 1: the nuclei intact, allowing you 520 00:27:30,880 --> 00:27:33,639 Speaker 1: to count the remaining nuclei in a small sample and 521 00:27:33,680 --> 00:27:36,280 Speaker 1: then multiply the number by the overall volume to get 522 00:27:36,320 --> 00:27:38,480 Speaker 1: the whole-brain total of neurons. This is great. You 523 00:27:38,520 --> 00:27:41,159 Speaker 1: can't do this with a galaxy, can you? Like, you 524 00:27:41,200 --> 00:27:43,800 Speaker 1: can't just look at one section of a galaxy and say, okay, 525 00:27:43,840 --> 00:27:46,360 Speaker 1: now multiply that by the total area of the galaxy, 526 00:27:46,440 --> 00:27:49,840 Speaker 1: because galaxies are not homogeneous. You would have to turn 527 00:27:49,880 --> 00:27:52,760 Speaker 1: it into soup first, which you cannot do, or, I 528 00:27:52,800 --> 00:27:54,800 Speaker 1: mean, not on our scale. Anyway, you would have to 529 00:27:54,840 --> 00:27:57,679 Speaker 1: have godlike powers, and then you would destroy the universe. 530 00:27:57,840 --> 00:28:00,920 Speaker 1: Maybe that's how the world ends. That's the true Ragnarok scenario, 531 00:28:01,600 --> 00:28:05,560 Speaker 1: is the god or gods ask the question, how many planets?
532 00:28:05,600 --> 00:28:08,400 Speaker 1: How many stars are kicking around this thing? Well, let's 533 00:28:08,400 --> 00:28:10,640 Speaker 1: turn it to soup and find out. Exactly. Loki is told, 534 00:28:10,880 --> 00:28:13,040 Speaker 1: oh, you can't count the number of stars in the sky, 535 00:28:13,200 --> 00:28:16,240 Speaker 1: and he's like, watch me. Well, this is 536 00:28:16,280 --> 00:28:18,880 Speaker 1: exactly the kind of gamble that ancient gods would 537 00:28:18,920 --> 00:28:21,560 Speaker 1: get into. You know, just a mere bet, and 538 00:28:21,720 --> 00:28:25,840 Speaker 1: everything for mortals is at stake, right. But anyway, 539 00:28:25,920 --> 00:28:29,240 Speaker 1: Herculano-Houzel was not interested in that. She's interested in the brain. 540 00:28:29,320 --> 00:28:32,959 Speaker 1: So the main challenge with souping the brain was souping 541 00:28:33,000 --> 00:28:35,760 Speaker 1: it just enough to retain the cell nuclei, without 542 00:28:35,760 --> 00:28:40,000 Speaker 1: breaking any of that down. Initial souping experiments via essentially 543 00:28:40,000 --> 00:28:43,440 Speaker 1: a detergent went too far, and attempts to flash freeze 544 00:28:43,440 --> 00:28:46,320 Speaker 1: brains with liquid nitrogen and then blend them, well, 545 00:28:46,320 --> 00:28:48,120 Speaker 1: that just caused a cracked mess. And she says that 546 00:28:48,120 --> 00:28:50,560 Speaker 1: there were like frozen pieces of brain all over the place. 547 00:28:51,000 --> 00:28:53,480 Speaker 1: I think if anyone was going to taste her brain soup, 548 00:28:53,520 --> 00:28:55,560 Speaker 1: that was probably when it would have occurred. 549 00:28:56,520 --> 00:28:59,440 Speaker 1: But anyway, then she found a solution: fixing the brain 550 00:28:59,480 --> 00:29:02,360 Speaker 1: tissue with formaldehyde before dissolving it.
551 00:29:02,400 --> 00:29:05,800 Speaker 1: And the result, she says, again, looks like unfiltered apple 552 00:29:05,880 --> 00:29:08,560 Speaker 1: juice and allows this kind of count to take place. Right, so 553 00:29:08,640 --> 00:29:11,360 Speaker 1: you can pull out a small sample, you know what 554 00:29:11,440 --> 00:29:14,240 Speaker 1: the volume of that is compared to the entire volume 555 00:29:14,240 --> 00:29:16,440 Speaker 1: of the soup, and then you count the number of 556 00:29:16,880 --> 00:29:20,000 Speaker 1: neuron nuclei in the small sample and then multiply that 557 00:29:20,040 --> 00:29:23,440 Speaker 1: by the total volume. Exactly. Okay. So I think I 558 00:29:23,480 --> 00:29:26,520 Speaker 1: mentioned earlier there's a TED talk she did where, 559 00:29:26,520 --> 00:29:29,560 Speaker 1: anyway, in the middle of this 560 00:29:29,600 --> 00:29:32,040 Speaker 1: TED talk, she shows off a vial of this brain 561 00:29:32,080 --> 00:29:34,400 Speaker 1: soup on the stage. It's a little glass jar of 562 00:29:34,520 --> 00:29:36,880 Speaker 1: mouse brain soup, or it might be plastic, I don't know. It's 563 00:29:36,880 --> 00:29:39,080 Speaker 1: a jar of mouse brain soup. It looks like murky 564 00:29:39,120 --> 00:29:41,600 Speaker 1: swamp water. It's kind of sort of like an 565 00:29:41,680 --> 00:29:44,560 Speaker 1: off-beige kind of color. One of the best things about 566 00:29:44,600 --> 00:29:47,360 Speaker 1: this particular talk, though, is that when she shows off 567 00:29:47,400 --> 00:29:49,600 Speaker 1: the jar, she doesn't go and pick it up from 568 00:29:49,600 --> 00:29:53,520 Speaker 1: a table or demonstration stand or something. She's just got 569 00:29:53,520 --> 00:29:55,960 Speaker 1: it tucked into the back of her belt like a gun.
570 00:29:57,040 --> 00:29:59,760 Speaker 1: She just pulls it out, and then when she's done 571 00:29:59,760 --> 00:30:02,200 Speaker 1: showing it off, she just slips it right back 572 00:30:02,240 --> 00:30:04,320 Speaker 1: in under the belt. I mean, I guess it must 573 00:30:04,320 --> 00:30:06,640 Speaker 1: have been there all day. Well, you know, one 574 00:30:06,640 --> 00:30:08,440 Speaker 1: of the things that she pointed out in the talk 575 00:30:08,520 --> 00:30:10,440 Speaker 1: is that when she started doing these experiments, some 576 00:30:11,000 --> 00:30:13,360 Speaker 1: people objected. They were like, what are you doing 577 00:30:13,400 --> 00:30:16,680 Speaker 1: with these precious brains? As if she were wasting brains, 578 00:30:16,720 --> 00:30:19,800 Speaker 1: as if she were essentially the Dan Aykroyd character from 579 00:30:19,800 --> 00:30:23,200 Speaker 1: It Came from Hollywood, just, you know, massacring brain material. 580 00:30:23,600 --> 00:30:25,640 Speaker 1: But she pointed out that, you know, aside from 581 00:30:25,680 --> 00:30:29,040 Speaker 1: it being highly useful, as we'll explore the rest 582 00:30:29,040 --> 00:30:32,400 Speaker 1: of this episode, also they freeze it, and so they 583 00:30:32,400 --> 00:30:34,000 Speaker 1: can save it for later. She says she has a 584 00:30:34,000 --> 00:30:37,680 Speaker 1: whole database. The freezer is full of brain soup. I'm 585 00:30:37,720 --> 00:30:41,640 Speaker 1: sure all properly labeled and dated. Yes. So what they 586 00:30:41,640 --> 00:30:44,400 Speaker 1: did is, yeah, they applied a fluorescent stain to differentiate 587 00:30:44,640 --> 00:30:48,480 Speaker 1: the neurons from other cells.
And she started with rats 588 00:30:48,480 --> 00:30:51,880 Speaker 1: and other rodents, eventually working up to human brains, and 589 00:30:52,120 --> 00:30:57,440 Speaker 1: eventually she had a verified count: not one hundred billion neurons, 590 00:30:57,480 --> 00:31:01,400 Speaker 1: but eighty-six billion neurons. Apparently this can be higher, 591 00:31:01,440 --> 00:31:03,400 Speaker 1: as high as ninety-one billion, for instance, but 592 00:31:03,440 --> 00:31:07,440 Speaker 1: eighty-six billion is like the ballpark count. That's, uh, well, 593 00:31:07,480 --> 00:31:09,520 Speaker 1: that doesn't seem too far off. I mean, that 594 00:31:09,560 --> 00:31:13,680 Speaker 1: seems within a rough order of magnitude level of reasonableness. 595 00:31:13,880 --> 00:31:16,560 Speaker 1: Well, it does and it doesn't. Like, you know, she 596 00:31:16,640 --> 00:31:18,600 Speaker 1: points out that this may not sound like 597 00:31:18,720 --> 00:31:20,480 Speaker 1: much of a difference to a lot of us. And 598 00:31:20,480 --> 00:31:22,080 Speaker 1: I have to admit, when I first heard it, 599 00:31:22,160 --> 00:31:24,560 Speaker 1: I didn't really register that it was that much different, 600 00:31:24,560 --> 00:31:27,800 Speaker 1: a hundred billion versus eighty-six. So it sounds like, all right, well, 601 00:31:28,000 --> 00:31:31,440 Speaker 1: we almost got it right. And the thing is, 602 00:31:31,560 --> 00:31:35,440 Speaker 1: you know, she says that the missing fourteen billion neurons is really 603 00:31:35,560 --> 00:31:38,680 Speaker 1: not a small sum when it comes to neural change, 604 00:31:38,760 --> 00:31:42,200 Speaker 1: and when we're talking about the evolution of brains, 605 00:31:42,480 --> 00:31:45,600 Speaker 1: she says, the difference there is an entire baboon brain, 606 00:31:45,720 --> 00:31:48,920 Speaker 1: and change.
So yeah, that's a lot of neural 607 00:31:48,920 --> 00:31:51,840 Speaker 1: power we're talking about there, and to be off 608 00:31:52,240 --> 00:31:56,000 Speaker 1: on that can, you know, have consequences, as we'll 609 00:31:56,040 --> 00:31:58,760 Speaker 1: explore, when we're talking about what makes the human 610 00:31:58,800 --> 00:32:01,960 Speaker 1: brain seemingly special. Yeah, that's interesting, I mean, and 611 00:32:02,000 --> 00:32:04,320 Speaker 1: that's where it ties into her larger theory about what 612 00:32:04,440 --> 00:32:08,280 Speaker 1: it is that's special about human brains. So obviously, neuroanatomists 613 00:32:08,280 --> 00:32:11,280 Speaker 1: and other scientists have long debated this question: what 614 00:32:11,440 --> 00:32:14,600 Speaker 1: makes human brains unique? Why are we the only 615 00:32:14,640 --> 00:32:19,080 Speaker 1: ones with computers? You know, why don't rabbits have computers 616 00:32:19,120 --> 00:32:22,080 Speaker 1: and all that? And so what is the physical 617 00:32:22,120 --> 00:32:25,400 Speaker 1: property that gives the human brain its power? Is it its size? 618 00:32:25,640 --> 00:32:28,400 Speaker 1: I mean, that clearly doesn't make any sense, because sperm 619 00:32:28,440 --> 00:32:31,080 Speaker 1: whales have brains that are something like, I think, 620 00:32:31,080 --> 00:32:34,200 Speaker 1: at least six or seven times larger than human brains, 621 00:32:34,480 --> 00:32:37,800 Speaker 1: and yet they don't seem to be smarter than us.
Likewise, 622 00:32:37,800 --> 00:32:40,200 Speaker 1: the elephant brain is much bigger. Exactly, and we're 623 00:32:40,200 --> 00:32:43,800 Speaker 1: not discounting the intelligence of whales and elephants, but 624 00:32:43,800 --> 00:32:47,640 Speaker 1: clearly their neural power is not on the 625 00:32:47,680 --> 00:32:50,520 Speaker 1: same scale as human neural power, and we have to 626 00:32:50,520 --> 00:32:53,080 Speaker 1: ask the question why. Right. So we're able to do 627 00:32:53,160 --> 00:32:55,640 Speaker 1: things that they're not able to do. And is that 628 00:32:55,760 --> 00:32:58,760 Speaker 1: difference related to sheer size of the brain? Clearly not. 629 00:32:59,360 --> 00:33:01,720 Speaker 1: Could it be, and another thing that's often put forward 630 00:33:01,800 --> 00:33:05,640 Speaker 1: is, the ratio of brain size to body size, right? 631 00:33:05,680 --> 00:33:09,640 Speaker 1: The encephalization quotient? That actually doesn't seem to quite cover 632 00:33:09,720 --> 00:33:12,000 Speaker 1: it either. Right. So one of the things she points 633 00:33:12,040 --> 00:33:14,800 Speaker 1: out here is that we'll just consider a couple of 634 00:33:14,880 --> 00:33:19,360 Speaker 1: these neuron counts again: humans, eighty-six billion neurons. 635 00:33:19,400 --> 00:33:21,880 Speaker 1: And then we have something like a rat, two 636 00:33:21,920 --> 00:33:25,840 Speaker 1: hundred million neurons. And an agouti, which is a 637 00:33:25,840 --> 00:33:29,320 Speaker 1: South American, like, rather plump kind of critter, but it 638 00:33:29,440 --> 00:33:33,040 Speaker 1: is a rodent: eight hundred fifty-seven million neurons. Owl 639 00:33:33,120 --> 00:33:36,960 Speaker 1: monkeys have one thousand, four hundred sixty-eight million neurons, 640 00:33:36,960 --> 00:33:40,120 Speaker 1: while a capybara, again a rodent, one thousand, six 641 00:33:40,200 --> 00:33:43,480 Speaker 1: hundred million neurons.
Okay, so they're up into the billion range, 642 00:33:43,520 --> 00:33:47,920 Speaker 1: but the difference here is still orders of magnitude of difference, right. 643 00:33:48,000 --> 00:33:50,760 Speaker 1: And her argument is, like, when you compare 644 00:33:50,760 --> 00:33:55,680 Speaker 1: like-sized rodents and primates, the primate brains, you know, 645 00:33:55,760 --> 00:33:58,480 Speaker 1: may be the same size, but there are more neurons in 646 00:33:58,520 --> 00:34:01,480 Speaker 1: the primate brains. Right. So what seems to 647 00:34:01,480 --> 00:34:03,640 Speaker 1: be going on here is that it's not so much 648 00:34:03,720 --> 00:34:06,880 Speaker 1: that there's something really special about human brains, but there's 649 00:34:06,880 --> 00:34:11,680 Speaker 1: something special about primate brains. Primates, the brains of monkeys 650 00:34:11,760 --> 00:34:15,080 Speaker 1: and apes, you know, the primate creatures like us: 651 00:34:15,200 --> 00:34:20,239 Speaker 1: they have these densely packed, neuron-heavy cortices. Yeah, 652 00:34:20,280 --> 00:34:22,239 Speaker 1: and then it comes down to concentration of neurons. And 653 00:34:22,280 --> 00:34:25,040 Speaker 1: by the way, she also has found that birds have 654 00:34:25,200 --> 00:34:28,600 Speaker 1: primate-like concentrations in the forebrain, which lines up 655 00:34:28,640 --> 00:34:32,319 Speaker 1: with their intelligence despite much smaller brains than other 656 00:34:32,360 --> 00:34:35,520 Speaker 1: intelligent animals. Yeah, we mentioned this before, but this really 657 00:34:35,520 --> 00:34:39,960 Speaker 1: comes through in the sometimes startling intelligence of birds like 658 00:34:40,000 --> 00:34:42,600 Speaker 1: corvids and parrots. Yeah. You look at a parrot, like, 659 00:34:42,600 --> 00:34:44,360 Speaker 1: how big is its brain?
Right, It's like, you know, 660 00:34:44,719 --> 00:34:48,000 Speaker 1: you you snack on nuts larger than this creature's brain, 661 00:34:48,080 --> 00:34:52,160 Speaker 1: and yet it has this startling intelligence, thanks to the efficiently 662 00:34:52,239 --> 00:34:57,080 Speaker 1: packed, neuron crammed forebrain, sort of like a primate. Yeah. Yeah. Meanwhile, 663 00:34:57,160 --> 00:34:59,839 Speaker 1: whale brain, you could climb inside of it, and it's 664 00:35:00,560 --> 00:35:03,120 Speaker 1: and and it's it's it's not quite there. All right. 665 00:35:03,160 --> 00:35:04,640 Speaker 1: On that note, we're gonna take a quick break, and 666 00:35:04,640 --> 00:35:07,239 Speaker 1: when we come back, we're gonna explore more of Suzana 667 00:35:07,600 --> 00:35:11,719 Speaker 1: Herculano-Houzel's ideas about the human brain, how it 668 00:35:11,719 --> 00:35:15,920 Speaker 1: has evolved and why it has so many neurons. Thank you, 669 00:35:16,120 --> 00:35:20,680 Speaker 1: thank you. All Right, we're back, so more about brains, alright. 670 00:35:20,719 --> 00:35:25,920 Speaker 1: So one of the big takeaways from Suzana Herculano- 671 00:35:26,000 --> 00:35:28,880 Speaker 1: Houzel's work is that, of course, a big brain doesn't 672 00:35:28,880 --> 00:35:32,720 Speaker 1: necessarily mean high levels of cognition. Again, it's more about 673 00:35:32,920 --> 00:35:35,480 Speaker 1: the neuron concentration. Yeah, this is what we were just 674 00:35:35,480 --> 00:35:38,320 Speaker 1: talking about with the fact that obviously, you know, like whales, elephants, 675 00:35:38,320 --> 00:35:40,800 Speaker 1: all these have much bigger brains than us. But there's 676 00:35:40,800 --> 00:35:43,040 Speaker 1: a lot that our brains seem able to do that 677 00:35:43,120 --> 00:35:46,399 Speaker 1: those brains can't. So it's not just size.
There does 678 00:35:46,400 --> 00:35:50,400 Speaker 1: seem to be something special about the way that primate 679 00:35:50,600 --> 00:35:55,040 Speaker 1: brains are organized. And that's the thing, primate brains, not 680 00:35:55,040 --> 00:35:59,320 Speaker 1: not just human brains, because that's something that Herculano-Houzel 681 00:35:59,640 --> 00:36:03,399 Speaker 1: really drives home, is that the human brain is just 682 00:36:03,560 --> 00:36:07,040 Speaker 1: a scaled up primate brain. There's nothing special about the 683 00:36:07,120 --> 00:36:11,600 Speaker 1: human brain beyond that, nothing God touched or anything of 684 00:36:11,640 --> 00:36:13,920 Speaker 1: the sort. She says that this is very much in 685 00:36:14,000 --> 00:36:17,400 Speaker 1: line with what Darwin thought and was criticized for. Darwin 686 00:36:17,480 --> 00:36:21,040 Speaker 1: thought the human brain was just another primate brain, but 687 00:36:21,080 --> 00:36:23,080 Speaker 1: those who came after him, they often took their evolution 688 00:36:23,080 --> 00:36:26,880 Speaker 1: with a hefty dose of human exceptionalism, wanting to think 689 00:36:26,920 --> 00:36:29,600 Speaker 1: about us as a special case that the rules of 690 00:36:29,640 --> 00:36:32,399 Speaker 1: life don't apply to like they apply to every other 691 00:36:32,440 --> 00:36:36,600 Speaker 1: animal on Earth. Right and likewise, um, Herculano-Houzel says 692 00:36:36,600 --> 00:36:38,719 Speaker 1: that she faced resistance to her results due to the 693 00:36:38,760 --> 00:36:40,959 Speaker 1: fact that they didn't support this view of human brain 694 00:36:41,040 --> 00:36:45,000 Speaker 1: exceptionalism versus the brains of other primates. Again, it's just 695 00:36:45,080 --> 00:36:47,920 Speaker 1: a scaled up a scaled up primate brain, right, So 696 00:36:48,000 --> 00:36:51,840 Speaker 1: her idea, what she argues, is that what the human 697 00:36:51,880 --> 00:36:55,280 Speaker 1: brain is.
You start with a primate brain, which primate brains, 698 00:36:55,320 --> 00:36:58,719 Speaker 1: like brains of monkeys and apes in general, are more 699 00:36:58,800 --> 00:37:01,440 Speaker 1: crammed with neurons than other types of brains in the animal 700 00:37:01,520 --> 00:37:04,000 Speaker 1: kingdom usually, and then you look at all the primates, 701 00:37:04,040 --> 00:37:07,520 Speaker 1: and humans have by far the biggest primate brain. And 702 00:37:07,560 --> 00:37:10,760 Speaker 1: then because it's the biggest version of this densely packed 703 00:37:11,280 --> 00:37:16,040 Speaker 1: primate neuron housing center, we are the smartest. That's essentially 704 00:37:16,040 --> 00:37:18,600 Speaker 1: what makes us the smartest. And we'll get into why 705 00:37:18,640 --> 00:37:21,480 Speaker 1: this seems to be the case and why, um, the 706 00:37:21,480 --> 00:37:25,440 Speaker 1: the argument that Suzana Herculano-Houzel makes for it as well. Uh, 707 00:37:25,480 --> 00:37:26,960 Speaker 1: but before we do that, I want to I want 708 00:37:27,000 --> 00:37:28,960 Speaker 1: to touch on another area that she lines up 709 00:37:28,960 --> 00:37:31,400 Speaker 1: with all of this, and that is the subject of 710 00:37:31,440 --> 00:37:36,680 Speaker 1: longevity in warm blooded vertebrates anyway, um, because it seems 711 00:37:36,880 --> 00:37:39,600 Speaker 1: to be the case that neurons are more important than 712 00:37:39,640 --> 00:37:43,400 Speaker 1: body size here as well.
So in two thousand eighteen 713 00:37:43,440 --> 00:37:48,120 Speaker 1: research that was published in the Journal of Comparative Neurology, um, 714 00:37:48,200 --> 00:37:52,000 Speaker 1: Herculano-Houzel and co authors found that in primates, birds, 715 00:37:52,000 --> 00:37:54,799 Speaker 1: and other warm blooded creatures, the number of neurons that 716 00:37:54,880 --> 00:37:57,239 Speaker 1: you have in the core in the cortex of a 717 00:37:57,320 --> 00:38:01,360 Speaker 1: species predicts about seventy five percent of all the variation 718 00:38:01,760 --> 00:38:05,920 Speaker 1: in the longevity across species. Body size and metabolism are the 719 00:38:06,000 --> 00:38:08,000 Speaker 1: usual standard because usually that's what we think of, right, 720 00:38:08,040 --> 00:38:10,799 Speaker 1: an elephant lives a while, it's big, whales live a 721 00:38:10,800 --> 00:38:13,960 Speaker 1: while, big, right. She says that this only predicts around 722 00:38:13,960 --> 00:38:20,000 Speaker 1: twenty percent of the variation in longevity. And here's 723 00:38:20,000 --> 00:38:23,880 Speaker 1: another slice of human non exceptionalism. If you do the math, 724 00:38:24,160 --> 00:38:26,680 Speaker 1: she says, this means that humans live about as long 725 00:38:26,719 --> 00:38:31,160 Speaker 1: as you expect based on their neuron load. This 726 00:38:31,200 --> 00:38:34,240 Speaker 1: particular brain soup study examined more than seven hundred warm 727 00:38:34,280 --> 00:38:36,960 Speaker 1: blooded animal species. I want to see a list of 728 00:38:37,000 --> 00:38:41,799 Speaker 1: those soup flavors. Yes, the soup menu, it's quite 729 00:38:41,800 --> 00:38:45,000 Speaker 1: extensive at her lab. Now, why is there a connection 730 00:38:45,040 --> 00:38:48,320 Speaker 1: between neurons in the cerebral cortex and longevity?
Well, she 731 00:38:48,360 --> 00:38:50,319 Speaker 1: says more work is required to figure this out, but 732 00:38:50,320 --> 00:38:52,239 Speaker 1: she does have some ideas. And here's what she said 733 00:38:52,239 --> 00:38:55,560 Speaker 1: in a press release on the paper, quote: The 734 00:38:55,640 --> 00:38:59,520 Speaker 1: data suggests that warm blooded species accumulate damages at the 735 00:38:59,560 --> 00:39:02,480 Speaker 1: same rate as they age, but what curtails life are 736 00:39:02,560 --> 00:39:05,720 Speaker 1: damages to the cerebral cortex, not the rest of the body. 737 00:39:05,960 --> 00:39:08,239 Speaker 1: The more cortical neurons you have, the longer you will 738 00:39:08,239 --> 00:39:11,000 Speaker 1: still have enough to keep your body functional. The cortex 739 00:39:11,120 --> 00:39:12,960 Speaker 1: is the part of your brain that is capable of 740 00:39:12,960 --> 00:39:16,640 Speaker 1: making our behavior complex and flexible. Yes, but that extends 741 00:39:16,640 --> 00:39:20,120 Speaker 1: well beyond cognition and doing mental math and logical reasoning. 742 00:39:20,440 --> 00:39:23,840 Speaker 1: The cerebral cortex also gives your body adaptability as it 743 00:39:23,880 --> 00:39:27,120 Speaker 1: adjusts and learns how to react to stresses and predicts them. 744 00:39:27,280 --> 00:39:30,920 Speaker 1: That includes keeping your physiological functions running smoothly and making 745 00:39:30,960 --> 00:39:34,200 Speaker 1: sure your heart rate, your respiratory rate, and your metabolism 746 00:39:34,239 --> 00:39:38,080 Speaker 1: are on track with what you're doing and how you feel, uh, 747 00:39:38,120 --> 00:39:40,640 Speaker 1: and with what you expect to happen next. And that 748 00:39:40,719 --> 00:39:45,400 Speaker 1: apparently is a key factor that impacts longevity. Fascinating.
Yeah, 749 00:39:45,480 --> 00:39:48,600 Speaker 1: so I love how already just this this concept of 750 00:39:48,600 --> 00:39:50,759 Speaker 1: of brain, not this concept of brain soup, but this 751 00:39:50,840 --> 00:39:54,640 Speaker 1: tool of making brain soup. It already, you know, is 752 00:39:54,680 --> 00:39:57,640 Speaker 1: turning certain things that, uh, we used to think we 753 00:39:57,719 --> 00:40:00,640 Speaker 1: knew about brains and the human brain and how it's 754 00:40:00,640 --> 00:40:03,879 Speaker 1: different from other animals and indeed how how other animals think, 755 00:40:04,160 --> 00:40:06,000 Speaker 1: turning all of that on its head and giving us 756 00:40:06,080 --> 00:40:08,520 Speaker 1: another way of thinking about it. Yeah. So one thing 757 00:40:08,640 --> 00:40:13,080 Speaker 1: that this would, obviously, I think have somebody wondering about, is, Okay, 758 00:40:13,320 --> 00:40:17,560 Speaker 1: if if humans are the upper end of the primate brain, 759 00:40:17,680 --> 00:40:19,879 Speaker 1: you know, with your neuronal loads, you've got a lot 760 00:40:19,920 --> 00:40:22,160 Speaker 1: of neurons in the human human brain, how did we 761 00:40:22,200 --> 00:40:25,040 Speaker 1: get that way? How did it happen? Because again, we can't 762 00:40:25,040 --> 00:40:28,760 Speaker 1: go with any mythic interpretation here. There's no ancient 763 00:40:28,760 --> 00:40:33,319 Speaker 1: alien or gods scenario where where this particular primate was 764 00:40:33,719 --> 00:40:36,440 Speaker 1: was touched and made special. No, we need to look 765 00:40:36,480 --> 00:40:40,360 Speaker 1: to something in the natural world, something that that that 766 00:40:40,480 --> 00:40:45,160 Speaker 1: explains this, like, rapid growth of the brain. And, uh, Herculano- 767 00:40:45,280 --> 00:40:49,960 Speaker 1: Houzel's hypothesis has to do with straightforward energy economy, the 768 00:40:50,120 --> 00:40:53,960 Speaker 1: energy that different species are able to take into their bodies. Right. 769 00:40:54,000 --> 00:40:55,680 Speaker 1: She says that she thinks it all comes down to 770 00:40:55,760 --> 00:41:00,680 Speaker 1: the calories amassed via a very early technological development 771 00:41:00,840 --> 00:41:04,040 Speaker 1: that our ancestors made, that being cooking. Now, this is 772 00:41:04,080 --> 00:41:07,040 Speaker 1: an interesting hypothesis. I like this. Yeah, we know, we've 773 00:41:07,080 --> 00:41:11,000 Speaker 1: discussed the essential role that cooking played in human advancement before, 774 00:41:11,360 --> 00:41:16,000 Speaker 1: how it externalized digestion, allowed you to, uh, sort of 775 00:41:16,520 --> 00:41:18,399 Speaker 1: to, you know, heat up this pot, or not even 776 00:41:18,400 --> 00:41:20,800 Speaker 1: necessarily a pot, just, you know, a pit with fire 777 00:41:21,200 --> 00:41:27,640 Speaker 1: even, and partially digest various organic matter before you give 778 00:41:27,719 --> 00:41:31,040 Speaker 1: your own digestive system a shot at it. 779 00:41:31,600 --> 00:41:34,239 Speaker 1: This makes it easier to digest, you know, a lot 780 00:41:34,280 --> 00:41:39,000 Speaker 1: of foods that would be impossible or difficult to consume otherwise, 781 00:41:39,320 --> 00:41:42,560 Speaker 1: either because of their chemical components, or maybe they're just 782 00:41:42,680 --> 00:41:45,719 Speaker 1: tough, they're hard to chew, or or or they can't 783 00:41:45,719 --> 00:41:47,840 Speaker 1: be chewed, and cooking can make them softer, and of 784 00:41:47,880 --> 00:41:51,560 Speaker 1: course just speed up overall digestion.
Yeah, it just gives 785 00:41:51,600 --> 00:41:54,800 Speaker 1: your digestive system much more access to the nutrients inside 786 00:41:54,840 --> 00:41:58,399 Speaker 1: through a variety of means. Yeah. And this ties 787 00:41:58,440 --> 00:42:02,799 Speaker 1: into neurons, because neurons require energy, a lot of energy actually. Yeah. 788 00:42:02,800 --> 00:42:05,120 Speaker 1: And the more neurons you have, the more energy 789 00:42:05,280 --> 00:42:09,719 Speaker 1: you require to charge them up. Like your brain consumes 790 00:42:09,920 --> 00:42:13,000 Speaker 1: energy at a rate that is not proportional to its 791 00:42:13,040 --> 00:42:15,359 Speaker 1: size relative to the rest of your body. It is 792 00:42:15,440 --> 00:42:19,080 Speaker 1: the the great energy hog of your body. Yeah, we're talking 793 00:42:19,480 --> 00:42:23,960 Speaker 1: neurons require six kilocalories per billion neurons per day, 794 00:42:24,080 --> 00:42:28,360 Speaker 1: so, um, Herculano-Houzel points out that the human brain 795 00:42:28,520 --> 00:42:33,160 Speaker 1: costs on average five hundred kilocalories per day, so, in 796 00:42:33,200 --> 00:42:37,440 Speaker 1: her words, not quite a whole hamburger. But but I mean, 797 00:42:37,640 --> 00:42:39,799 Speaker 1: when you think about that in terms of food, the 798 00:42:39,840 --> 00:42:43,320 Speaker 1: types of calories you can acquire in the wild, without grains, 799 00:42:43,360 --> 00:42:46,720 Speaker 1: without access to easy meat and all that, that's 800 00:42:47,600 --> 00:42:49,799 Speaker 1: that's a tough requirement. Yeah, let's take a moment to 801 00:42:49,880 --> 00:42:54,640 Speaker 1: just realize how how, how how amazing the hamburger is.
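[Editor's note: to make the back-of-envelope arithmetic above concrete, here is a minimal sketch using only the figures quoted in the episode — six kilocalories per billion neurons per day, roughly eighty-six billion neurons in a human brain, and the typical two-thousand-calorie daily diet the hosts mention. The variable names are illustrative, not from any source.]

```python
# Back-of-envelope brain energy budget, using only figures quoted in the episode.
kcal_per_billion_neurons_per_day = 6   # Herculano-Houzel's quoted per-neuron cost
human_neurons_in_billions = 86         # ~86 billion neurons in the human brain

# Daily energy cost of the whole brain.
brain_kcal_per_day = kcal_per_billion_neurons_per_day * human_neurons_in_billions
print(brain_kcal_per_day)  # 516 -- "not quite a whole hamburger"

# Share of a typical daily diet that the brain alone consumes.
daily_diet_kcal = 2000
share = 100 * brain_kcal_per_day / daily_diet_kcal
print(round(share))  # 26 -- close to the ~25 percent figure cited in the episode
```

So the two numbers the hosts cite (about five hundred kilocalories and about a quarter of the body's energy budget) are consistent with each other.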
802 00:42:55,000 --> 00:42:56,919 Speaker 1: I mean, and just in terms of like how much 803 00:42:57,400 --> 00:43:01,120 Speaker 1: uh, you know, protein and potential nutrition is just 804 00:43:01,239 --> 00:43:03,400 Speaker 1: jammed into that thing. There's a lot of energy in 805 00:43:03,400 --> 00:43:06,120 Speaker 1: the hamburger, like it or not. And we, you know, 806 00:43:06,320 --> 00:43:08,719 Speaker 1: we take for granted, you know, what a robust chunk 807 00:43:08,760 --> 00:43:10,920 Speaker 1: of energy that is. And if 808 00:43:10,960 --> 00:43:12,839 Speaker 1: we didn't have the hamburger and of course 809 00:43:12,880 --> 00:43:14,880 Speaker 1: all the things that are comparable to the hamburger in 810 00:43:15,239 --> 00:43:18,720 Speaker 1: modern culinary tradition, and if we didn't have that 811 00:43:18,719 --> 00:43:20,880 Speaker 1: to eat, if we had to eat like our primate 812 00:43:20,880 --> 00:43:25,960 Speaker 1: brethren did and still do, without cooking and modern food, uh, 813 00:43:26,120 --> 00:43:28,360 Speaker 1: she points out that we'd have to spend nine point 814 00:43:28,440 --> 00:43:31,480 Speaker 1: five hours every day eating. So you're 815 00:43:31,480 --> 00:43:34,160 Speaker 1: just eating like raw vegetable matter most all day, like 816 00:43:34,719 --> 00:43:36,840 Speaker 1: a lot of animals do in the world. If you, 817 00:43:37,040 --> 00:43:39,560 Speaker 1: if you, like, like we do, watch a lot of documentaries, 818 00:43:39,760 --> 00:43:43,520 Speaker 1: you'll frequently encounter, uh, you know, some animal or 819 00:43:43,520 --> 00:43:46,279 Speaker 1: another, a panda, and you're like, oh, all it does 820 00:43:46,360 --> 00:43:48,640 Speaker 1: is eat because it has to.
It has to eat 821 00:43:48,719 --> 00:43:52,400 Speaker 1: all the time, you know, to maintain, um, uh, this 822 00:43:52,400 --> 00:43:54,799 Speaker 1: this body and ultimately its brain as well. But of 823 00:43:54,840 --> 00:43:58,960 Speaker 1: course our brain again has tremendous energy requirements. It only 824 00:43:59,000 --> 00:44:01,200 Speaker 1: makes up two percent of our body, but it requires 825 00:44:01,239 --> 00:44:04,680 Speaker 1: twenty five percent of the body's energy, so that five hundred calorie 826 00:44:04,719 --> 00:44:06,920 Speaker 1: burger is just part of a two thousand calorie per 827 00:44:07,000 --> 00:44:12,560 Speaker 1: day diet. So Herculano-Houzel contends that without cooking, we'd 828 00:44:12,560 --> 00:44:15,560 Speaker 1: be in the same state we'd evolved to one point 829 00:44:15,560 --> 00:44:19,360 Speaker 1: five million years ago. We'd be small primates with essentially 830 00:44:19,400 --> 00:44:22,040 Speaker 1: the brain power of a modern gorilla, but with some 831 00:44:22,120 --> 00:44:25,719 Speaker 1: stone tool making abilities. This is kind of encouraging as 832 00:44:25,760 --> 00:44:29,000 Speaker 1: a as a hypothesis for people who love cooking, kitchen 833 00:44:29,200 --> 00:44:33,680 Speaker 1: and culinary enthusiasts. You you may well be taking part in 834 00:44:33,920 --> 00:44:37,600 Speaker 1: the most crucially human of all activities. Yeah, and I 835 00:44:37,600 --> 00:44:41,000 Speaker 1: believe Michael Pollan has made a very very similar argument 836 00:44:41,040 --> 00:44:43,319 Speaker 1: before in terms of like why we we we love 837 00:44:43,400 --> 00:44:45,800 Speaker 1: cooking and why even if we don't cook, we're drawn 838 00:44:45,880 --> 00:44:49,200 Speaker 1: to the myriad cooking shows that are out there.
You know, 839 00:44:49,239 --> 00:44:51,120 Speaker 1: we want to watch somebody cook, we want to learn 840 00:44:51,320 --> 00:44:54,920 Speaker 1: about about different culinary traditions. And even for my own part, 841 00:44:54,960 --> 00:44:57,359 Speaker 1: I've never been much of a chef. I am not 842 00:44:57,600 --> 00:45:00,840 Speaker 1: much of a chef either, but I've been enjoying some 843 00:45:00,920 --> 00:45:04,439 Speaker 1: of these various like meal box things recently, and it's 844 00:45:04,600 --> 00:45:08,360 Speaker 1: teaching me some of the basics of cooking. And you know, 845 00:45:08,400 --> 00:45:12,360 Speaker 1: I'll still curse at a tomato, but there's something fulfilling 846 00:45:12,360 --> 00:45:15,200 Speaker 1: about going through all the instructions and and turning this 847 00:45:15,719 --> 00:45:21,040 Speaker 1: sack of raw materials into a delicious meal. Well, uh, 848 00:45:21,200 --> 00:45:26,160 Speaker 1: tomatoes are tricky devils. They can be. Seriously, they're 849 00:45:26,239 --> 00:45:28,920 Speaker 1: they're at both ends of my, you know, food love 850 00:45:28,960 --> 00:45:32,200 Speaker 1: and hate spectrum. Basically, my favorite food in the entire 851 00:45:32,280 --> 00:45:36,759 Speaker 1: world is a really good ripe summer tomato, and my 852 00:45:37,000 --> 00:45:39,640 Speaker 1: least favorite food in the entire world is like a 853 00:45:39,760 --> 00:45:43,520 Speaker 1: mealy white winter tomato. It's the worst thing on earth. Yeah, 854 00:45:43,560 --> 00:45:45,400 Speaker 1: I feel a similar way 855 00:45:45,400 --> 00:45:48,280 Speaker 1: with cantaloupe. Like there are a lot of mediocre cantaloupes 856 00:45:48,320 --> 00:45:51,000 Speaker 1: out there, and if you have one, you could easily 857 00:45:51,040 --> 00:45:53,160 Speaker 1: turn your back on cantaloupes forever.
But when you 858 00:45:53,239 --> 00:45:56,719 Speaker 1: have like a really good cantaloupe, there's nothing else that 859 00:45:56,800 --> 00:45:58,600 Speaker 1: can beat it. Well, it's funny, we talked about these 860 00:45:58,600 --> 00:46:04,040 Speaker 1: extremes with agriculturally produced fruits and vegetables. Herculano-Houzel points out, 861 00:46:04,040 --> 00:46:06,520 Speaker 1: in somewhere I was reading, or one of her talks, 862 00:46:06,560 --> 00:46:08,360 Speaker 1: I think, where she says, you know, it's kind of 863 00:46:08,360 --> 00:46:11,240 Speaker 1: funny that the inherent logic of of what she's showing 864 00:46:11,280 --> 00:46:15,520 Speaker 1: here is even there in some of the diet trends, 865 00:46:15,560 --> 00:46:17,480 Speaker 1: where like what do people do when they want to 866 00:46:17,520 --> 00:46:20,400 Speaker 1: lose weight? Well, one popular thing is the raw food diet, 867 00:46:20,440 --> 00:46:23,920 Speaker 1: because suddenly you're condemning yourself to a foraging type existence 868 00:46:23,960 --> 00:46:28,279 Speaker 1: without, you know, without the calorie benefits of cooked food. Yeah, exactly. 869 00:46:29,400 --> 00:46:32,480 Speaker 1: But cooking food is something we did 870 00:46:32,520 --> 00:46:35,120 Speaker 1: develop again, and that was roughly one point five million 871 00:46:35,160 --> 00:46:38,719 Speaker 1: years ago. Um, and this is the key technology, she says. 872 00:46:38,760 --> 00:46:42,439 Speaker 1: It changed what was possible energy wise for the evolving brain. 873 00:46:42,520 --> 00:46:45,520 Speaker 1: Our brains got big in a hurry after this, and 874 00:46:45,560 --> 00:46:48,560 Speaker 1: our food technology, of course, continued to evolve, most notably 875 00:46:48,640 --> 00:46:53,040 Speaker 1: via the agricultural revolution. Definitely.
I mean that the history 876 00:46:53,120 --> 00:46:56,200 Speaker 1: of human civilization is the history of our 877 00:46:57,000 --> 00:47:00,120 Speaker 1: manipulation of food, really, and our stockpiling of food, 878 00:47:00,160 --> 00:47:02,520 Speaker 1: and then our trade of food and our wars for food. 879 00:47:02,800 --> 00:47:07,719 Speaker 1: So we are using research that makes use of the 880 00:47:07,760 --> 00:47:13,320 Speaker 1: brain soup technique to discover how 881 00:47:13,360 --> 00:47:17,640 Speaker 1: important the invention of literal soup might have been. Yeah, 882 00:47:17,680 --> 00:47:21,080 Speaker 1: I love the cyclical nature of this particular episode. We 883 00:47:21,160 --> 00:47:24,000 Speaker 1: started with with the brain soup, and we came back 884 00:47:24,040 --> 00:47:29,560 Speaker 1: to brain soup. It's glorious. Okay. Introducing a new show segment, 885 00:47:29,760 --> 00:47:34,520 Speaker 1: Soup Facts, Facts, Facts, with Joe McCormick. Here's something. If 886 00:47:34,560 --> 00:47:37,080 Speaker 1: you're ever trying to figure out how much to season 887 00:47:37,200 --> 00:47:38,960 Speaker 1: your soup. You know you don't want it to be 888 00:47:38,960 --> 00:47:41,360 Speaker 1: too salty, of course, but you also don't want to 889 00:47:41,440 --> 00:47:43,560 Speaker 1: underseason it. How much salt should go in a soup? 890 00:47:43,840 --> 00:47:46,520 Speaker 1: The best way to figure that out is to get 891 00:47:46,560 --> 00:47:49,719 Speaker 1: the soup to the temperature that you plan to serve 892 00:47:49,760 --> 00:47:52,520 Speaker 1: it at, and then taste and see how much salt 893 00:47:52,520 --> 00:47:54,640 Speaker 1: it needs there. Because the amount of salt that we 894 00:47:54,680 --> 00:47:58,279 Speaker 1: can taste when we taste something for seasoning varies drastically 895 00:47:58,400 --> 00:48:01,200 Speaker 1: depending on what temperature the food is at.
So 896 00:48:01,400 --> 00:48:04,000 Speaker 1: you might you might taste it and it tastes like 897 00:48:04,040 --> 00:48:06,920 Speaker 1: it needs more salt, or it already has enough salt 898 00:48:07,120 --> 00:48:09,440 Speaker 1: at one temperature, but at a different temperature it might 899 00:48:09,440 --> 00:48:12,840 Speaker 1: taste totally different. Interesting. I hadn't thought about that. I 900 00:48:12,840 --> 00:48:15,440 Speaker 1: will use that the next time I'm I'm I'm following 901 00:48:15,440 --> 00:48:18,200 Speaker 1: instructions from a box and have to make soup, all right. 902 00:48:19,680 --> 00:48:22,720 Speaker 1: Another place I always mess up with salt is with pasta. 903 00:48:23,120 --> 00:48:26,919 Speaker 1: If I'm boiling pasta, like, I keep putting in too little salt, 904 00:48:26,960 --> 00:48:30,200 Speaker 1: like I'm so afraid of oversalting something. Well, then 905 00:48:30,239 --> 00:48:32,600 Speaker 1: and then my my wife will come in and say, oh, 906 00:48:32,640 --> 00:48:34,360 Speaker 1: you're supposed to be throwing like a whole fistful 907 00:48:34,360 --> 00:48:36,960 Speaker 1: of salt in there. Basically, it's a reasonable concern. I mean, 908 00:48:37,480 --> 00:48:39,920 Speaker 1: if you undersalt something, you can always 909 00:48:39,960 --> 00:48:42,759 Speaker 1: add more. But if you oversalt, you can't take 910 00:48:42,760 --> 00:48:44,799 Speaker 1: it out, right. Well, you know, I think back to 911 00:48:44,880 --> 00:48:46,960 Speaker 1: the cocktail scenario. Like if I 912 00:48:46,960 --> 00:48:49,000 Speaker 1: mess up a cocktail to the point where I can't 913 00:48:49,040 --> 00:48:53,200 Speaker 1: fix it, um, then I've wasted like two drinks, you know.
914 00:48:53,560 --> 00:48:56,040 Speaker 1: But if I do that with a soup or stew and 915 00:48:56,080 --> 00:48:58,439 Speaker 1: I've ruined dinner, like, I have to then go and 916 00:48:58,480 --> 00:49:00,400 Speaker 1: get a pizza or something, and I've wasted all 917 00:49:00,400 --> 00:49:03,040 Speaker 1: of these resources. And you know, the most heartbreaking thing 918 00:49:03,040 --> 00:49:05,600 Speaker 1: you can do is have to potentially throw food away. 919 00:49:05,719 --> 00:49:08,440 Speaker 1: And I have I have oversalted 920 00:49:08,520 --> 00:49:11,040 Speaker 1: things in the past to that point, you know, where 921 00:49:11,040 --> 00:49:14,000 Speaker 1: it's just basically inedible. There was some recipe, 922 00:49:14,040 --> 00:49:16,160 Speaker 1: and this was like a box meal years 923 00:49:16,160 --> 00:49:18,520 Speaker 1: ago when I was first figuring it out, and I 924 00:49:18,560 --> 00:49:20,240 Speaker 1: had like a thing of salt and a thing of sugar, 925 00:49:20,280 --> 00:49:22,560 Speaker 1: and the sugar went into one component, salt into the other. 926 00:49:23,239 --> 00:49:26,359 Speaker 1: I flipped them, which made like a pretty sweet cole 927 00:49:26,440 --> 00:49:28,799 Speaker 1: slaw, and whatever the other thing was 928 00:49:28,800 --> 00:49:33,880 Speaker 1: was just inedible. That's a sad story. All right, so 929 00:49:33,920 --> 00:49:37,719 Speaker 1: I guess we're gonna leave it there. Uh, again, Suzana 930 00:49:37,800 --> 00:49:42,279 Speaker 1: Herculano-Houzel, wonderful science communicator. Uh, look her up. She's all 931 00:49:42,320 --> 00:49:45,440 Speaker 1: over the internet. You can find her TED talk easily, 932 00:49:45,840 --> 00:49:48,160 Speaker 1: and there are also various interviews with her. And then of 933 00:49:48,200 --> 00:49:51,200 Speaker 1: course the scientific papers are out there to look at 934 00:49:51,280 --> 00:49:55,000 Speaker 1: as well. Uh.
So also thanks to the World Science 935 00:49:55,040 --> 00:49:58,080 Speaker 1: Festival, um, for letting me attend and, uh, you know, 936 00:49:58,160 --> 00:50:00,680 Speaker 1: taking in all this data. Look up the World Science 937 00:50:00,719 --> 00:50:04,080 Speaker 1: Festival as well. Uh. They have a wonderful YouTube page. 938 00:50:04,239 --> 00:50:06,680 Speaker 1: They'll put a lot of these talks up, um, you know, 939 00:50:06,960 --> 00:50:10,600 Speaker 1: over the course of the weeks and months ahead, moving 940 00:50:10,600 --> 00:50:12,640 Speaker 1: into next year. And in the meantime, if you want 941 00:50:12,640 --> 00:50:14,640 Speaker 1: to check out more episodes of Stuff to Blow Your Mind, 942 00:50:14,640 --> 00:50:16,239 Speaker 1: head on over to Stuff to Blow Your Mind dot com. 943 00:50:16,239 --> 00:50:17,879 Speaker 1: That's the mother ship, that's where you'll find them all. 944 00:50:18,120 --> 00:50:20,319 Speaker 1: And if you want to support our show, uh, we 945 00:50:20,400 --> 00:50:22,839 Speaker 1: commend you for wanting to do so. The best 946 00:50:22,840 --> 00:50:24,920 Speaker 1: thing you can do is to just rate and review 947 00:50:25,000 --> 00:50:26,560 Speaker 1: us wherever you have the power to do so. Wherever 948 00:50:26,600 --> 00:50:28,840 Speaker 1: you get this podcast, leave some 949 00:50:28,920 --> 00:50:30,919 Speaker 1: stars and a nice review, and of course just tell 950 00:50:30,960 --> 00:50:34,200 Speaker 1: people about us, uh, tell your friends, tell 951 00:50:34,239 --> 00:50:36,800 Speaker 1: your family, and then tell them how to find this show. 952 00:50:37,239 --> 00:50:41,240 Speaker 1: Huge thanks as always to our excellent audio producer, Tari Harrison.
953 00:50:41,560 --> 00:50:43,120 Speaker 1: If you would like to get in touch with us 954 00:50:43,120 --> 00:50:45,680 Speaker 1: with feedback on this episode or any other, to suggest 955 00:50:45,760 --> 00:50:47,959 Speaker 1: a topic or a guest for the future of the show, 956 00:50:48,320 --> 00:50:50,640 Speaker 1: or just to say hello, you can email us at 957 00:50:51,000 --> 00:51:03,560 Speaker 1: contact at stuff to Blow your Mind dot com. Stuff 958 00:51:03,600 --> 00:51:05,520 Speaker 1: to Blow Your Mind is a production of iHeartRadio's 959 00:51:05,520 --> 00:51:07,879 Speaker 1: How Stuff Works. For more podcasts from iHeartRadio, 960 00:51:07,960 --> 00:51:10,719 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you 961 00:51:10,719 --> 00:51:17,120 Speaker 1: listen to your favorite shows.