1 00:00:00,200 --> 00:00:27,600 Speaker 1: Ridiculous History is a production of iHeartRadio. Welcome back to 2 00:00:27,640 --> 00:00:31,160 Speaker 1: the show. Thank you, as always, so much, Ridiculous Historians, 3 00:00:31,200 --> 00:00:33,960 Speaker 1: for joining us. Let's hear it for the man, the 4 00:00:34,000 --> 00:00:40,599 Speaker 1: myth, our super producer, mister Max Williams. Imax. All right, yeah, 5 00:00:40,680 --> 00:00:44,279 Speaker 1: we're getting some snapping fingers. This is part two of 6 00:00:44,320 --> 00:00:48,519 Speaker 1: a continuing exploration. The guy who just said Imax is 7 00:00:48,720 --> 00:00:51,080 Speaker 1: mister Noel Brown, none other than. 8 00:00:51,280 --> 00:00:53,880 Speaker 2: Oh thanks buddy, You're welcome, Buckeroo. 9 00:00:54,000 --> 00:00:57,000 Speaker 1: They call me Ben Bowlin in this part of the world, 10 00:00:57,280 --> 00:01:01,960 Speaker 1: and Noel, if we could do a "previously on" Ridiculous History, 11 00:01:05,280 --> 00:01:10,760 Speaker 1: the smoke Monster was bullsh wasn't anything, well, it was 12 00:01:10,760 --> 00:01:12,720 Speaker 1: the thing. It was the Man in Black, it was, 13 00:01:12,880 --> 00:01:16,920 Speaker 1: you know, that's the show Lost, which, dated references. 14 00:01:17,760 --> 00:01:19,839 Speaker 2: But that's what we're here for. No, it's true. 15 00:01:19,880 --> 00:01:24,080 Speaker 3: We did previously on Ridiculous History talk with the brilliant 16 00:01:24,160 --> 00:01:28,520 Speaker 3: and lovely doctor Jorge Cham, who helped us navigate 17 00:01:28,600 --> 00:01:31,319 Speaker 3: the universe a little bit, from the Big Bang to 18 00:01:31,640 --> 00:01:37,200 Speaker 3: the singularities to AI and the way that artificial intelligence 19 00:01:37,280 --> 00:01:40,560 Speaker 3: may well be close to, if not already at, a 20 00:01:40,600 --> 00:01:43,319 Speaker 3: place where it resembles the human brain.
21 00:01:44,200 --> 00:01:47,680 Speaker 1: Yes, that is true. And Noel, as you know, I'm 22 00:01:47,720 --> 00:01:52,080 Speaker 1: really proud of you and Max and myself for being quite 23 00:01:52,120 --> 00:01:57,240 Speaker 1: transparent and forthright with the fact that we are ourselves 24 00:01:57,720 --> 00:01:58,640 Speaker 1: not experts. 25 00:01:58,920 --> 00:01:59,280 Speaker 2: Folks. 26 00:02:00,400 --> 00:02:04,560 Speaker 1: We know everybody enjoyed the first part of our conversations 27 00:02:04,680 --> 00:02:07,840 Speaker 1: with doctor Jorge Cham, and believe it or not, 28 00:02:08,520 --> 00:02:13,920 Speaker 1: we were able to get Jorge back for another episode. 29 00:02:14,160 --> 00:02:16,920 Speaker 1: Jorge, welcome, thank you for joining us. 30 00:02:17,160 --> 00:02:20,239 Speaker 4: Hey, super fun to be here. Thanks for having me back. 31 00:02:20,840 --> 00:02:24,079 Speaker 3: Of course, of course, you're helping us out as well. 32 00:02:24,320 --> 00:02:26,400 Speaker 3: We love a good two-parter, especially one that we 33 00:02:26,480 --> 00:02:27,560 Speaker 3: decide on in advance. 34 00:02:27,639 --> 00:02:31,920 Speaker 4: Yeah, and this will be the Empire Strikes Back of podcasts. 35 00:02:31,320 --> 00:02:35,520 Speaker 2: Oh, that's the film. It's a universe. 36 00:02:42,200 --> 00:02:44,960 Speaker 1: We're talking about the history of what we call the 37 00:02:45,000 --> 00:02:50,320 Speaker 1: Big Bang in our previous exploration, which led us to, 38 00:02:50,560 --> 00:02:54,280 Speaker 1: as you were saying, Noel, an exploration of so-called 39 00:02:54,480 --> 00:02:59,720 Speaker 1: artificial intelligence, the nature of reality, the observable universe. And 40 00:02:59,760 --> 00:03:05,520 Speaker 1: we teased just a bit episode two.
This is our 41 00:03:05,800 --> 00:03:12,799 Speaker 1: exploration of the ridiculous history of a thing that studies 42 00:03:12,919 --> 00:03:17,200 Speaker 1: itself: the science of the human brain. So real quick, 43 00:03:17,360 --> 00:03:20,840 Speaker 1: if we could open it up this way, Jorge, 44 00:03:22,520 --> 00:03:25,400 Speaker 1: just small talk, what is the human brain? 45 00:03:28,280 --> 00:03:31,360 Speaker 4: It's this little, about a three-pound organ that you 46 00:03:31,360 --> 00:03:34,160 Speaker 4: have in your head. And it's just another organ in 47 00:03:34,200 --> 00:03:34,520 Speaker 4: your head. 48 00:03:34,560 --> 00:03:34,720 Speaker 1: You know. 49 00:03:34,760 --> 00:03:39,840 Speaker 4: It's a lump of fleshy stuff that basically makes who 50 00:03:39,920 --> 00:03:43,960 Speaker 4: you are happen, like your conscious experience, all your memories, 51 00:03:44,440 --> 00:03:48,520 Speaker 4: how you feel, that's all happening in that little gelatinous 52 00:03:48,560 --> 00:03:49,760 Speaker 4: blob inside your skull. 53 00:03:50,160 --> 00:03:53,400 Speaker 3: I've always felt calling the brain an organ was selling 54 00:03:53,400 --> 00:03:55,120 Speaker 3: it short just a little bit. When I think of 55 00:03:55,160 --> 00:03:57,600 Speaker 3: an organ, I guess I think of just like guts. 56 00:03:57,680 --> 00:04:01,200 Speaker 3: You know, it's important. But the brain is a very 57 00:04:01,240 --> 00:04:04,440 Speaker 3: peculiar and powerful and incredible organ. 58 00:04:04,440 --> 00:04:06,600 Speaker 4: It is, it is. People say it's the most complex 59 00:04:07,920 --> 00:04:10,840 Speaker 4: organization of matter that we know about in the whole universe. 60 00:04:11,160 --> 00:04:14,440 Speaker 2: And it's also the biggest erogenous zone. 61 00:04:14,280 --> 00:04:17,320 Speaker 4: In the human body. There we go. It's the biggest everything zone.
62 00:04:17,400 --> 00:04:21,479 Speaker 4: Basically hunger, hate, love, it's all happening in your head. 63 00:04:21,880 --> 00:04:25,240 Speaker 1: Now, this is fascinating. I love that phrase, because we 64 00:04:25,520 --> 00:04:31,320 Speaker 1: also have to acknowledge an inherent dilemma. We're asking a 65 00:04:31,400 --> 00:04:37,200 Speaker 1: thing to research and explain itself, to measure itself. Could 66 00:04:37,240 --> 00:04:43,680 Speaker 1: you tell us a little bit about the philosophical quandaries 67 00:04:43,839 --> 00:04:47,479 Speaker 1: involved with that? Like, because ordinarily, you know, if you 68 00:04:47,560 --> 00:04:53,080 Speaker 1: asked, if you asked a duck its opinion about ducks, 69 00:04:53,680 --> 00:04:58,599 Speaker 1: you would get a weird, you know, not completely objective answer. 70 00:04:58,320 --> 00:05:00,599 Speaker 4: Right, right, right, and everyone knows they're just a bunch 71 00:05:00,600 --> 00:05:11,600 Speaker 4: of quacks. Now, it's a super interesting philosophical conversation, right, 72 00:05:11,640 --> 00:05:15,200 Speaker 4: because, right, there's eighty six billion neurons in your head, 73 00:05:15,320 --> 00:05:18,159 Speaker 4: and so the idea that eighty six billion neurons could 74 00:05:18,240 --> 00:05:22,520 Speaker 4: ever really, really understand what's happening in eighty six billion neurons, 75 00:05:23,440 --> 00:05:26,200 Speaker 4: even if it's somebody else's eighty six billion neurons, it's sort of impossible, right? 76 00:05:26,200 --> 00:05:28,479 Speaker 4: It's sort of like a car understanding a car, or, 77 00:05:28,920 --> 00:05:32,680 Speaker 4: you know, a switch understanding a switch. So it's probably 78 00:05:32,760 --> 00:05:35,719 Speaker 4: not possible for the human brain to really understand everything 79 00:05:35,760 --> 00:05:38,080 Speaker 4: that's going on in your brain, like down to a tee.
80 00:05:38,760 --> 00:05:41,000 Speaker 4: But you know, we have science, and so we can 81 00:05:41,040 --> 00:05:44,560 Speaker 4: make kind of generalizations, we can make certain rules, we 82 00:05:44,560 --> 00:05:47,640 Speaker 4: can understand the general structure and organization of the brain, 83 00:05:48,080 --> 00:05:52,040 Speaker 4: but like totally understanding the brain and predicting what it's 84 00:05:52,040 --> 00:05:54,640 Speaker 4: going to do, it's probably impossible for our brains. But 85 00:05:54,680 --> 00:05:58,440 Speaker 4: you can imagine, like, maybe aliens who have four hundred 86 00:05:58,480 --> 00:06:01,000 Speaker 4: billion neurons, you know, to them, we might be like, oh, 87 00:06:01,040 --> 00:06:03,720 Speaker 4: look at these cats, you know, running around. We can 88 00:06:03,800 --> 00:06:06,000 Speaker 4: totally understand what's going on in their heads. 89 00:06:06,240 --> 00:06:11,440 Speaker 1: So this leads us to the ridiculous history of neuroscience, 90 00:06:11,640 --> 00:06:14,760 Speaker 1: or brain science. Could you tell us a little bit 91 00:06:14,880 --> 00:06:20,280 Speaker 1: about the first, I guess there was a moment in 92 00:06:20,440 --> 00:06:25,039 Speaker 1: human civilization, or a series of moments more accurately, wherein 93 00:06:25,320 --> 00:06:31,279 Speaker 1: people realized this gelatinous blob in their head was doing something. 94 00:06:31,480 --> 00:06:35,160 Speaker 1: We're under the impression that for a lot of human history, 95 00:06:36,520 --> 00:06:41,760 Speaker 1: various cultures believed that the soul or the consciousness maybe 96 00:06:41,880 --> 00:06:46,479 Speaker 1: resided in other organs, like the heart or the, I'll 97 00:06:46,520 --> 00:06:51,960 Speaker 1: say it, sorry, substitute teachers, genitalia. 98 00:06:52,520 --> 00:06:54,520 Speaker 4: Well, there is a lot of it. A lot of our 99 00:06:54,560 --> 00:06:55,599 Speaker 4: actions do come.
100 00:06:55,400 --> 00:06:59,560 Speaker 2: From there. It's like a second brain. 101 00:07:00,120 --> 00:07:06,440 Speaker 1: So was there any sort of inflection point or 102 00:07:06,760 --> 00:07:11,320 Speaker 1: crossroads in history of humans where they started to look 103 00:07:11,360 --> 00:07:14,680 Speaker 1: at the brain as a seat of consciousness? 104 00:07:15,920 --> 00:07:19,160 Speaker 4: Yeah, yeah. Now, yeah, it's a pretty interesting history because, 105 00:07:19,480 --> 00:07:22,000 Speaker 4: as you mentioned, probably for most of human history, we 106 00:07:22,000 --> 00:07:24,000 Speaker 4: had no idea what was going on inside of our bodies, 107 00:07:24,040 --> 00:07:26,200 Speaker 4: how our bodies worked, much less how our brains 108 00:07:27,000 --> 00:07:30,920 Speaker 4: worked or what it did. Aristotle, back in the ancient 109 00:07:31,200 --> 00:07:35,200 Speaker 4: Greek era, uh, basically agreed with what you just said, 110 00:07:35,200 --> 00:07:38,280 Speaker 4: which is he thought that everything that makes us who 111 00:07:38,280 --> 00:07:41,160 Speaker 4: we are is in our hearts, like in our chest. 112 00:07:41,680 --> 00:07:43,760 Speaker 4: And he thought that the brain was really just like 113 00:07:44,040 --> 00:07:48,480 Speaker 4: a radiator, like the brain is just there to, like, cool people. Yeah, 114 00:07:49,200 --> 00:07:51,800 Speaker 4: because it has all these, like, lumps and wrinkles, like 115 00:07:51,840 --> 00:07:53,040 Speaker 4: that would make sense, right, kind of. 116 00:07:53,560 --> 00:07:57,080 Speaker 2: Yeah, sure, maybe dissipate the heat, like coils, like coils in a 117 00:07:57,040 --> 00:07:59,920 Speaker 4: radiator. Yeah, yeah, like increasing the surface area. 118 00:08:00,480 --> 00:08:00,720 Speaker 2: Uh.
119 00:08:00,760 --> 00:08:02,720 Speaker 4: And if you think about it, it kind of makes sense, right? 120 00:08:02,760 --> 00:08:05,960 Speaker 4: Like if you are designing a human being, like, why 121 00:08:05,960 --> 00:08:09,000 Speaker 4: would you put the most important part on this little, 122 00:08:09,040 --> 00:08:12,880 Speaker 4: like, appendage sitting at the top, you know, exposed? 123 00:08:13,560 --> 00:08:16,400 Speaker 4: Yeah, it's exposed. Yeah, yeah. Why not put it, 124 00:08:16,440 --> 00:08:19,960 Speaker 4: like, in the chest, right, protected by ribs and organs? 125 00:08:20,040 --> 00:08:22,160 Speaker 4: Like that would make a little bit more sense. But no, 126 00:08:22,280 --> 00:08:25,520 Speaker 4: it's like it's sticking up on our necks. It's a 127 00:08:25,560 --> 00:08:29,840 Speaker 4: little peninsula on the body. Yeah, yeah, so easy to 128 00:08:29,920 --> 00:08:33,680 Speaker 4: knock or or chop off. So he didn't quite have 129 00:08:33,720 --> 00:08:37,360 Speaker 4: it right. But people think that maybe the Egyptians knew 130 00:08:37,360 --> 00:08:39,800 Speaker 4: a little bit more about the brain. So there's this 131 00:08:39,920 --> 00:08:47,040 Speaker 4: famous papyrus scroll, yeah, that they found, made by 132 00:08:47,080 --> 00:08:52,200 Speaker 4: the ancient Egyptians, where they kind of like documented medical cases. 133 00:08:52,480 --> 00:08:55,800 Speaker 4: It's called the Edwin Smith Papyrus. 134 00:08:56,040 --> 00:09:01,960 Speaker 2: Yes, like the thought yeah. 135 00:09:02,640 --> 00:09:07,280 Speaker 4: Oh, that's another brain game right there. So we know, 136 00:09:07,400 --> 00:09:09,679 Speaker 4: like, the Egyptians cataloged all these medical things, like, oh, 137 00:09:09,679 --> 00:09:11,280 Speaker 4: if you break an arm, this is what happens. If 138 00:09:11,320 --> 00:09:13,840 Speaker 4: you sever your spinal cord, this is what happens.
And there's 139 00:09:13,880 --> 00:09:17,320 Speaker 4: like an entry, number twenty, in this scroll that says, 140 00:09:17,880 --> 00:09:19,320 Speaker 4: you know, if you get hit in the head hard, 141 00:09:19,960 --> 00:09:23,040 Speaker 4: we know that sometimes you can lose the ability to talk. 142 00:09:24,800 --> 00:09:27,880 Speaker 4: So like in their heads they're thinking, oh, your brain 143 00:09:28,160 --> 00:09:31,760 Speaker 4: is good for things like talking. So that's kind of 144 00:09:31,800 --> 00:09:35,760 Speaker 4: like the earliest, we think, kind of a record of 145 00:09:36,160 --> 00:09:39,679 Speaker 4: people, humans, kind of understanding what the brain is doing. 146 00:09:40,400 --> 00:09:41,120 Speaker 2: Oh wow. 147 00:09:41,240 --> 00:09:48,800 Speaker 1: And this, this also reminds us clearly of a similar 148 00:09:49,240 --> 00:09:53,760 Speaker 1: practice indicative of recognizing the importance of the brain as 149 00:09:53,800 --> 00:10:00,480 Speaker 1: an organ, which is the practice of trepanation, right. Uh, 150 00:10:00,640 --> 00:10:02,280 Speaker 1: trepanation being. 151 00:10:03,640 --> 00:10:04,160 Speaker 2: The demons. 152 00:10:04,679 --> 00:10:10,720 Speaker 1: Yeah, yeah, yeah, yeah, venting, relieving pressure, yeah, drilling a 153 00:10:10,760 --> 00:10:16,600 Speaker 1: physical hole somewhere in the cranium. I think for a 154 00:10:16,600 --> 00:10:21,520 Speaker 1: lot of us laypeople, the primary amazing thing about 155 00:10:21,559 --> 00:10:27,040 Speaker 1: the practice of trepanation is that people survived somehow. 156 00:10:27,120 --> 00:10:31,679 Speaker 4: Yeah right, Yeah, it's wild.
Basically, there's not much activity 157 00:10:31,720 --> 00:10:35,360 Speaker 4: going on like understanding the brain up until about the 158 00:10:35,400 --> 00:10:39,439 Speaker 4: eighteen hundreds, and that's when people really started to record 159 00:10:39,480 --> 00:10:41,600 Speaker 4: things like, hey, if you open up your skull and 160 00:10:41,640 --> 00:10:44,160 Speaker 4: you dig a hole through here, this can happen. Or 161 00:10:44,160 --> 00:10:45,720 Speaker 4: if you open up this part of the brain and 162 00:10:45,800 --> 00:10:48,199 Speaker 4: you kind of like mess around with it or apply 163 00:10:48,240 --> 00:10:52,920 Speaker 4: electricity like that, your arm will move. And so yeah, 164 00:10:53,240 --> 00:10:55,400 Speaker 4: so that's one of the things that people did in 165 00:10:55,400 --> 00:10:58,200 Speaker 4: the eighteen hundreds, which is open up your brain and 166 00:10:58,200 --> 00:11:01,000 Speaker 4: poke it. And it's sort of dubious to us. 167 00:11:01,080 --> 00:11:02,520 Speaker 2: Now that's not to say. 168 00:11:04,440 --> 00:11:07,920 Speaker 3: Anything against the science that gave us that information, but, which is often 169 00:11:07,920 --> 00:11:10,760 Speaker 3: the case with early discovery, it's like, yeah, you know, 170 00:11:10,840 --> 00:11:14,120 Speaker 3: you gotta break a few eggs to make a science omelet. 171 00:11:14,320 --> 00:11:16,840 Speaker 4: Be a little bit of a mad scientist. Yeah, but 172 00:11:16,960 --> 00:11:18,959 Speaker 4: it taught us a lot. Like it started to kind 173 00:11:18,960 --> 00:11:22,160 Speaker 4: of piece everything together. People started to figure out, oh, 174 00:11:22,240 --> 00:11:24,760 Speaker 4: like, the brain has parts. You know, it's not just 175 00:11:24,880 --> 00:11:28,160 Speaker 4: like one giant computer chip.
It's got like a little 176 00:11:28,160 --> 00:11:30,360 Speaker 4: processor for this over here and a little processor for 177 00:11:30,360 --> 00:11:31,920 Speaker 4: that out over there, and so that's kind of how 178 00:11:31,920 --> 00:11:34,840 Speaker 4: it started, really kind of ramping up the history of 179 00:11:35,040 --> 00:11:35,680 Speaker 4: brain science. 180 00:11:36,040 --> 00:11:38,959 Speaker 1: Now we're talking the eighteen hundreds at this point, which 181 00:11:39,080 --> 00:11:45,319 Speaker 1: means, from what Noel and I understand, this means this happens, 182 00:11:45,880 --> 00:11:51,000 Speaker 1: this great scientific inquiry occurs, in the same sort of 183 00:11:51,240 --> 00:11:56,160 Speaker 1: historical milieu as a lot of quack science. Right? I 184 00:11:56,160 --> 00:11:59,160 Speaker 1: guess we should talk about phrenology a little bit. 185 00:11:59,720 --> 00:12:02,360 Speaker 4: Yeah, yeah, yeah. Well, it's, it's sort of a, it's 186 00:12:02,440 --> 00:12:05,040 Speaker 4: a bit of a sketchy thing to delineate between quack 187 00:12:05,080 --> 00:12:07,760 Speaker 4: science and real thought, right, you know what I mean? 188 00:12:07,800 --> 00:12:10,160 Speaker 4: Like back then, it's like, it's like accusing a caveman 189 00:12:10,200 --> 00:12:13,280 Speaker 4: of not being a good scientist. You know, they just didn't know, 190 00:12:13,360 --> 00:12:15,480 Speaker 4: and so they were just trying all these different theories. 191 00:12:15,520 --> 00:12:18,120 Speaker 4: They had all these ideas. They were going on vibes, 192 00:12:18,280 --> 00:12:24,600 Speaker 4: basically vibes. And yeah, phrenology was a huge 193 00:12:24,640 --> 00:12:27,400 Speaker 4: deal in the eighteen hundreds. People thought that, you know, 194 00:12:27,440 --> 00:12:29,520 Speaker 4: there was this idea that the brain is mapped, like 195 00:12:29,520 --> 00:12:32,040 Speaker 4: there's areas that do different things.
But so people would 196 00:12:32,080 --> 00:12:33,640 Speaker 4: just guess, like, oh, the back part of 197 00:12:33,720 --> 00:12:36,800 Speaker 4: your brain, that's where, that's where your love is located, 198 00:12:36,880 --> 00:12:40,680 Speaker 4: and the front part is where your egoism is located, 199 00:12:40,720 --> 00:12:42,840 Speaker 4: and so they just, if you look up phrenology, 200 00:12:42,920 --> 00:12:45,040 Speaker 4: there's all these like maps. I think most people have 201 00:12:45,080 --> 00:12:47,760 Speaker 4: probably seen them, just like a human head 202 00:12:47,800 --> 00:12:50,280 Speaker 4: with like areas, kind of like a, like a meat 203 00:12:50,280 --> 00:12:53,319 Speaker 4: diagram on a cow, with little. 204 00:12:53,040 --> 00:12:55,319 Speaker 2: Regions where the tenderloin is, exactly. 205 00:12:55,080 --> 00:12:57,480 Speaker 4: Yeah, yeah, yeah. 206 00:12:57,600 --> 00:13:04,360 Speaker 1: And that was informed, unfortunately, by a lot of confirmation bias, 207 00:13:04,559 --> 00:13:09,400 Speaker 1: we could argue, from Western Europe at the time, wherein 208 00:13:09,600 --> 00:13:14,760 Speaker 1: people would say, oh, we've made this rough butcher's diagram 209 00:13:14,880 --> 00:13:19,000 Speaker 1: or topography of a cranium, and based on this bump, 210 00:13:19,600 --> 00:13:23,760 Speaker 1: this guy is going to be good at, what's a 211 00:13:23,800 --> 00:13:25,200 Speaker 1: silly thing to be good at? 212 00:13:25,280 --> 00:13:26,319 Speaker 2: Uh, the unicycle. 213 00:13:26,600 --> 00:13:31,000 Speaker 1: This guy's great at unicycles. Yeah, look at the other bump, 214 00:13:31,240 --> 00:13:33,640 Speaker 1: you know, right there by the temple. This is our 215 00:13:34,040 --> 00:13:37,319 Speaker 1: unicyclist juggler phrenology. 216 00:13:38,400 --> 00:13:38,679 Speaker 2: Jorge. 217 00:13:39,080 --> 00:13:41,400 Speaker 1: Was it widely accepted in its day?
218 00:13:41,960 --> 00:13:44,720 Speaker 4: I think it was, you know, as accepted as, like, 219 00:13:45,000 --> 00:13:49,040 Speaker 4: you know, somebody selling tonics, you know, going from town 220 00:13:49,040 --> 00:13:52,599 Speaker 4: to town in the Old West selling tonics to rejuvenate 221 00:13:52,640 --> 00:13:55,120 Speaker 4: your vigor or something like that. You know, it's something 222 00:13:55,160 --> 00:13:57,760 Speaker 4: that people weren't quite sure was true or not, 223 00:13:57,880 --> 00:14:00,240 Speaker 4: and some people claimed that they were swearing by it, 224 00:14:00,840 --> 00:14:02,640 Speaker 4: and, you know, people rolled the dice. 225 00:14:03,200 --> 00:14:06,240 Speaker 3: Not a far jump from things like the humors, you know, 226 00:14:06,480 --> 00:14:09,760 Speaker 3: the idea of leeching and bloodletting in order to 227 00:14:09,800 --> 00:14:13,960 Speaker 3: balance out these supposed, you know, materials within the body. 228 00:14:14,040 --> 00:14:17,840 Speaker 3: And then honestly, I'm not trying to poo-poo anybody's beliefs here, 229 00:14:17,880 --> 00:14:20,640 Speaker 3: but not too, too far off from things like chakras 230 00:14:20,680 --> 00:14:24,240 Speaker 3: and meridian lines in some, you know, Eastern medicine that 231 00:14:24,320 --> 00:14:26,800 Speaker 3: some people think is quackery and some people swear by. 232 00:14:26,960 --> 00:14:28,480 Speaker 3: So I mean, it's just interesting the way some of 233 00:14:28,480 --> 00:14:31,280 Speaker 3: that stuff is still around and believed in or not 234 00:14:31,320 --> 00:14:32,360 Speaker 3: believed in, depending. 235 00:14:32,600 --> 00:14:33,920 Speaker 4: No, no, that stuff is quackery. 236 00:14:34,080 --> 00:14:37,560 Speaker 2: No okay. Oh. 237 00:14:37,480 --> 00:14:38,920 Speaker 1: The professor came out on that one. 238 00:14:40,320 --> 00:14:42,120 Speaker 4: No offense to the crystal fans.
239 00:14:42,760 --> 00:14:45,040 Speaker 2: Yes, yes, yes, we're just being diplomatic. 240 00:14:45,200 --> 00:14:49,040 Speaker 1: We are being diplomatic, and we share the same pursuits, 241 00:14:49,080 --> 00:14:55,000 Speaker 1: which is always going to be, hopefully, the collective, objective 242 00:14:55,440 --> 00:15:06,600 Speaker 1: interrogation of the world around us and within us. One 243 00:15:06,640 --> 00:15:09,400 Speaker 1: thing we were talking about a little bit off air, 244 00:15:11,200 --> 00:15:18,080 Speaker 1: it pertained to some, some just phenomenal and, again, as 245 00:15:18,080 --> 00:15:24,160 Speaker 1: you said, Jorge, ethically dubious learnings that came. You mentioned 246 00:15:24,640 --> 00:15:30,000 Speaker 1: the earlier realization from humanity that certain parts of the 247 00:15:30,080 --> 00:15:35,240 Speaker 1: brain function in certain ways, right, and excel in certain things. 248 00:15:35,560 --> 00:15:41,560 Speaker 1: And at the end of our first conversation we introduced 249 00:15:41,720 --> 00:15:44,840 Speaker 1: a patient who for a long time was simply known 250 00:15:44,960 --> 00:15:51,960 Speaker 1: as HM, and HM, as we find, is pivotal to neuroscience, 251 00:15:52,120 --> 00:15:57,120 Speaker 1: perhaps because of, well, gosh, can we tell, can we 252 00:15:57,160 --> 00:16:02,280 Speaker 1: tell the story? Yes, super fascinating story. This was a 253 00:16:02,320 --> 00:16:05,000 Speaker 1: little bit later. This was in the nineteen thirties that 254 00:16:05,080 --> 00:16:09,040 Speaker 1: this man came about. His name was Henry Molaison, but 255 00:16:09,080 --> 00:16:11,000 Speaker 1: for a long time people didn't know his name, because 256 00:16:11,000 --> 00:16:14,400 Speaker 1: when you become a medical subject, they kind of make 257 00:16:14,440 --> 00:16:16,800 Speaker 1: you anonymous. So for a long time, for many years, 258 00:16:16,840 --> 00:16:20,720 Speaker 1: he was just known as patient HM.
And so this 259 00:16:20,800 --> 00:16:23,160 Speaker 1: was a guy who had a lot of seizures as 260 00:16:23,200 --> 00:16:26,560 Speaker 1: a kid. Some people think it maybe happened after he 261 00:16:26,640 --> 00:16:28,800 Speaker 1: had like a bike accident and then he knocked his head. 262 00:16:28,960 --> 00:16:30,680 Speaker 1: Some people think he was just kind of, he just 263 00:16:30,720 --> 00:16:32,920 Speaker 1: had these seizures for no reason. So it's not quite 264 00:16:32,920 --> 00:16:34,800 Speaker 1: clear how or why he got these seizures, but they 265 00:16:34,840 --> 00:16:38,000 Speaker 1: were like super intense, like he couldn't really function. It 266 00:16:38,040 --> 00:16:42,040 Speaker 1: would really kind of make him not able to have 267 00:16:42,080 --> 00:16:44,200 Speaker 1: a job or go to school or things like that. 268 00:16:44,800 --> 00:16:47,440 Speaker 4: And at the time, as you said, the doctors were like, oh, 269 00:16:47,560 --> 00:16:49,000 Speaker 4: I know how to cure this. I'll just poke 270 00:16:49,000 --> 00:16:51,800 Speaker 4: a hole in your head and mess around with it. 271 00:16:51,880 --> 00:16:57,120 Speaker 4: And so a common procedure back then was basically a lobotomy. 272 00:16:57,280 --> 00:17:01,640 Speaker 4: Like they would just kind of stick this long needle 273 00:17:02,320 --> 00:17:04,800 Speaker 4: kind of through your eye socket, they'd like move 274 00:17:04,880 --> 00:17:06,240 Speaker 4: your eye a little bit out of the way, and 275 00:17:06,280 --> 00:17:08,600 Speaker 4: then you stick it in there and then kind of 276 00:17:08,600 --> 00:17:12,280 Speaker 4: mess around. And what they did for him specifically was 277 00:17:12,359 --> 00:17:17,280 Speaker 4: they destroyed most of his hippocampus. So like deep inside 278 00:17:17,320 --> 00:17:19,520 Speaker 4: your brain you have these little lumps called 279 00:17:19,560 --> 00:17:24,879 Speaker 4: the hippocampus.
And you know, his seizures were so severe 280 00:17:24,920 --> 00:17:29,200 Speaker 4: that this doctor, called William Scoville, thought that he needed 281 00:17:29,240 --> 00:17:31,760 Speaker 4: to take out both his hippocampi, and so he did. 282 00:17:32,440 --> 00:17:35,560 Speaker 4: And the amazing thing is it sort of worked, like 283 00:17:35,640 --> 00:17:40,400 Speaker 4: his seizures stopped. But unfortunately it had a bad side effect, 284 00:17:40,640 --> 00:17:44,360 Speaker 4: which is that Henry Molaison was not able to make 285 00:17:44,480 --> 00:17:49,240 Speaker 4: new memories. So like he remembered his whole life, his childhood, 286 00:17:49,400 --> 00:17:52,160 Speaker 4: you know, his early adulthood, right up until the day 287 00:17:52,200 --> 00:17:56,000 Speaker 4: he had his surgery. But after that he couldn't remember 288 00:17:56,160 --> 00:17:57,760 Speaker 4: more than thirty minutes at a time. 289 00:17:58,200 --> 00:18:02,120 Speaker 1: He couldn't encode information. 290 00:18:02,119 --> 00:18:03,920 Speaker 4: Right. Like if you met him and you said hey, 291 00:18:03,920 --> 00:18:05,919 Speaker 4: and he, like, he could tell you where he grew up, 292 00:18:06,200 --> 00:18:09,400 Speaker 4: who his mother was, and everything, but thirty minutes later 293 00:18:09,560 --> 00:18:12,920 Speaker 4: he would totally forget he met you. Or thirty minutes later, 294 00:18:12,960 --> 00:18:14,919 Speaker 4: he couldn't tell you how he got to where he was, 295 00:18:15,040 --> 00:18:18,200 Speaker 4: what he had for breakfast, you know, anything. He couldn't 296 00:18:18,200 --> 00:18:20,200 Speaker 4: tell you anything beyond thirty minutes ago. 297 00:18:20,600 --> 00:18:26,680 Speaker 1: Okay, well, we've all met executive producers before, you see. 298 00:18:26,840 --> 00:18:30,840 Speaker 4: Lobotomized? Oh.
299 00:18:29,359 --> 00:18:36,600 Speaker 1: So this, this is a moment of tremendous significance to 300 00:18:37,040 --> 00:18:43,800 Speaker 1: the history of neuroscience. When, and first off, HM's existence 301 00:18:43,920 --> 00:18:48,080 Speaker 1: at this point is cursed. You mentioned earlier that this 302 00:18:48,440 --> 00:18:55,320 Speaker 1: was, this was a primary source for the film Memento. 303 00:18:55,640 --> 00:18:56,200 Speaker 2: Is that correct? 304 00:18:56,640 --> 00:18:59,960 Speaker 4: Yeah, the Christopher Nolan film that he kind 305 00:18:59,960 --> 00:19:03,239 Speaker 4: of got famous for initially, Memento. It was this 306 00:19:03,280 --> 00:19:06,840 Speaker 4: guy who basically lived this life, but in our modern 307 00:19:06,960 --> 00:19:09,840 Speaker 4: world right now. And so he couldn't remember more than 308 00:19:09,880 --> 00:19:12,960 Speaker 4: thirty minutes ago, and so, and so that movie's kind 309 00:19:12,960 --> 00:19:15,720 Speaker 4: of told backwards in time. It's like you're sort 310 00:19:15,720 --> 00:19:17,879 Speaker 4: of living the movie like he is living it. It's like, oh, 311 00:19:17,920 --> 00:19:19,600 Speaker 4: I'm here. Why am I here? I don't know why 312 00:19:19,600 --> 00:19:21,800 Speaker 4: I'm here. And then you flash back to like thirty 313 00:19:21,840 --> 00:19:24,439 Speaker 4: minutes ago to figure out how he got there, and 314 00:19:24,440 --> 00:19:26,560 Speaker 4: then you keep flashing back thirty minutes at a time. 315 00:19:26,600 --> 00:19:31,520 Speaker 4: That's the movie. And this guy, like, actually lived it. He 316 00:19:31,560 --> 00:19:34,280 Speaker 4: just woke up every day, and every thirty minutes the world 317 00:19:34,359 --> 00:19:35,240 Speaker 4: was brand new to him. 318 00:19:35,560 --> 00:19:36,280 Speaker 2: That's a good yeah.
319 00:19:36,320 --> 00:19:38,119 Speaker 3: Yeah. And I don't know if that's like terrifying or 320 00:19:38,200 --> 00:19:40,400 Speaker 3: kind of awesome, but no, I think it's terrifying. 321 00:19:40,440 --> 00:19:42,800 Speaker 2: I don't think it would be a good way to experience 322 00:19:42,840 --> 00:19:43,200 Speaker 2: the world. 323 00:19:43,320 --> 00:19:45,959 Speaker 4: Well, it wouldn't make you useful for pretty much anything 324 00:19:46,000 --> 00:19:48,960 Speaker 4: except being a test subject. 325 00:19:48,800 --> 00:19:50,639 Speaker 3: There's, there's something to be said about, like, you know, if 326 00:19:50,640 --> 00:19:54,280 Speaker 3: we could just wipe our memories or do a little reset. 327 00:19:54,320 --> 00:19:56,000 Speaker 2: But no, I'm sorry, not making light. 328 00:19:56,040 --> 00:19:59,280 Speaker 3: This is a very, very serious condition, but one that 329 00:19:59,600 --> 00:20:02,240 Speaker 3: shed a lot of light on some things, that didn't 330 00:20:02,280 --> 00:20:06,520 Speaker 3: require people to go necessarily digging around in people's brains, right? 331 00:20:06,560 --> 00:20:08,520 Speaker 4: Right, right, right, yeah. In particular, it shed a lot of 332 00:20:08,560 --> 00:20:11,119 Speaker 4: light on how memory works in our brain. So, like, 333 00:20:11,160 --> 00:20:15,439 Speaker 4: before this patient, mostly people thought that memory was something 334 00:20:15,480 --> 00:20:19,199 Speaker 4: that was spread across your brain. Like people thought your 335 00:20:19,200 --> 00:20:21,960 Speaker 4: brain was just a giant kind of a computer chip 336 00:20:21,960 --> 00:20:23,919 Speaker 4: and you just, like, you know, you store memories 337 00:20:23,960 --> 00:20:25,240 Speaker 4: in a little bit here, a little bit there, a 338 00:20:25,280 --> 00:20:28,480 Speaker 4: little bit everywhere. It's just kind of like evenly spread out.
339 00:20:29,000 --> 00:20:32,280 Speaker 4: But what was fascinating about patient HM is that he 340 00:20:32,720 --> 00:20:35,600 Speaker 4: could remember his childhood, his early adulthood. He couldn't make 341 00:20:35,640 --> 00:20:39,520 Speaker 4: new memories, but you could teach him motor skills, like 342 00:20:39,560 --> 00:20:42,679 Speaker 4: he could. Yeah. Like, you could teach him how to 343 00:20:42,680 --> 00:20:46,400 Speaker 4: play tennis, and the first day he would be terrible 344 00:20:46,440 --> 00:20:48,719 Speaker 4: at it, but if he kept practicing, he would get 345 00:20:48,760 --> 00:20:51,639 Speaker 4: better at it, but he wouldn't remember having practiced. 346 00:20:52,000 --> 00:20:52,440 Speaker 2: Wow. 347 00:20:52,800 --> 00:20:57,720 Speaker 1: So every time this guy, for example, plays tennis, he's 348 00:20:57,880 --> 00:21:01,159 Speaker 1: just a little bit better, right, but he doesn't know why. 349 00:21:01,680 --> 00:21:03,840 Speaker 2: He's just really good at it. 350 00:21:04,840 --> 00:21:08,120 Speaker 4: Yeah. Like, you know, by session thirty, I imagine, right, 351 00:21:08,520 --> 00:21:11,000 Speaker 4: this is hypothetical, because I don't think they really had 352 00:21:11,040 --> 00:21:13,760 Speaker 4: him play tennis. But, like, by session thirty, he'd 353 00:21:13,760 --> 00:21:16,639 Speaker 4: be like, you know, hitting aces and returning lobs, and 354 00:21:16,680 --> 00:21:18,200 Speaker 4: he'd be like, oh my god, I had no idea 355 00:21:18,200 --> 00:21:20,520 Speaker 4: I could play tennis. That was basically his experience. 356 00:21:21,480 --> 00:21:24,160 Speaker 1: I hope he has a, I hope he has a 357 00:21:25,080 --> 00:21:30,040 Speaker 1: lived perception that he is a virtuoso. You know what 358 00:21:30,080 --> 00:21:33,040 Speaker 1: I mean? Because it's always the first thirty minutes that 359 00:21:33,119 --> 00:21:34,080 Speaker 1: he's played tennis.
360 00:21:34,160 --> 00:21:36,840 Speaker 4: Yeah, right, right. But then in the next session, if 361 00:21:36,880 --> 00:21:38,800 Speaker 4: he doesn't play tennis, he has no idea he's good 362 00:21:38,800 --> 00:21:39,520 Speaker 4: at tennis. 363 00:21:39,680 --> 00:21:41,320 Speaker 1: A true periodic victory. 364 00:21:41,480 --> 00:21:42,600 Speaker 2: Yeah, a real conundrum. 365 00:21:42,680 --> 00:21:45,359 Speaker 3: So what did this teach us, though, about the brain? 366 00:21:45,520 --> 00:21:49,199 Speaker 3: That there are certain things that could not be recalled, 367 00:21:49,359 --> 00:21:52,399 Speaker 3: but yet certain things had a bit more of a 368 00:21:52,440 --> 00:21:53,840 Speaker 3: sense memory, perhaps? 369 00:21:54,200 --> 00:21:56,399 Speaker 4: Yeah. Mainly what it taught people is that there are 370 00:21:56,480 --> 00:21:59,280 Speaker 4: different kinds of memory. Like, I think most people who 371 00:21:59,280 --> 00:22:02,200 Speaker 4: are listening know about short-term memory and long-term memory. 372 00:22:02,600 --> 00:22:05,239 Speaker 4: So those are two separate things in your brain. But 373 00:22:05,280 --> 00:22:07,199 Speaker 4: for a long time, we didn't know that, and we 374 00:22:07,240 --> 00:22:09,640 Speaker 4: didn't know there was such a thing as motor memory. 375 00:22:09,760 --> 00:22:12,439 Speaker 4: Like, your signature, you can 376 00:22:12,440 --> 00:22:14,760 Speaker 4: write your signature without thinking about it. It's just in 377 00:22:14,760 --> 00:22:16,720 Speaker 4: your muscle memory. Or some people can play the piano, 378 00:22:16,760 --> 00:22:19,240 Speaker 4: and that's muscle memory. So that's in a different 379 00:22:19,280 --> 00:22:21,840 Speaker 4: part of your brain than the long-term, the short- 380 00:22:21,920 --> 00:22:25,439 Speaker 4: term, even your verbal and language. All that is just 381 00:22:25,480 --> 00:22:29,440 Speaker 4: in separate areas.
It's still in areas, but it's a 382 00:22:29,440 --> 00:22:30,879 Speaker 4: little bit more spread out across the brain. And so 383 00:22:30,920 --> 00:22:33,119 Speaker 4: that's kind of what it unlocked for scientists. It's like, 384 00:22:33,160 --> 00:22:36,359 Speaker 4: oh wait, memory is not just this, like, one hard drive. 385 00:22:36,760 --> 00:22:38,680 Speaker 4: It's like a whole bunch of little hard drives, and 386 00:22:38,680 --> 00:22:42,399 Speaker 4: a whole bunch of different ways that memories get stored. 387 00:22:42,840 --> 00:22:46,359 Speaker 1: So that teaches us as well about, that reminds us 388 00:22:46,359 --> 00:22:49,920 Speaker 1: of things like, what is it called, Broca's aphasia or something? 389 00:22:50,119 --> 00:22:55,879 Speaker 1: So patient HM was capable of retaining and speaking the 390 00:22:55,960 --> 00:22:59,960 Speaker 1: language he had learned in his childhood. What we're talking about, 391 00:23:00,080 --> 00:23:02,399 Speaker 1: when we talk about Broca's aphasia, is of course the 392 00:23:02,440 --> 00:23:06,159 Speaker 1: breakdown of, as you said, Jorge, a different version of 393 00:23:06,200 --> 00:23:10,120 Speaker 1: the hard drive, right, the one that controls linguistic aptitude. 394 00:23:11,359 --> 00:23:16,040 Speaker 1: Did he, I don't know the answer, I'm asking honestly, 395 00:23:17,480 --> 00:23:20,600 Speaker 1: did sense memory impact anything? 396 00:23:20,880 --> 00:23:20,960 Speaker 2: Like? 397 00:23:21,119 --> 00:23:24,800 Speaker 1: Could he, the same way he was hypothetically taught to 398 00:23:24,840 --> 00:23:27,879 Speaker 1: be good at tennis?
Could he be taught to, and 399 00:23:27,920 --> 00:23:33,159 Speaker 1: we're breaking tons of ethical experimentation laws on this, but 400 00:23:33,480 --> 00:23:39,840 Speaker 1: could you possibly teach this patient to be avoidant of, 401 00:23:40,280 --> 00:23:44,439 Speaker 1: or attracted to, say, a certain stimulus, like a smell, 402 00:23:45,040 --> 00:23:49,679 Speaker 1: and then wait for that reset in the brain function? 403 00:23:50,880 --> 00:23:53,479 Speaker 1: You know, is this guy, is it possible to make 404 00:23:53,520 --> 00:23:58,440 Speaker 1: a world where this guy is waking up experiencing lucidity 405 00:23:58,640 --> 00:24:02,159 Speaker 1: every thirty minutes and now he just hates the, no, 406 00:24:02,320 --> 00:24:04,879 Speaker 1: what's a good smell to hate? Oh, my gosh, the 407 00:24:05,160 --> 00:24:09,360 Speaker 1: smell of coffee. Coffee is a great one. 408 00:24:09,560 --> 00:24:13,919 Speaker 1: Is it possible to, uh, would it be possible to 409 00:24:13,960 --> 00:24:18,440 Speaker 1: teach sense memory in an olfactory way, or what does 410 00:24:18,480 --> 00:24:19,080 Speaker 1: that take, right? 411 00:24:20,720 --> 00:24:23,000 Speaker 4: I think the idea is that he couldn't make long- 412 00:24:23,080 --> 00:24:27,800 Speaker 4: term memories. So, and I'm not sure that, like, our 413 00:24:27,840 --> 00:24:31,960 Speaker 4: associations with certain smells, I'm not quite sure where 414 00:24:31,960 --> 00:24:33,520 Speaker 4: that is in the brain. So some of 415 00:24:33,560 --> 00:24:35,120 Speaker 4: it might be in our long-term memory, in which 416 00:24:35,160 --> 00:24:38,639 Speaker 4: case you couldn't train that in him. But 417 00:24:38,720 --> 00:24:42,040 Speaker 4: some of it might be automatic, in which case you 418 00:24:42,160 --> 00:24:42,840 Speaker 4: probably could. 419 00:24:42,960 --> 00:24:43,360 Speaker 2: Yeah.
420 00:24:43,760 --> 00:24:50,720 Speaker 1: That is fascinating, and it occurs in step with, we 421 00:24:50,840 --> 00:24:54,560 Speaker 1: wanted to ask you about this, another pivotal moment 422 00:24:54,960 --> 00:24:58,959 Speaker 1: in neuroscience understanding. And folks, by the way, I 423 00:24:58,960 --> 00:25:01,959 Speaker 1: feel like we've been very, very clear about this: do 424 00:25:02,000 --> 00:25:06,119 Speaker 1: not try this at home, no matter how mad you 425 00:25:06,160 --> 00:25:09,880 Speaker 1: are at your sibling. Do not try this at home. 426 00:25:10,080 --> 00:25:10,280 Speaker 2: Right. 427 00:25:11,160 --> 00:25:17,840 Speaker 1: There is another case that occurred in the eighteen hundreds, 428 00:25:17,840 --> 00:25:23,120 Speaker 1: so a little bit prior to HM, the infamous Phineas Gage. 429 00:25:23,480 --> 00:25:25,240 Speaker 2: Yes, yeah, railroad tie. 430 00:25:25,880 --> 00:25:26,760 Speaker 1: Yeah, yeah. 431 00:25:26,880 --> 00:25:29,840 Speaker 4: So Phineas Gage was a railroad worker. He's just a 432 00:25:29,840 --> 00:25:35,760 Speaker 4: regular dude in the mid-eighteen hundreds up there in Vermont, 433 00:25:36,440 --> 00:25:39,560 Speaker 4: and, you know, by all accounts, he was a nice guy, 434 00:25:39,760 --> 00:25:42,679 Speaker 4: you know, lived his life, worked hard. One day, he 435 00:25:42,800 --> 00:25:46,320 Speaker 4: was, like, installing these long metal rods in a rock 436 00:25:46,640 --> 00:25:49,479 Speaker 4: to make way for, like, a railroad. And what they 437 00:25:49,520 --> 00:25:53,080 Speaker 4: do is they pack some dynamite in there, they stick 438 00:25:53,119 --> 00:25:54,800 Speaker 4: the rod, and then they light the fuse, and then 439 00:25:54,840 --> 00:25:58,960 Speaker 4: that blows up the rock. What could go wrong?
Right, 440 00:26:00,119 --> 00:26:04,600 Speaker 4: the thing exploded on him, and this meter-long iron rod 441 00:26:04,880 --> 00:26:08,240 Speaker 4: basically went through his head. So it kind of went 442 00:26:08,400 --> 00:26:12,719 Speaker 4: in through his left cheek, kind of up and above, 443 00:26:12,840 --> 00:26:15,960 Speaker 4: kind of behind his eyeballs, through the front part of 444 00:26:16,000 --> 00:26:20,160 Speaker 4: his brain, and then it came out the top. 445 00:26:24,680 --> 00:26:27,960 Speaker 4: He did not die. He survived. That is the fascinating thing. 446 00:26:28,480 --> 00:26:31,080 Speaker 4: And so he got better, and everyone's like, oh, oh 447 00:26:31,119 --> 00:26:35,119 Speaker 4: my god, that's amazing, he's a miracle. I guess the 448 00:26:35,160 --> 00:26:40,840 Speaker 4: brain is not that important. Bad takeaway. Yeah, for 449 00:26:40,880 --> 00:26:43,360 Speaker 4: a long time, he was basically used as an example 450 00:26:43,359 --> 00:26:45,480 Speaker 4: of, like, oh, the brain is, like, you know, just 451 00:26:45,560 --> 00:26:48,440 Speaker 4: one big amorphous mass, and if you lose a 452 00:26:48,480 --> 00:26:50,080 Speaker 4: little bit, it's like, you know, you lose a little bit, 453 00:26:50,119 --> 00:26:53,359 Speaker 4: but whatever, your brain has the rest of itself to, 454 00:26:53,560 --> 00:26:56,639 Speaker 4: like, make you who you are. And so he survived. 455 00:26:56,640 --> 00:26:59,240 Speaker 4: People thought he was fine, but little by little people sort 456 00:26:59,240 --> 00:27:03,080 Speaker 4: of realized there was something a little off about him, 457 00:27:03,119 --> 00:27:06,800 Speaker 4: like he wasn't quite himself. Like, reportedly, he 458 00:27:07,080 --> 00:27:10,240 Speaker 4: just had, like, a, he kind of basically became an 459 00:27:10,280 --> 00:27:10,680 Speaker 4: a-hole. 460 00:27:10,960 --> 00:27:11,919 Speaker 2: I don't know if I can say that.
461 00:27:13,000 --> 00:27:16,800 Speaker 1: Yeah, he had a, he had a marked change in 462 00:27:16,840 --> 00:27:22,639 Speaker 1: his temperament, right? He was, he was now seen as, yeah, 463 00:27:22,760 --> 00:27:25,440 Speaker 1: like you said, an a-hole. And thank you for keeping 464 00:27:25,480 --> 00:27:26,400 Speaker 1: it a family show. 465 00:27:27,119 --> 00:27:29,440 Speaker 4: But he was not a nice guy. Not a nice guy. 466 00:27:29,520 --> 00:27:30,560 Speaker 2: Oh, by all accounts. 467 00:27:30,600 --> 00:27:33,240 Speaker 3: Before that, though, he was a perfectly nice fellow, a 468 00:27:33,240 --> 00:27:35,600 Speaker 3: hard worker, like you said, and fun and fine 469 00:27:35,640 --> 00:27:38,760 Speaker 3: to be around, jovial. And then after this, people started 470 00:27:38,800 --> 00:27:42,160 Speaker 3: noticing that something in his personality had shifted. 471 00:27:42,320 --> 00:27:45,080 Speaker 4: You know, yeah, his personality had shifted from being, like, 472 00:27:45,119 --> 00:27:47,399 Speaker 4: a nice guy to being very ill-tempered, like 473 00:27:47,440 --> 00:27:50,760 Speaker 4: he was just grumpy all the time. And then, beyond 474 00:27:50,800 --> 00:27:53,159 Speaker 4: his personality, he also kind of had a little bit 475 00:27:53,200 --> 00:27:56,080 Speaker 4: of ADHD now, like he found it hard 476 00:27:56,080 --> 00:27:58,320 Speaker 4: to focus, he couldn't really concentrate, and so he 477 00:27:58,480 --> 00:28:01,400 Speaker 4: was just kind of, like, a frustrated person all 478 00:28:01,520 --> 00:28:03,800 Speaker 4: the time. And, you know, people who knew him were like, 479 00:28:04,000 --> 00:28:07,560 Speaker 4: this is not the same person.
And so only later 480 00:28:08,600 --> 00:28:11,560 Speaker 4: did people realize, like, oh, this is perfectly explained, 481 00:28:11,600 --> 00:28:14,240 Speaker 4: because the front part of your brain, that's kind of 482 00:28:14,280 --> 00:28:18,159 Speaker 4: where your personality is and where your ability to focus is. 483 00:28:19,119 --> 00:28:21,159 Speaker 4: So that was another big part of, kind of, like, 484 00:28:21,320 --> 00:28:24,320 Speaker 4: mapping the brain and figuring out that there are parts 485 00:28:24,359 --> 00:28:26,480 Speaker 4: to it that do different things, and which parts do 486 00:28:26,560 --> 00:28:27,359 Speaker 4: different things. 487 00:28:27,960 --> 00:28:30,720 Speaker 3: I gotta say, there's a really great episode of our 488 00:28:30,800 --> 00:28:34,320 Speaker 3: sister podcast, Stuff You Missed in History Class, all about 489 00:28:34,320 --> 00:28:37,040 Speaker 3: Phineas Gage, back in the archives. I remember, back when 490 00:28:37,040 --> 00:28:39,400 Speaker 3: I produced that show, that was the first time I'd 491 00:28:39,400 --> 00:28:41,160 Speaker 3: heard of him, and I thought it was super fascinating. 492 00:28:41,160 --> 00:28:43,080 Speaker 3: So do check that one out for a deep dive 493 00:28:43,200 --> 00:28:44,840 Speaker 3: on this fascinating character. 494 00:28:45,920 --> 00:28:51,520 Speaker 1: Also fascinating, to borrow that word there, it's also fascinating 495 00:28:51,680 --> 00:28:58,360 Speaker 1: that his treating physician was a guy named John Harlow, 496 00:28:59,080 --> 00:29:05,240 Speaker 1: and John Harlow, like many Western physicians of his day, 497 00:29:05,440 --> 00:29:11,960 Speaker 1: held an abiding interest in phrenology.
So maybe, if you are, 498 00:29:12,120 --> 00:29:14,760 Speaker 1: if you are a boffin or a doctor of the day, 499 00:29:15,200 --> 00:29:19,200 Speaker 1: maybe it's not the fact that this meter-long rod 500 00:29:19,600 --> 00:29:22,640 Speaker 1: went through the front part of this human brain, it's 501 00:29:22,680 --> 00:29:25,880 Speaker 1: that it altered the shape of the cranium. So now 502 00:29:25,920 --> 00:29:27,760 Speaker 1: he doesn't have good-guy bumps. 503 00:29:29,960 --> 00:29:32,960 Speaker 4: Well, what's interesting is that, you know, this idea sounds crazy, right, 504 00:29:33,000 --> 00:29:36,240 Speaker 4: that you could judge how a brain works, how 505 00:29:36,280 --> 00:29:39,600 Speaker 4: good a brain is at something, by its shape. But 506 00:29:39,640 --> 00:29:42,120 Speaker 4: that's actually something they found, kind of, in the nineties, 507 00:29:42,160 --> 00:29:45,400 Speaker 4: early two thousands, when they looked at brain scans 508 00:29:45,520 --> 00:29:47,440 Speaker 4: of taxi drivers in London. 509 00:29:48,040 --> 00:29:52,480 Speaker 2: The knowledge, yeah, the knowledge, the way it 510 00:29:52,640 --> 00:29:54,960 Speaker 2: sort of changed the pathways, right? 511 00:29:55,280 --> 00:29:59,760 Speaker 4: Yeah, yeah. So they studied the brains of taxi drivers 512 00:29:59,760 --> 00:30:02,280 Speaker 4: in London, and they talk about the knowledge, right? Like, 513 00:30:02,400 --> 00:30:04,360 Speaker 4: you know, if you're brand new at driving a taxi 514 00:30:04,400 --> 00:30:06,640 Speaker 4: in London, you're clueless. You have no idea where anything 515 00:30:06,760 --> 00:30:09,640 Speaker 4: is. If you've ever been there, there's, like, alleys everywhere, 516 00:30:09,680 --> 00:30:12,600 Speaker 4: with all these turns, it's impossible to navigate.
But once you've 517 00:30:12,640 --> 00:30:14,440 Speaker 4: done it for a while, you know the whole lay 518 00:30:14,480 --> 00:30:17,040 Speaker 4: of the land, you can take anyone anywhere. They studied 519 00:30:17,080 --> 00:30:19,480 Speaker 4: the brains of these people, and they found that in people 520 00:30:19,480 --> 00:30:21,560 Speaker 4: who've been doing it for a long time, that part 521 00:30:21,600 --> 00:30:26,520 Speaker 4: of your brain that stores, like, locations and spatial memory, 522 00:30:27,000 --> 00:30:30,520 Speaker 4: that's actually bigger. Like, it grows. 523 00:30:30,520 --> 00:30:34,080 Speaker 2: Mine is shriveled, y'all. I'm so bad at directions. 524 00:30:34,200 --> 00:30:36,200 Speaker 2: I have no sense of geography. 525 00:30:36,280 --> 00:30:41,520 Speaker 3: It's really bad because, yeah, because maybe it was, it's 526 00:30:41,560 --> 00:30:44,360 Speaker 3: been bad even since before, you know what, it's just 527 00:30:44,360 --> 00:30:47,160 Speaker 3: funny, though. I will say this: I have recently been 528 00:30:47,320 --> 00:30:52,160 Speaker 3: trying to actively not use maps, and I have found 529 00:30:52,440 --> 00:30:55,880 Speaker 3: that it improves my sense of direction overall. 530 00:30:56,040 --> 00:31:00,440 Speaker 2: So I think it can be almost relearned or, you know, improved upon. 531 00:31:00,920 --> 00:31:01,160 Speaker 5: Yeah. 532 00:31:01,360 --> 00:31:01,600 Speaker 2: Yeah. 533 00:31:01,640 --> 00:31:05,000 Speaker 4: My wife just talked to a coworker who said he's 534 00:31:05,000 --> 00:31:08,040 Speaker 4: taking his son out for walks just to teach him 535 00:31:08,080 --> 00:31:09,840 Speaker 4: how to go on walks. 536 00:31:09,920 --> 00:31:15,200 Speaker 2: Yeah, yeah. And it's not always inherent, you know. Yeah.
537 00:31:15,040 --> 00:31:17,440 Speaker 4: Because he's going out to college, and, you know, kids 538 00:31:17,440 --> 00:31:21,480 Speaker 4: today basically maybe don't have that ability to walk around. 539 00:31:22,160 --> 00:31:25,480 Speaker 1: Yeah, I haven't had to, uh, be lost in the woods, 540 00:31:25,720 --> 00:31:29,760 Speaker 1: right, like our forebears. Uh, yeah, it's funny. It's 541 00:31:29,760 --> 00:31:33,640 Speaker 1: funny you say that, because we've talked in the 542 00:31:33,680 --> 00:31:40,240 Speaker 1: past about these arguments regarding, you know, regarding the advent 543 00:31:40,520 --> 00:31:44,840 Speaker 1: of offloading, uh, some sort of process, right, from the 544 00:31:44,960 --> 00:31:49,160 Speaker 1: human machine to an external machine. And I love that 545 00:31:49,200 --> 00:31:54,280 Speaker 1: you're bringing up the knowledge, because that was a revolutionary 546 00:31:54,800 --> 00:31:58,560 Speaker 1: study in neuroscience. It reminds me as well of a 547 00:31:59,600 --> 00:32:02,880 Speaker 1: study, I want to say it was at NYU 548 00:32:04,520 --> 00:32:08,920 Speaker 1: in two thousand and eight, admittedly small sample size, but 549 00:32:09,000 --> 00:32:14,680 Speaker 1: they studied people who meditate, Buddhist monks in particular, and 550 00:32:14,720 --> 00:32:21,240 Speaker 1: they found something similar, similar to how 551 00:32:21,360 --> 00:32:24,160 Speaker 1: the use of the part of the brain that is 552 00:32:24,200 --> 00:32:31,479 Speaker 1: occupied with spatial positioning, proprioception.
You could argue, similar to 553 00:32:31,600 --> 00:32:35,960 Speaker 1: how their continued practice of knowing where they are and 554 00:32:36,040 --> 00:32:41,760 Speaker 1: where they're going literally became mind over matter and increased, 555 00:32:42,000 --> 00:32:46,240 Speaker 1: I believe the argument is, not just the increase 556 00:32:46,320 --> 00:32:51,760 Speaker 1: in size, but the increase in density of synaptic connections. 557 00:32:52,080 --> 00:32:59,280 Speaker 1: The study in oh-eight with Buddhist monks found, and I'm 558 00:32:59,320 --> 00:33:02,040 Speaker 1: going to sound so pop-sci here, and I apologize, 559 00:33:02,200 --> 00:33:08,760 Speaker 1: it found that the part of the brain positioned toward 560 00:33:08,960 --> 00:33:13,800 Speaker 1: or associated with things like empathy and compassion was actually 561 00:33:13,920 --> 00:33:19,760 Speaker 1: denser and larger and exhibited more activity in those Buddhist 562 00:33:19,800 --> 00:33:25,200 Speaker 1: monks versus a sample size of, you know, jerks like 563 00:33:25,160 --> 00:33:34,920 Speaker 4: us. Non-Buddhist monks. Not 564 00:33:37,040 --> 00:33:42,600 Speaker 1: enlightened. Not, yes, not enlightened. Perfect diplomacy. So this brings 565 00:33:42,680 --> 00:33:46,720 Speaker 1: us to, I think, I think the general umbrella term 566 00:33:46,800 --> 00:33:50,680 Speaker 1: for this concept is neuroplasticity. Is that what we're kind 567 00:33:50,680 --> 00:33:53,160 Speaker 1: of talking about? What is neuroplasticity? 568 00:33:53,600 --> 00:33:55,960 Speaker 4: Yeah, it's kind of this idea that your brain is 569 00:33:56,000 --> 00:33:59,440 Speaker 4: not static, like, at all. Like, your brain is constantly 570 00:34:00,120 --> 00:34:04,920 Speaker 4: kind of rewiring itself, kind of constantly tuning itself.
And 571 00:34:05,000 --> 00:34:09,120 Speaker 4: it's not like you're necessarily growing new neurons, but these 572 00:34:09,120 --> 00:34:14,480 Speaker 4: neurons are making new connections between themselves. And also, kind of 573 00:34:14,480 --> 00:34:18,080 Speaker 4: more importantly, these connections that they have, 574 00:34:18,360 --> 00:34:22,400 Speaker 4: they're constantly kind of recalibrating themselves. And actually, that's kind 575 00:34:22,400 --> 00:34:25,160 Speaker 4: of what's happening. That's how, like, AIs learn. You know, 576 00:34:25,200 --> 00:34:29,799 Speaker 4: if you look at these neural net models, basically what 577 00:34:29,840 --> 00:34:36,960 Speaker 4: they're changing when they're learning stuff is the weighting of 578 00:34:37,080 --> 00:34:42,040 Speaker 4: the synaptic connections. So, like, how strong, priority-wise, or 579 00:34:42,280 --> 00:34:45,279 Speaker 4: yeah, kind of priority-wise. Yeah. Like, each neuron is 580 00:34:45,320 --> 00:34:48,560 Speaker 4: connected to, like, let's say, a hundred other neurons, and 581 00:34:48,719 --> 00:34:51,080 Speaker 4: like, it's getting all these inputs. So which ones do 582 00:34:51,160 --> 00:34:54,359 Speaker 4: you ignore? Which ones do you listen to? And so 583 00:34:54,520 --> 00:34:57,120 Speaker 4: that happens at what are called synapses, which is kind 584 00:34:57,120 --> 00:34:59,120 Speaker 4: of where, like, you know, the little branches of two 585 00:34:59,200 --> 00:35:00,080 Speaker 4: neurons kind of meet. 586 00:35:00,800 --> 00:35:04,839 Speaker 2: And firing, right, firing synapses. 587 00:35:04,400 --> 00:35:07,879 Speaker 4: Firing, yeah, firing synapses, where that gets transmitted from one 588 00:35:07,880 --> 00:35:10,040 Speaker 4: to the other. So that's where that's happening.
589 00:35:10,120 --> 00:35:12,440 Speaker 3: Yeah. And when you look at a brain scan or 590 00:35:12,440 --> 00:35:15,360 Speaker 3: an MRI, if I'm not mistaken, you can literally see 591 00:35:16,280 --> 00:35:19,839 Speaker 3: this activity, right, lighting up in different regions of the brain. 592 00:35:20,440 --> 00:35:23,160 Speaker 4: Kind of. These are, like, almost molecule-sized. 593 00:35:23,160 --> 00:35:25,239 Speaker 4: They're, like, super, super tiny, so you can't see them 594 00:35:25,440 --> 00:35:27,120 Speaker 4: in an MRI. But you can see, kind of, like, 595 00:35:27,280 --> 00:35:29,120 Speaker 4: when you're looking at an MRI, what you're looking at is, 596 00:35:29,160 --> 00:35:32,120 Speaker 4: like, the oxygen consumption of your neurons. So you can 597 00:35:32,160 --> 00:35:36,120 Speaker 4: tell, like, oh, these neurons are being active, because they're drinking 598 00:35:35,880 --> 00:35:37,600 Speaker 2: it up a lot. Makes a lot of sense. So it's 599 00:35:37,640 --> 00:35:40,880 Speaker 2: an indicator. Got it. Yeah, yeah, that's fascinating. 600 00:35:40,920 --> 00:35:46,960 Speaker 1: So now we have learned that, given that the 601 00:35:47,120 --> 00:35:52,240 Speaker 1: observable universe, according to Big Bang theory, is about fourteen 602 00:35:52,400 --> 00:35:58,160 Speaker 1: billion years old, humans are a real up-and-coming fad overall. 603 00:35:58,560 --> 00:36:03,520 Speaker 4: Right, we're the latest it thing. Yeah, we are, we 604 00:36:03,640 --> 00:36:08,799 Speaker 4: sure are, and not for long. That's how fads work. 605 00:36:08,960 --> 00:36:12,680 Speaker 1: And we also learned, in that very brief 606 00:36:12,800 --> 00:36:17,600 Speaker 1: span of time that we call humanity,
Uh, people went 607 00:36:17,800 --> 00:36:22,320 Speaker 1: from totally thinking the soul was in the most protected 608 00:36:22,360 --> 00:36:26,280 Speaker 1: part of the body, the torso right to figuring out, oh, 609 00:36:26,320 --> 00:36:30,680 Speaker 1: that fatty thing in your head does something. Right after 610 00:36:30,719 --> 00:36:34,680 Speaker 1: we figured out just how to stop eating each other's brains. Uh, 611 00:36:34,840 --> 00:36:38,520 Speaker 1: just gonna throw that in there, and and and then 612 00:36:38,719 --> 00:36:45,120 Speaker 1: from there we see this, this vast series of at 613 00:36:45,120 --> 00:36:51,560 Speaker 1: times problematic innovations, sometimes based in accident, sometimes based in 614 00:36:52,320 --> 00:36:56,239 Speaker 1: confirmation bias or as you said, quack science of phrenology 615 00:36:57,840 --> 00:37:03,200 Speaker 1: or hate. Where does the exploration of consciousness and neuroscience 616 00:37:03,800 --> 00:37:06,600 Speaker 1: go in the future, By the way, just going to 617 00:37:06,640 --> 00:37:10,000 Speaker 1: put this out there for posterity. Jorge, Noel Max and 618 00:37:10,040 --> 00:37:14,200 Speaker 1: I are recording this on Monday, June twenty third, twenty 619 00:37:14,280 --> 00:37:18,440 Speaker 1: twenty five, So no pressure, Orge. Next thousand years, where 620 00:37:18,480 --> 00:37:19,120 Speaker 1: are we at. 621 00:37:21,560 --> 00:37:25,040 Speaker 4: No set, assuming we survived the next I don't know months, 622 00:37:25,360 --> 00:37:29,640 Speaker 4: Oh my god, existence here you know. I think these 623 00:37:29,640 --> 00:37:32,399 Speaker 4: things are like Pandora's blogs. You know, once you open them, 624 00:37:32,440 --> 00:37:34,640 Speaker 4: you can't go back, you know. 
And so I think 625 00:37:36,560 --> 00:37:40,160 Speaker 4: we're going to be understanding the universe a lot more, 626 00:37:41,080 --> 00:37:43,720 Speaker 4: we're going to be understanding things at the quantum level 627 00:37:43,840 --> 00:37:48,040 Speaker 4: a lot more. And who knows what's going to happen 628 00:37:48,080 --> 00:37:51,640 Speaker 4: with AIs. You know, it's quite possible, I think, that 629 00:37:52,080 --> 00:37:56,640 Speaker 4: within, I don't know, twenty years, there'll be a conscious 630 00:37:56,680 --> 00:37:58,840 Speaker 4: AI who is smarter than us. 631 00:37:59,160 --> 00:38:01,200 Speaker 3: That is the term that gets thrown 632 00:38:01,200 --> 00:38:03,799 Speaker 3: around a lot, singularity, in terms of AI. But 633 00:38:03,840 --> 00:38:05,680 Speaker 3: then we also, in talking about the Big Bang, that's 634 00:38:05,719 --> 00:38:08,680 Speaker 3: referred to as a singularity event, and maybe the terms 635 00:38:08,680 --> 00:38:10,640 Speaker 3: are sort of used loosely. But can you kind of 636 00:38:11,000 --> 00:38:13,600 Speaker 3: talk about that term and how it applies to both 637 00:38:13,600 --> 00:38:15,200 Speaker 3: of those different things? Is it really just kind of 638 00:38:15,280 --> 00:38:18,400 Speaker 3: like an it moment where a big thing happens? 639 00:38:18,960 --> 00:38:19,280 Speaker 2: Yeah. 640 00:38:19,320 --> 00:38:22,880 Speaker 4: So consciousness is one of the most debated things in, 641 00:38:23,160 --> 00:38:26,480 Speaker 4: like, neuroscience, psychology. Like, if you ask any scientist, 642 00:38:26,480 --> 00:38:29,400 Speaker 4: what even is consciousness, you'll get one hundred different answers.
643 00:38:30,320 --> 00:38:35,520 Speaker 4: Some people think it's, like, totally kind of biologically based, 644 00:38:35,680 --> 00:38:38,000 Speaker 4: like, it's, you know, a dog can have some kind 645 00:38:38,040 --> 00:38:40,879 Speaker 4: of consciousness, an ant can have a little bit of a consciousness, 646 00:38:41,120 --> 00:38:49,000 Speaker 4: you know, a machine can have a consciousness. The materialist philosophy. Yeah, yeah, exactly. 647 00:38:49,080 --> 00:38:52,200 Speaker 4: And some people think there's something kind of special and 648 00:38:53,120 --> 00:38:57,000 Speaker 4: almost supernatural about it. You know, even scientists sometimes think 649 00:38:57,680 --> 00:39:03,120 Speaker 4: that when it happens, it's like this kind of indescribable 650 00:39:03,120 --> 00:39:04,280 Speaker 4: thing that happens. 651 00:39:04,600 --> 00:39:09,840 Speaker 1: Yeah, Promethean lightning in a bottle. Yeah. And this is 652 00:39:10,920 --> 00:39:18,000 Speaker 1: where we get to the vast precipice, some would say, 653 00:39:18,200 --> 00:39:23,239 Speaker 1: or the vast horizon, depending upon your interpretation, of, as 654 00:39:23,280 --> 00:39:30,320 Speaker 1: you said, Noel, singularity, or transhumanism and futurism. Noel 655 00:39:30,360 --> 00:39:34,320 Speaker 1: and I were talking off air at length about 656 00:39:34,640 --> 00:39:41,120 Speaker 1: the concept of AI, artificial intelligence, large language models. We 657 00:39:41,239 --> 00:39:44,439 Speaker 1: touched on it naturally a little bit in part one, 658 00:39:44,560 --> 00:39:49,319 Speaker 1: but perhaps we close out chapter two of our conversation 659 00:39:49,800 --> 00:39:54,680 Speaker 1: on the history of brain science by exploring the nature 660 00:39:55,320 --> 00:40:00,239 Speaker 1: of AI just a bit further.
Now, you have, you have, 661 00:40:00,360 --> 00:40:07,720 Speaker 1: through your work, explored human-computer interaction in depth, 662 00:40:08,040 --> 00:40:13,360 Speaker 1: right? And, yeah, and so where do you see the 663 00:40:15,000 --> 00:40:20,760 Speaker 1: future of human interaction with AI going? You already said, 664 00:40:21,000 --> 00:40:24,440 Speaker 1: you know, there is a horizon where this kind of 665 00:40:24,440 --> 00:40:28,560 Speaker 1: thing exists. What will that tell us about the human brain? 666 00:40:28,600 --> 00:40:34,080 Speaker 4: Oh, my goodness. Well, it kind of depends on 667 00:40:34,120 --> 00:40:36,759 Speaker 4: where you land about how special the human brain is. 668 00:40:37,560 --> 00:40:40,880 Speaker 4: You know, like, I personally am an engineer by training, 669 00:40:41,120 --> 00:40:44,600 Speaker 4: and to me, brains are really just like meat machines. 670 00:40:44,760 --> 00:40:49,640 Speaker 4: You know, it's like, it's mechanical, there's chemicals involved. Some 671 00:40:49,680 --> 00:40:53,080 Speaker 4: people think, like, quantum physics and quantum uncertainty plays 672 00:40:53,080 --> 00:40:56,000 Speaker 4: a role in, like, those little tiny synapses that we have, 673 00:40:56,160 --> 00:40:59,200 Speaker 4: in which case, you know, there is maybe some magic 674 00:40:59,239 --> 00:41:00,600 Speaker 4: to how the human brain works. 675 00:41:00,840 --> 00:41:02,320 Speaker 2: I think we're all made of star stuff. 676 00:41:04,440 --> 00:41:07,000 Speaker 1: Well, if the Big Bang is true, then technically 677 00:41:07,000 --> 00:41:08,360 Speaker 1: that's also true, right? 678 00:41:08,200 --> 00:41:11,120 Speaker 3: Well, yeah, we arose from something like that, right, 679 00:41:11,280 --> 00:41:11,920 Speaker 3: we had to have.
680 00:41:12,239 --> 00:41:14,759 Speaker 4: Yeah yeah, well most of the atoms in our 681 00:41:14,760 --> 00:41:18,560 Speaker 4: bodies were made inside of a star. Yeah, because the 682 00:41:18,640 --> 00:41:21,000 Speaker 4: universe at the beginning was just all hydrogen, and so 683 00:41:21,080 --> 00:41:24,600 Speaker 4: anything other than hydrogen was basically made by a star, 684 00:41:25,040 --> 00:41:26,320 Speaker 4: and usually as the star is dying. 685 00:41:26,960 --> 00:41:31,600 Speaker 1: So yeah, this podcast brought to you by hydrogen. So 686 00:41:33,480 --> 00:41:39,480 Speaker 1: we're saying then that the nature of consciousness is still 687 00:41:39,600 --> 00:41:45,200 Speaker 1: something that the world's smartest people past, present, and possibly 688 00:41:45,280 --> 00:41:50,319 Speaker 1: future have debated, right. There's the materialist view of 689 00:41:51,080 --> 00:41:57,200 Speaker 1: this one thing, in this one case: these physical processes, 690 00:41:57,520 --> 00:42:01,640 Speaker 1: these mechanics, and these chemical interactions. And then there's the 691 00:42:01,880 --> 00:42:07,000 Speaker 1: larger question: is there something bigger? Right? Is an individual 692 00:42:07,040 --> 00:42:12,399 Speaker 1: consciousness only a node for a larger system? Which gets 693 00:42:12,480 --> 00:42:18,880 Speaker 1: a little... A lot of my old professors hate that idea. 694 00:42:19,360 --> 00:42:22,239 Speaker 4: Yeah, it's this idea like humans are the way that 695 00:42:22,280 --> 00:42:25,720 Speaker 4: the universe understands itself, kind of. Is that what you're 696 00:42:25,760 --> 00:42:27,560 Speaker 4: talking about a little bit? Yeah.
697 00:42:27,400 --> 00:42:30,600 Speaker 1: Because you're talking about, we're talking about now, what we 698 00:42:30,640 --> 00:42:37,560 Speaker 1: would call Homo sapiens exceptionalism, right, yeah, right, the idea 699 00:42:37,680 --> 00:42:45,320 Speaker 1: that although one can observe perhaps emotions in a pet, 700 00:42:46,360 --> 00:42:49,319 Speaker 1: or what seems to be emotions in a pet, even 701 00:42:50,040 --> 00:42:55,480 Speaker 1: on the strength of various cognitive diagnostics, you could observe 702 00:42:55,760 --> 00:43:01,320 Speaker 1: maybe the way an octopus dreams, or the functions 703 00:43:01,360 --> 00:43:03,640 Speaker 1: of certain higher order mammals. 704 00:43:03,640 --> 00:43:06,200 Speaker 3: What we're talking about is metacognition here, right? Like 705 00:43:06,280 --> 00:43:10,720 Speaker 3: the fact that humans are uniquely built to think about thinking, 706 00:43:10,880 --> 00:43:15,760 Speaker 3: to analyze themselves, sometimes into oblivion, which is why maybe 707 00:43:15,800 --> 00:43:18,560 Speaker 3: sometimes I'm jealous of the guy that can't remember anything 708 00:43:18,600 --> 00:43:21,960 Speaker 3: for more than thirty minutes, because it can be a 709 00:43:21,960 --> 00:43:25,040 Speaker 3: waking nightmare at times, what we do to ourselves in 710 00:43:25,120 --> 00:43:28,560 Speaker 3: terms of like thinking about thinking and all of the possibilities, 711 00:43:28,680 --> 00:43:31,520 Speaker 3: and it can be really exhausting, right. 712 00:43:31,640 --> 00:43:34,359 Speaker 4: Yeah, yeah. And there are fascinating cases in the history of 713 00:43:34,440 --> 00:43:37,160 Speaker 4: brain science that kind of speak to consciousness. So 714 00:43:37,280 --> 00:43:41,920 Speaker 4: one fascinating case that is pretty recent is conjoined twins.
715 00:43:42,960 --> 00:43:47,040 Speaker 4: So there's a pair of famous twins called the Hogan twins, 716 00:43:47,440 --> 00:43:50,440 Speaker 4: and these are two girls who were born conjoined, where 717 00:43:50,600 --> 00:43:53,440 Speaker 4: they basically share a brain, or they share parts of 718 00:43:53,440 --> 00:43:56,239 Speaker 4: the brain, and specifically they share this part called 719 00:43:56,239 --> 00:43:58,600 Speaker 4: the thalamus, which is kind of like a 720 00:43:58,600 --> 00:44:01,280 Speaker 4: hub inside of your brain that kind of relays information, 721 00:44:01,880 --> 00:44:04,160 Speaker 4: and so it's definitely two people: like, you can 722 00:44:04,200 --> 00:44:05,520 Speaker 4: talk to one of them, you can talk to the 723 00:44:05,560 --> 00:44:09,200 Speaker 4: other of them. But they sort of share their consciousness 724 00:44:09,200 --> 00:44:12,920 Speaker 4: almost in a way. Like, 725 00:44:13,160 --> 00:44:21,520 Speaker 4: sometimes one of them 726 00:44:21,560 --> 00:44:23,560 Speaker 4: can sort of sense what the other person is thinking, 727 00:44:24,239 --> 00:44:26,920 Speaker 4: and they can sort of each control different parts of 728 00:44:26,960 --> 00:44:29,600 Speaker 4: the other person's body. Kind of like one of them 729 00:44:29,600 --> 00:44:32,520 Speaker 4: controls the left leg of the other one, the other 730 00:44:32,560 --> 00:44:35,040 Speaker 4: one controls the right arm of the other one.
I 731 00:44:35,080 --> 00:44:37,520 Speaker 4: forget the exact details, but it's kind of like, like 732 00:44:37,560 --> 00:44:41,720 Speaker 4: you said, kind of like we sometimes think, being conscious, 733 00:44:42,480 --> 00:44:44,320 Speaker 4: the only way to be conscious is to be conscious 734 00:44:44,360 --> 00:44:46,959 Speaker 4: like humans are right now. But there are other ways 735 00:44:46,960 --> 00:44:47,880 Speaker 4: that we can be conscious. 736 00:44:48,080 --> 00:44:50,799 Speaker 3: Well, I mean, even other non-conjoined twin studies yield 737 00:44:50,880 --> 00:44:55,520 Speaker 3: some pretty interesting results, like in terms of potentially some 738 00:44:55,640 --> 00:44:58,359 Speaker 3: kind of link, you know, where there's at the very 739 00:44:58,440 --> 00:45:01,879 Speaker 3: least a, what's the word I'm looking for, a kind 740 00:45:01,960 --> 00:45:04,879 Speaker 3: of intuition, you know, in terms of, like, that would, 741 00:45:04,960 --> 00:45:08,960 Speaker 3: you know, surpass normal intuition, maybe, between regular siblings. I've 742 00:45:08,960 --> 00:45:12,240 Speaker 3: met and known in my life multiple sets of identical twins, 743 00:45:12,280 --> 00:45:13,759 Speaker 3: and there's something to it. 744 00:45:13,760 --> 00:45:14,840 Speaker 2: It's very fascinating. 745 00:45:14,880 --> 00:45:16,920 Speaker 3: I would say that is a different kind of consciousness 746 00:45:16,960 --> 00:45:19,840 Speaker 3: in some ways. Maybe it's a product 747 00:45:19,840 --> 00:45:22,160 Speaker 3: of sharing the same space so much and, you know, 748 00:45:22,239 --> 00:45:24,759 Speaker 3: spending so much time around each other. But I have 749 00:45:24,880 --> 00:45:27,720 Speaker 3: seen some things that I have a hard time explaining 750 00:45:27,760 --> 00:45:30,160 Speaker 3: in terms of the way twins can kind of know 751 00:45:30,200 --> 00:45:32,160 Speaker 3: what each other are thinking and feeling.
752 00:45:32,600 --> 00:45:35,040 Speaker 4: Yeah, I mean, there's definitely a role for intuition there, 753 00:45:35,080 --> 00:45:37,799 Speaker 4: and you know, it doesn't have to be physical consciousness. 754 00:45:37,800 --> 00:45:40,000 Speaker 4: Like, you know, I kind of personally think that we 755 00:45:40,080 --> 00:45:42,680 Speaker 4: all kind of share as a human species some sort 756 00:45:42,680 --> 00:45:44,040 Speaker 4: of consciousness. 757 00:45:43,400 --> 00:45:46,680 Speaker 1: You know, through the collective unconscious. Yeah, like the 758 00:45:46,840 --> 00:45:50,839 Speaker 1: Jungian super-consciousness, right, right. Like, that just 759 00:45:51,160 --> 00:45:52,080 Speaker 2: adds to that. You're right. 760 00:45:52,080 --> 00:45:54,080 Speaker 3: I'm sorry, I didn't mean to interrupt, but that's a really good point. 761 00:45:54,120 --> 00:45:58,920 Speaker 3: The Internet is, in and of itself, a super-scaled 762 00:45:59,080 --> 00:46:02,400 Speaker 3: version of that. It just contributes to what we're talking about. 763 00:46:02,640 --> 00:46:05,000 Speaker 4: Yeah, it just really all depends on how you define 764 00:46:05,040 --> 00:46:06,040 Speaker 4: this word consciousness. 765 00:46:06,160 --> 00:46:06,319 Speaker 2: Mm. 766 00:46:07,200 --> 00:46:09,759 Speaker 4: Like if it means, like, you know, being able to 767 00:46:09,800 --> 00:46:13,239 Speaker 4: write poetry and understand Shakespeare, that's one where you might not 768 00:46:13,320 --> 00:46:15,320 Speaker 4: get very far. But you know, I've talked to 769 00:46:15,360 --> 00:46:21,719 Speaker 4: scientists who basically just define it as 770 00:46:22,680 --> 00:46:27,320 Speaker 4: one of our senses that keeps track of our internal state.
771 00:46:28,480 --> 00:46:30,719 Speaker 4: So like you have a visual sense that tells you, oh, 772 00:46:30,760 --> 00:46:33,120 Speaker 4: I'm in the room, there's a door over there, that's 773 00:46:33,120 --> 00:46:35,680 Speaker 4: an apple over there. You have kind of an 774 00:46:35,719 --> 00:46:38,040 Speaker 4: inner-looking sense that just tells you like, oh, I'm feeling 775 00:46:38,080 --> 00:46:41,760 Speaker 4: this way, I'm thinking about this, I'm having this memory 776 00:46:41,800 --> 00:46:44,480 Speaker 4: of the apple I ate this morning. It's just kind of 777 00:46:44,520 --> 00:46:46,839 Speaker 4: like something that tells your body, oh, this is what's 778 00:46:46,840 --> 00:46:47,880 Speaker 4: going on inside your brain. 779 00:46:48,120 --> 00:46:50,880 Speaker 3: Well, the key word there also is "I," you know, 780 00:46:51,080 --> 00:46:54,680 Speaker 3: and this idea of identity, and this idea of consciousness 781 00:46:54,719 --> 00:46:56,040 Speaker 3: revolving around 782 00:46:55,719 --> 00:46:58,000 Speaker 2: who we are, and who we are
783 00:46:57,760 --> 00:47:01,040 Speaker 3: being ultimately a collection of experiences that are in 784 00:47:01,080 --> 00:47:03,920 Speaker 3: many ways influenced by the society we live in, and, 785 00:47:04,080 --> 00:47:05,399 Speaker 3: you know, and if you want to take it further, 786 00:47:05,440 --> 00:47:07,880 Speaker 3: then, people trying to achieve enlightenment, the idea is to 787 00:47:07,920 --> 00:47:10,360 Speaker 3: sort of disconnect from all of those aspects and, like, 788 00:47:10,640 --> 00:47:14,120 Speaker 3: truly experience the spiritual part of what it is to 789 00:47:14,200 --> 00:47:15,799 Speaker 3: have a soul, or, like, you know, 790 00:47:15,880 --> 00:47:18,759 Speaker 3: what it means to be part of the universe, rather 791 00:47:18,800 --> 00:47:21,359 Speaker 3: than this identity, this construct that we sort of, you know, 792 00:47:21,600 --> 00:47:25,160 Speaker 3: force upon ourselves, or that is forced upon us oftentimes. 793 00:47:25,200 --> 00:47:26,840 Speaker 2: It's fascinating. Obviously I'm super into this. 794 00:47:28,040 --> 00:47:30,959 Speaker 4: Well, this gets... We just did an episode about 795 00:47:30,960 --> 00:47:35,480 Speaker 4: this on Science Stuff, the podcast, about near 796 00:47:35,560 --> 00:47:36,800 Speaker 4: death experiences. 797 00:47:37,160 --> 00:47:38,120 Speaker 2: Oh, we just 798 00:47:38,080 --> 00:47:41,919 Speaker 3: talked to an incredible podcast creator and friend of the show, 799 00:47:42,000 --> 00:47:43,600 Speaker 3: Dan Bush, on Stuff They Don't Want You to Know, 800 00:47:43,640 --> 00:47:47,040 Speaker 3: about his incredible podcast Alive Again. That is all about 801 00:47:47,080 --> 00:47:50,279 Speaker 3: interviews with folks who've had near death experiences. 802 00:47:50,560 --> 00:47:53,279 Speaker 1: I'm going to connect you with those folks.
You guys 803 00:47:53,280 --> 00:47:56,279 Speaker 1: should hang out. Can you tell us just a bit 804 00:47:56,320 --> 00:48:00,360 Speaker 1: of a tease, as we wrap up, of what you found 805 00:48:00,400 --> 00:48:05,319 Speaker 1: in your explorations on NDEs, or near death experiences, on 806 00:48:05,440 --> 00:48:06,200 Speaker 1: Science Stuff? 807 00:48:06,719 --> 00:48:08,439 Speaker 4: Yeah, well, it kind of goes back to this idea 808 00:48:08,480 --> 00:48:11,080 Speaker 4: of what consciousness is, because, you know, a big 809 00:48:11,120 --> 00:48:14,759 Speaker 4: part of near death experiences is this out of body experience. 810 00:48:14,920 --> 00:48:17,840 Speaker 4: People feel like they're outside their body, and what we 811 00:48:17,920 --> 00:48:22,680 Speaker 4: found was that it can all be explained 812 00:48:22,960 --> 00:48:27,719 Speaker 4: by science, by how your brain works. But whether 813 00:48:27,880 --> 00:48:30,600 Speaker 4: that's actually what's going on, like, you know, scientists can't 814 00:48:30,600 --> 00:48:33,480 Speaker 4: answer that, because, you know, we can't test someone while 815 00:48:33,480 --> 00:48:37,640 Speaker 4: they're having the experience. But basically all of these phenomena 816 00:48:37,840 --> 00:48:40,560 Speaker 4: of near death, you know, feeling outside of your body, having 817 00:48:40,600 --> 00:48:46,040 Speaker 4: weird visions, talking to people who are already dead, there 818 00:48:46,120 --> 00:48:49,319 Speaker 4: are brain processes where you can say, okay, I think 819 00:48:49,360 --> 00:48:51,840 Speaker 4: that's what's going on there, and we can replicate 820 00:48:51,880 --> 00:48:53,680 Speaker 4: that in the lab. If I give you, you know, 821 00:48:54,320 --> 00:48:58,400 Speaker 4: a hallucinogen, well, in a controlled environment, you're also going 822 00:48:58,440 --> 00:49:01,040 Speaker 4: to have these experiences.
If I take a machine that 823 00:49:01,120 --> 00:49:03,319 Speaker 4: disrupts this part of your brain, I can make you 824 00:49:03,360 --> 00:49:05,520 Speaker 4: feel like you're stepping outside your body. 825 00:49:05,640 --> 00:49:11,520 Speaker 1: Or experiencing divinity, like the famous god helmet experiments. And 826 00:49:11,520 --> 00:49:15,560 Speaker 1: also, very well done, doctor Cham, to note that 827 00:49:15,600 --> 00:49:22,600 Speaker 1: we cannot ethically pursue some direct experiments that would lead 828 00:49:22,680 --> 00:49:29,120 Speaker 1: to breakthroughs there, because it would require doing kind of 829 00:49:29,200 --> 00:49:32,719 Speaker 1: evil things to innocent people, even if they signed up. 830 00:49:32,760 --> 00:49:38,520 Speaker 1: You know, we've all seen Flatliners. Yeah, classic Julia Roberts movie. Yeah, 831 00:49:38,600 --> 00:49:43,799 Speaker 1: oh yeah, that was a joy, that was, yeah, a young Sutherland, 832 00:49:43,920 --> 00:49:47,080 Speaker 1: yeah yeah yeah. And, uh, you know, I've got to 833 00:49:47,120 --> 00:49:50,719 Speaker 1: be honest. Uh, it's a super up to date pop 834 00:49:50,760 --> 00:49:55,759 Speaker 1: culture reference, I'm sure, but I remember seeing Flatliners and 835 00:49:57,080 --> 00:50:03,799 Speaker 1: being convinced that this is why people join med school. 836 00:50:03,520 --> 00:50:05,719 Speaker 2: So they can do the flat... so they can do Flatliners. 837 00:50:05,840 --> 00:50:11,680 Speaker 1: Yeah. Luckily, luckily, my uncle, a very nice, very learned man, 838 00:50:12,520 --> 00:50:16,600 Speaker 1: assured a young Ben Bullen that it would still be 839 00:50:16,719 --> 00:50:21,640 Speaker 1: illegal to quote unquote flatline people to quote unquote 840 00:50:21,680 --> 00:50:22,920 Speaker 1: see what happens.
841 00:50:23,200 --> 00:50:26,719 Speaker 2: Well, and speaking of other cinematic masterpieces, we were talking 842 00:50:26,760 --> 00:50:28,880 Speaker 2: a little earlier about the idea of vibes, and I 843 00:50:28,920 --> 00:50:31,479 Speaker 2: have to take this opportunity to recommend the movie Vibes. 844 00:50:31,520 --> 00:50:33,640 Speaker 2: Oh my god, it's one of your favorites, Ben. I 845 00:50:33,640 --> 00:50:36,120 Speaker 2: actually recommended it to a friend the other day. 846 00:50:36,800 --> 00:50:41,720 Speaker 3: Seminal Ghostbusters slash Indiana Jones mashup slash ripoff that everyone 847 00:50:41,719 --> 00:50:46,480 Speaker 3: should see, starring Jeff Goldblum and Cyndi Lauper. No, 848 00:50:47,480 --> 00:50:52,600 Speaker 3: true. No, Vibes. Well, get you to a cinema, or hey, 849 00:50:52,760 --> 00:50:55,759 Speaker 3: hopefully they're doing a revival screening of Vibes somewhere in 850 00:50:55,800 --> 00:50:56,640 Speaker 3: your neck of the woods. 851 00:50:56,800 --> 00:50:59,560 Speaker 4: I think I missed that seminal moment in neuroscience. 852 00:50:59,840 --> 00:51:03,280 Speaker 2: It's okay. Yeah, it explains a lot. It's a real breakthrough. 853 00:51:05,120 --> 00:51:08,439 Speaker 1: It's sort of like the Police Academy four of its time. 854 00:51:09,800 --> 00:51:12,960 Speaker 2: It's probably contemporary with Police Academy. It probably is, actually. 855 00:51:13,200 --> 00:51:15,439 Speaker 2: It might have come out the same year. 856 00:51:15,840 --> 00:51:19,080 Speaker 1: Thank you so much, Jorge, for spending time with us 857 00:51:19,080 --> 00:51:23,600 Speaker 1: and making this Jorge and Science Stuff week. Where can 858 00:51:23,760 --> 00:51:27,680 Speaker 1: people learn more about your explorations? 859 00:51:28,160 --> 00:51:31,320 Speaker 4: Yeah, so right now my big project is Science Stuff. 860 00:51:31,400 --> 00:51:34,000 Speaker 4: It's a new iHeart podcast.
You can find it anywhere 861 00:51:34,040 --> 00:51:37,200 Speaker 4: you get your podcasts. Search for Science Stuff 862 00:51:37,600 --> 00:51:40,640 Speaker 4: and look for the purple icon. That's us. And we 863 00:51:41,520 --> 00:51:46,279 Speaker 4: ask awesome questions like, do animals understand death? Or 864 00:51:46,360 --> 00:51:48,600 Speaker 4: do they like to get drunk? Or what's inside of 865 00:51:48,640 --> 00:51:54,239 Speaker 4: a black hole? Or, oh, regrowing limbs, yeah, that's the 866 00:51:54,440 --> 00:51:57,640 Speaker 4: one that was super fascinating recently: why can't we regrow limbs? 867 00:51:57,600 --> 00:51:59,839 Speaker 4: There are animals where, like, you cut off their arm, 868 00:52:00,040 --> 00:52:02,560 Speaker 4: they just grow a brand new one. Why can't we 869 00:52:02,600 --> 00:52:05,440 Speaker 4: do it? And we find out the answer is maybe 870 00:52:05,440 --> 00:52:05,799 Speaker 4: we can. 871 00:52:06,360 --> 00:52:11,440 Speaker 1: And while you are on the internet, please do check 872 00:52:11,520 --> 00:52:18,040 Speaker 1: out one of our favorite aspects of doctor Cham, Jorge. 873 00:52:18,280 --> 00:52:23,800 Speaker 1: In addition to being one of, if not the smartest 874 00:52:23,840 --> 00:52:27,200 Speaker 1: people in the history of this show, definitely in this episode, 875 00:52:27,960 --> 00:52:34,719 Speaker 1: you are not just a Stanford-graduated mechanical engineer. You 876 00:52:34,880 --> 00:52:39,240 Speaker 1: did not just attend Georgia Tech, one of the most 877 00:52:39,239 --> 00:52:44,480 Speaker 1: difficult schools of its caliber. You are also the creator 878 00:52:44,520 --> 00:52:47,520 Speaker 1: of a comic strip called PhD Comics. Could you tell 879 00:52:47,600 --> 00:52:48,680 Speaker 1: us a little bit about that? 880 00:52:49,360 --> 00:52:52,360 Speaker 4: Yeah, PhD Comics. You can find it at PhD Comics 881 00:52:52,440 --> 00:52:54,799 Speaker 4: dot com.
It's a comic strip I started when I 882 00:52:54,840 --> 00:52:58,239 Speaker 4: was in grad school, and it's all about what it's 883 00:52:58,360 --> 00:53:01,200 Speaker 4: like to do science, what it's like to be an academic. 884 00:53:01,360 --> 00:53:05,520 Speaker 4: It's kind of, people describe it as the Dilbert of academia. 885 00:53:06,080 --> 00:53:07,600 Speaker 2: I don't know this, guys. I'll have to check it 886 00:53:07,600 --> 00:53:10,280 Speaker 2: out. I need this in my life. 887 00:53:10,520 --> 00:53:12,520 Speaker 4: Yeah. Yeah, and so I was super lucky to be 888 00:53:12,560 --> 00:53:14,960 Speaker 4: able to do that for many, many years on the internet, 889 00:53:15,280 --> 00:53:18,080 Speaker 4: with a huge amount of support from people out there, and then 890 00:53:18,080 --> 00:53:21,239 Speaker 4: that translated to me doing movies and then a TV show. 891 00:53:21,280 --> 00:53:24,480 Speaker 4: Recently you can find that one on PBS Kids. It's 892 00:53:24,480 --> 00:53:28,600 Speaker 4: called Eleanor Wonders Why. And I've also gone on to 893 00:53:28,719 --> 00:53:31,920 Speaker 4: write and draw a lot of books. So probably 894 00:53:31,960 --> 00:53:34,600 Speaker 4: my most famous one I've worked on is called We 895 00:53:34,680 --> 00:53:37,400 Speaker 4: Have No Idea, which is a guide to everything we 896 00:53:37,440 --> 00:53:40,960 Speaker 4: don't know about the universe. And now the most popular 897 00:53:41,000 --> 00:53:44,319 Speaker 4: one is something called Oliver's Great Big Universe, which is 898 00:53:44,360 --> 00:53:47,160 Speaker 4: for kids. If you have a kid who's really curious 899 00:53:47,200 --> 00:53:50,839 Speaker 4: and likes science but also likes, you know, fart jokes 900 00:53:50,920 --> 00:53:55,719 Speaker 4: and really fun, fun middle school, middle grade stories, please 901 00:53:55,800 --> 00:53:56,319 Speaker 4: check it out.
902 00:53:56,880 --> 00:54:01,840 Speaker 3: A polymath and renaissance man, indeed. Doctor Jorge Cham, thanks 903 00:54:01,880 --> 00:54:03,720 Speaker 3: again for joining us on Ridiculous History 904 00:54:03,760 --> 00:54:08,680 Speaker 5: for Jorge Cham week. Oh, thank you. And well, well, 905 00:54:08,800 --> 00:54:17,279 Speaker 5: well, bully for us, congratulations, and tally ho. Yes, 906 00:54:17,440 --> 00:54:20,080 Speaker 5: Noel, we once again, my friend, uh, 907 00:54:20,520 --> 00:54:26,040 Speaker 1: we managed to speak with a world class expert, uh, 908 00:54:26,160 --> 00:54:31,480 Speaker 1: in science, and I think we posed some interesting questions. 909 00:54:31,680 --> 00:54:33,600 Speaker 3: I think so. I think we held our own with 910 00:54:33,719 --> 00:54:36,720 Speaker 3: doctor Cham, Jorge to his friends. I'd like to think 911 00:54:37,080 --> 00:54:39,879 Speaker 3: that we walked away from these recordings as friends. He 912 00:54:39,920 --> 00:54:41,360 Speaker 3: said, like, he wanted to come back on again, and 913 00:54:41,400 --> 00:54:42,800 Speaker 3: that we made him laugh and smile. 914 00:54:43,040 --> 00:54:45,280 Speaker 2: That made us feel really good. That's true. 915 00:54:45,360 --> 00:54:52,000 Speaker 5: Our neurons were firing, weren't they, whichever part is associated 916 00:54:51,719 --> 00:54:53,879 Speaker 1: with learning and with, uh... 917 00:54:53,880 --> 00:54:54,440 Speaker 2: And joy? 918 00:54:54,680 --> 00:54:58,000 Speaker 1: And joy! That's the word we were looking for. Uh, 919 00:54:58,200 --> 00:55:01,280 Speaker 1: we, I don't know, Noel, this guy is so close 920 00:55:01,360 --> 00:55:05,360 Speaker 1: to getting a cool Ridiculous History street name, a nickname, 921 00:55:05,600 --> 00:55:06,600 Speaker 1: an operator name. 922 00:55:06,800 --> 00:55:10,319 Speaker 2: You know, it's doctor Jorge "Riverside" Cham. Oh, that's true, 923 00:55:10,360 --> 00:55:11,239 Speaker 2: he made his own.
924 00:55:11,520 --> 00:55:12,120 Speaker 5: There we go. 925 00:55:12,400 --> 00:55:15,160 Speaker 3: Nah, we can do better, we'll workshop that one. But 926 00:55:15,239 --> 00:55:18,200 Speaker 3: for now, huge thanks to you, Ben. That was a 927 00:55:18,200 --> 00:55:21,719 Speaker 3: fun exploration of all things heady and universal. 928 00:55:22,200 --> 00:55:25,480 Speaker 1: Huge thanks to you, Noel. Huge thanks to our super producer, 929 00:55:25,960 --> 00:55:31,200 Speaker 1: mister Max "Frictionless" Williams. Got a nice haircut there. Also, 930 00:55:31,360 --> 00:55:37,320 Speaker 1: big big thanks to AJ "Bahamas" Jacobs, Jonathan Strickland, aka 931 00:55:37,719 --> 00:55:42,359 Speaker 1: the Quister. Okay, yep, big thanks to him. 932 00:55:42,640 --> 00:55:45,520 Speaker 2: Oh yeah, of course. Sure, the rude dudes over at 933 00:55:45,600 --> 00:55:46,360 Speaker 2: Ridiculous Crime. 934 00:55:46,440 --> 00:55:48,640 Speaker 3: We've got Christopher Hasiotis and Eves Jeffcoat here in 935 00:55:48,680 --> 00:55:51,279 Speaker 3: spirit. You know what? I think 936 00:55:51,320 --> 00:55:57,680 Speaker 4: we'll see you next time, folks. 937 00:55:59,640 --> 00:56:03,520 Speaker 3: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 938 00:56:03,560 --> 00:56:05,720 Speaker 3: or wherever you listen to your favorite shows.