If you've ever been at home and wondered, Josh and Chuck, is it really worth going to see them perform live? The answer is a resounding yes, yes. And if you live in Vancouver, B.C., or anywhere near there, come on out to the Chan Centre on Sunday, March twenty-ninth to see us and find out for yourself. And then the next night, if you live around Portland, Oregon, you can go to the Arlene Schnitzer Concert Hall, and we'll be there ready to go on Monday, March thirtieth. That's right. You can get all ticket information at SYSK Live dot com. Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks. Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant over there, and there's guest producer Dylan sitting in this fine Wednesday morning of weirdness. Everything's out of whack and strange. I know, right. Hopefully our voices sound normal, Chuck. I was worried about that. So, not that anyone cares, but our regular Tuesday sesh got pushed because our computer took a dump, technical difficulties, and then we said, hey, let's just do it tomorrow morning. I had a few more technical difficulties, but here we are, going strong, buddy. I'm tired, are you? No, I'm okay. I've had enough coffee that I'm not tired. It's weird for us to record in the morning, and just everything's out of whack. I would call it eerie, I think, because you know, I do the Movie Crush Mini Crushes on Wednesday mornings usually. So usually I'm just making dumb jokes and cussing a lot with Noel. Right now, right, I gotta switch my brain back into G-rated mode. Yep, to talk about Rupert Sheldrake. That's what we're doing. And you can curse if you want, we'll beep it out, depending. Sheldrake! Yeah, I think that's how the scientific community refers to him. That's just one of those names, it seems like it should be yelled like that. And I like to call him Ruprecht, after Michael Caine in, um, Dirty Rotten Scoundrels.
Remember, the name he called Steve Martin at one point was Ruprecht. There's a cork on the fork. Yeah, man, that was a good movie. It was. So, um, we're talking about a different Ruprecht, or Rupert: Rupert Sheldrake, who is widely considered in the scientific community, um, a heretic, a fraud, a hoaxer, a pseudoscientist, all sorts of things. And normally we don't entertain, um, that kind of stuff, or specifically people who are considered as such, because we tend to be like, yeah, pseudoscience is not so great. But there's something about Rupert Sheldrake. Is that maybe a little bit? Something like that. But he is different in some ways. He kind of stands alone. He's got staying power, to say the least. He was first branded a scientific heretic back in nineteen eighty-one, and he's still around doing his thing, ticking off the scientific establishment. Um. But see, he's also, in certain circles, labeled as an open-minded scientist, and someone not afraid to kind of question the unquestionable, and someone who flies in the face of what some people call the scientific orthodoxy or dogma, where everything is so rigid that there is no room for new ideas. Right, and basically the scientific orthodoxy that you refer to, it kind of says: we are on basically the right track. We generally have the parameters kind of figured out. We know the math we need to be using. We know the places we need to be looking. We have, generally, in everything from physics to biology, an understanding of the general structure. Now it's just a matter of filling in the details. We looked under the hood, uh huh, and we know what's going on generally. We know that it's a combustion engine and not an electric car, that the universe is, that kind of thing. Electric car, right. So when somebody comes along and says, no, no, that's not even a car that you're looking at, that's a boat, um, the scientific establishment, or really any establishment, really tends to get shook by that kind of stuff. They don't like that.
And one of the reasons, I was trying to figure out why people get so invested in this: I think there's a lot of people who come and say, I am a person of science, I believe in this, I subscribe to it, and they end up going so far as to pin their identity to it. And this happens with just about any structure. And when you pin your identity to something, when that something, that structure, is attacked, you take it as a personal attack. And I think that's one of the reasons why a lot of people are so rabidly against Rupert Sheldrake. So should we talk about this guy? I think we should. I think, Chuck, we should explain. One of the reasons why he has such staying power, and what makes him different, is that he is about as trained a scientist as a scientist can be. Yeah. And you know, as we move through this, you'll see that what makes him stand out from kind of other, uh, kooks is that he's a very, very intelligent guy. He's not a kook. No. So that's why it's kind of like, that's why certain people listen to him. One of the other things I really want to point out at the start of this, and this is what really differentiates him from a lot of people on the fringes today, is he's not an a-hole. He's very polite, he's very calm, he's very measured. He doesn't engage in ad hominem attacks against his critics. He engages with his critics. He's actually a very congenial person. Yeah, he's just on a different side of the coin from the scientific establishment in almost every respect. Yeah. And when I've read articles and interviews with other people from the establishment that have hung out with him and done experiments, they're all like, he's a really affable, kind of fun guy. Even though, when you look at him and when you hear "Rupert Sheldrake," it doesn't scream fun and affable. No, but he is. Yeah, he's got a lampshade on his head nine-tenths of the time.
Man, can you imagine the parties, the lab parties? Beer bong. Alright. So he started out his career, um, kind of right down the middle of science, science-wise. Went to Cambridge as an undergrad. He won a botany prize there, the University Botany Prize. He then went to Harvard to study philosophy, he studied the history of science, went back to Cambridge. Yeah, apparently he's just like a savant when it comes to science history. Went back to Cambridge, got his PhD in biochemistry, and then a postdoc with the Royal Society, uh, in plant development and the aging of cells. So I think that's unassailable, unimpeachable. It really isn't. Had he just kind of continued along, this is largely in the seventies, had he just kind of continued along this path, um, he probably would have been a really widely respected, although pretty obscure, plant scientist or biologist of some sort. But one of the things that happened to him was he went to India and studied and lived at an ashram for about a year and a half, and apparently smoked a lot of hashish while he was there. The hashish part? Oh, I'm just goofing, but surely he did. But the thing is, around this time, um, he elaborated on an idea that he'd had, that he learned about probably in his history of science classes, um, that science can't explain how you can take some cells that start out as like a seed or something like that, and that little seed grows into an oak tree, and that oak tree looks startlingly similar to other oak trees that, you know, you can dig up from a thousand years ago, or imagine that they'll basically look like a thousand years from now, or that are spread out on different continents. Science can't explain how, um, morphology works. That is, how something becomes the thing that it is, and that it resembles something else. And you say, well, it's genetics, like, that's kind of the common thing.
But here we get to that point where science is like, we've got the broad strokes, we just don't understand the details. And genetics can possibly be the thing that explains this later on, but we really have no idea how this stuff works, because it's really, really intricate how something like that happens. Yeah, it's almost like Sheldrake was like, uh, Tom Hanks in Big, in the boardroom when they're talking about the toys, and he's just like, I don't get it. Yeah, like, because they'll say, oh, well, it's DNA, and he's like, yeah, but I don't get it. Like, how does a tulip become the tulip? Well, it's DNA. Yeah, but that really doesn't explain it at all. Well, it's DNA. We understand DNA. He's like, yeah, I don't get it. Yeah. And to him specifically, DNA is a chemical that, um, dictates how other chemicals are produced. Right, he thinks it's very overrated. He does, um, which in and of itself is heretical. But it's pretty funny too. It is. But with this, with morphology, with how something takes the shape that it eventually has in its mature state, there's a lot going on there. There's like little cells that have to set up and arrange in a certain pattern that later on down the road, after all these processes play out, will form another pattern. So there's basically planning. There's, um, timing, like all of that process has to happen at just the right steps and just the right stages for that end result to be what it's supposed to be. There's differentiation of cells, where one cell can produce a new cell and the new cell has totally different genes turned off or on that will allow it to specialize. And these are the things we don't understand, what's guiding it.
And so Rupert Sheldrake kind of tapped into a thought that started, I think, back in the nineteen twenties among biologists, that there must be some unseen guide or force that basically says, I've got this, I know what the end result is, I can take the starting bit and guide it into this end result, and we don't understand what that is. Yeah, there were a couple of scientists in the twenties and thirties studying what they called, uh, morphogenetic fields, which is sort of like the idea that there's this invisible mold that we don't fully understand that gives the shape to these things. Uh, a guy named C. H. Waddington in nineteen thirty-six had a paper on morphogenesis and the field concept, and then a Russian biologist named Alexander Gurwitsch, um, kind of had the same thoughts. But you know, I think he came independently to these thoughts, yeah, which was, hey, there's something else going on here, we're calling it morphogenetic fields. Uh, and this is, like I said, this idea that there are these invisible molds that we don't fully get that give things their eventual shape, and that's why they all look alike. Right. So on the ashram in the late seventies, early eighties, Sheldrake was kind of vibing on this idea of, there must be some field, these morphic fields or whatever, that guide the development of something living into its mature form, because we just don't understand it. So hey, maybe that's just as good an explanation as our current understanding, which is really nonexistent. So, um, he took it further, though, and he wrote a book. He took it further, he wrote a book called A New Science of Life. It was his first book, um, as far as I know; at the very least, it was his first book that really kind of made a splash. And in it, um, he kind of said, these morphogenetic fields, we're gonna call them morphic fields now.
And not only do they guide the morphogenesis of a living thing, they guide its behavior from that moment on, from the moment of conception on to, I guess, its death. And then when that thing dies, the life that it's led will contribute to this morphic resonance that carries on to the next generation and the generation beyond that. And so you eventually have this long line of tulips that know not only how to grow into the right shape, but how to behave and do all the things that tulips do, because of all of the living tulips that came before it, through this process of morphic resonance. Yeah, and not just like, uh, that tulip growing nearby at the same time, but he said, what if it just was across all of space and time, and the tulip in Africa in the nineteenth century has informed the tulip in Florida in the year, how to grow, right. And everyone went, ooh, good hashish over there in India in the nineteen seventies, right, Sheldrake? Right. So, um, yeah. Well, we'll get to how it was received in a minute, but you want to take a break and then come back and kind of explain how he says it works a little more? Yeah, but I also think I totally spoiled how it was received. But that's okay, that's all right, man. All right, okay. So we're at the point where, um, Rupert Sheldrake has published his book A New Science of Life, and in it he's talking about this morphic resonance that basically says, um, anything that self-organizes, from a molecule to a giraffe, um, knows how to take the shape, or is guided by a process that shapes it, called morphic fields. But even more than that, its behavior, its future behavior, is shaped by these same morphic fields. All of the things that the giraffes that came before it learned and knew and saw and ate and figured out becomes this kind of body of consciousness that's passed along to every new giraffe that's born.
Yeah, I think we should read this quote. Okay, there's a great interview in Scientific American. Uh, who was it? Who was it interviewing him? I can't remember now. Was it Rose? Now I'm not sure. It was a contemporary who was more, you know, traditional mainstream science, but he again was like, this Sheldrake guy, he's got something, he's got a quality. Alright, so here's how Sheldrake himself answers the question of morphic resonance: "Morphic resonance is the influence of previous structures of activity on subsequent similar structures of activity organized by morphic fields. It enables memories to pass across both space and time from the past. The greater the similarity, the greater the influence of morphic resonance. What this means is that all self-organizing systems, like molecules, crystals, cells, plants, animals, and animal societies, have a collective memory on which each individual draws and to which it contributes." And here's the key here, I think. He says, "In its most general sense, this hypothesis implies that the so-called laws of nature are more like habits." Yeah, the scientific establishment really particularly doesn't like that last bit right there. Yeah, Sheldrake just, uh, called out the laws, the so-called laws of nature. Right. So there's something in there that kind of stuck out to me, that I was curious about and couldn't find an answer to, is that, um, he says the greater the similarity, the greater the influence of morphic resonance. But what is the similarity, say, in like a giraffe embryo, that allows the morphic resonance of all the giraffes that came before to be like, this, this is the thing we need to exert our influence on? Like, what similarity attracts that morphic resonance? I took that to mean, maybe not in the case of giraffes, but in the case of, like, different varieties of an orchid, like, the more similar, you know, because that's why they are all different ones.
Yeah, but what is the initial similarity that that morphic field recognizes in that specific kind of orchid that says, oh, I'm going to influence you? Or is it just, we should call him, that it just naturally happens? I don't know, but these are the questions that you start to wonder about when you read Sheldrake's stuff, which is, I think, the reason why I like him. Um, like, it just makes you think. You just start to think differently than just, like, it's DNA. Yeah. Where are you with this guy, overall? I am sympathetic to him, because I admire that he has a tremendous amount of courage, uh, and willingness to take tons of flak, and I'm sure, in this day and age, lots of hate and threats. Um, I think that I am critical of the fact that he stopped publishing peer-reviewed papers all the way back in the mid-eighties. That makes him currently less of a scientist and more of a science communicator. But he's also kind of making up his own science too, so I don't know if he qualifies as a science communicator. But I generally like him, and I appreciate the role that he, um, he plays in this, uh, with science. What about you? I'm kind of with you there. I admire his chutzpah, um, because I don't think that he is a charlatan out just to make money selling books, like some people think. I don't either. I think he's a really smart guy who has given his whole life to deep, deep thought and research on this stuff, and I read some of it and I think he may be onto something. I read other stuff and I think, that sounds like magic. Right, right. Uh, and we are men of science, you know. Well, we are podcasters, but we have always, you know, roundly sided with the scientific method as sort of the baseline. And if you can't satisfy the scientific method, then, uh, we typically kind of pooh-pooh it.
But there's something, again, about the way he's gone about it that just doesn't seem like he's just some wacko out there making stuff up. Yeah. And I think he also kind of tunes into something that I dislike, which is, you know, he's really critical of and really challenges, you know, the hardened dogma of a lot of the scientific community, where it's like, this is just how it is. Well, why? I don't know, but I was taught that, but that's just how it is, and stop questioning it. And I really dislike that, and I like that he challenges that as well. Yeah, there's a rigidity in science that turns us both off, I think. Um, so turned off right now. I was gonna make a joke, but I'm not going there, because I'm in the Mini Crush mode. So right on, right on, keep it, keep it, keep it in. All right. So let's look at a few examples of claims that he makes about, uh, things that he thinks morphic resonance might explain in nature, um, specifically with animal behaviors. He says things like fish schooling, um, butterflies, monarchs flying thousands of miles to, uh, the same place, homing pigeons, um, termites in Africa that are blind, that build, you know, a ten-foot-tall nest with ventilation structures. He said all this stuff, or more importantly, and we'll look at this a little closer in a minute, a dog and their owner, and a dog anticipating their owner's return, even though it might vary on what time that happens. Like, the sense that the dog knows and is waiting by the door. He thinks that's all explained by morphic resonance. Yeah, and I mean, like, it's curious in that, you know, how does a bee know, after it makes that wax ring in a honeycomb, how does it know to melt it into, um, a polygon shape rather than just a circle? Or like those termites, like, why does a termite nest look almost identical to other termite nests?
You know? Yeah, exactly. Like, there's a lot of behaviors that we can't quite explain that, if you do kind of buy into this morphic resonance idea, you could say, well, that's actually really, really interesting. Now, and this is a real good criticism of morphic resonance, you could also just as equally say magic, or God, or whatever. No one's proven that morphic resonance exists. This is just Sheldrake saying, here's some good examples of what I'm talking about, this morphic resonance stuff. Yeah. And this is kind of important too. He talks about the fact that humans are not as sensitive to this, because, and this is where he kind of got me a little bit thinking, he says we're so distracted by technology, and we don't need collective memory of past humans to survive anymore, so that's why we can't really sense these fields. And I kind of disagree with that in some ways. Like, I think, if it does exist, it still survives in humans, in things like, think about how, um, easily the average human can pick a snake out of the grass with peripheral vision. Right. I wasn't raised around snakes. My parents didn't drum it into my head to be really wary of all snakes. And yet I'm a pro at picking a snake out in the grass with my peripheral. Sure. And it's been shown that people can, like, pick a gun out as quickly as they can pick out snakes and spiders, and we're really good at picking out snakes and spiders in our environment. Um, and, uh, this would be a pretty good example of that, if you ask me. That is the most common descriptor, I think, when people say, what's Josh like? I'm like, he drinks a lot of beverages, coffee, water, you know, energy drinks. He, uh, he's a hard worker. And man, you should see that guy pick a snake out of his peripheral vision. It's uncanny. It makes a gunshot ricochet sound, like.
I don't go for a walk in the woods without him anymore. Sometimes you're nice and carry me on your back when I get tired. So, um, here's a couple of things with human morphic resonance, and this is where it gets a little wacky to me. Um, he claims that a crossword puzzle is easier to complete later in the day because of all the other people that had solved it earlier in the day, and they are broadcasting this morphic resonance out into the universe, I guess just their general awareness of the answers. That gets a little wacky to me. Yeah, a little. The other one is not as wacky. Um, is that feeling, eyes in the back of your head, like you're being stared at? That's a thing? Uh, he says that's morphic fields. Yeah, that your morphic field extends beyond your head, and that it's sensitive and is the first thing contacted by that person's stare, and it lets you know, basically, that you're being stared at. Yeah, and this is where, just as he's going in the right direction, then I hear that and I think, oh boy, that sounds a little wacky. I read another really good explanation for that, um, that it's a self-fulfilling thing, where, um, say you're in the library or whatever, and you get the sense that you're being stared at by somebody at a table behind you, and when you start to turn around, the movement of your head catches a person's attention. Interesting. And when you look, when you finally complete that turn, that person is looking at you. That makes sense, especially if you're like, oh, for God's sake, can you turn around? Right? Yeah, then they're definitely going to look the next time you turn, too, because they're keeping an eye on you. That's right. Or they're just looking for snakes. So, Charles, um, as you kind of said earlier, this has not all been very well received by the scientific community. They tend to think of it as hokum. Um.
The fact that he doesn't publish peer-reviewed papers anymore, and instead writes books directly to the public, um. The fact that, uh, they claim that his stuff isn't falsifiable, but if you read his explanations and descriptions, he's like, no, actually, this all is falsifiable, and I try to run experiments all the time; sometimes it comes back with positive results. Um, but they generally don't like the stuff that he's saying. And in particular, there was one guy who, looking back, made Rupert Sheldrake's career, and his name was Sir John Maddox. And at the time that, um, uh, what was it, A New Science of Life, when that book came out, uh, Sir John Maddox happened to be the editor of the journal Nature. Nature and Science are the two most prestigious scientific, peer-reviewed publications in the entire world. And this guy, right, was the editor of that, and he got his hands on A New Science of Life and wrote not just a book review, an editorial about this book from the editors of Nature, claiming that it was an infuriating tract and that it was the best candidate for burning there has been for many years. Yeah. Also, in an interview, he said, Sheldrake is putting forward magic instead of science, and that can be condemned, in exactly the language that the Pope used to condemn Galileo, and for the same reason: it is heresy. Right. So if you were curious about how Sir John Maddox felt about the dogma of science, the fact that he used the word heresy kind of says it all, right. And this is thirteen years after that first one, and he poked the Pope. Yeah, he was doubling down on this, and he didn't mention that it turned out Galileo was right, even though he positioned himself and science in the Pope position in this one. The fact is he used the word burning.
He and his defenders later on would say, like, no, if you read the whole thing, at the end he says, no, we shouldn't be burning books. But he does say that there hadn't been a better candidate for it, but if we were to, it'd be the first one on the pile, right. And so, you know, from that point on, Rupert Sheldrake's publishers are like, we'll be using that on the dust jacket of every edition of this from now on. And it made his career. He went from somebody who might have never been anybody to the premier heretic of science, thanks to that dusty old crotch, Sir John Maddox. Yeah, he's a dusty old crotch. Yeah, I'm not a big fan of him, man, or anybody who suggests we should burn books. So here's another quote, from another professor of biology, at University College London, Lewis Wolpert: "Morphic resonance is rubbish. It is unmitigated junk and a great insult to the people who do real work in the field." Yeah. And I'm sure we could spend the next twenty minutes finding quotes like that, um, about that book. Yeah. And Sheldrake's response has always been, um, I mean, he'll go back at people, for sure, but not in a sort of a poopy-pants way. Uh, he basically is like, you know, any idea that doesn't conform to this religion of science is denounced. And he said it's closed-minded, it's a closed-minded system. It goes against the nature of what science is, which should be discovery and investigating hypotheses, uh, and the fact that they're valid until they're proven or disproven by experiment. Uh, so get off my back with your dusty crotch. And in fact, um, some scientists in the field, or in a number of fields, have kind of come to Sheldrake's defense. Not so much that they've criticized John Maddox; I get the impression that you don't criticize Sir John. Unless you're Josh Clark, right. Yeah, well, I'm not a scientist. I've got no skin in this game.
But, um, they came to Sheldrake's defense in that, um, they said, okay, if you're saying these are falsifiable, let's do some experimentation. Let's take this to the point you're putting it at, and let's apply the scientific method to this. Yeah, here, smoke this hash. Right. No, Sheldrake said that to them, like, okay, for the data to make sense, you've got to smoke this first. That's right. And they went, oh, okay, right, yeah, I got it. But that's why everybody likes to hang out with him, um, because he's got the good stuff. Right. All right, so should we talk about, um, a little bit about what he claims and what he's tried to prove? Yeah, because, so again, like, he's run these experiments. But there have been, I just want to say, there have been a few people who have come up and been like, you know, that was BS what Sir John said, we shouldn't be burning books. I'm going to extend an olive branch on behalf of the scientific community, and we're going to test some of these experiments. Yeah. So he drilled down on a few, in a few different areas, um, that we're going to talk about. One is the one I talked about, about humans being stared at. The other is the dogs anticipating their owners' return, right, basically human-dog telepathy. Right. Uh, and then, yeah, boy, as soon as that word telepathy is thrown out there, that's a science killer. Yeah. And that's a, I mean, that's a big, easy criticism of Sheldrake's ideas, is that they include telepathy. That the idea that we're tuned into this general body of conscious knowledge that was accumulated by all the living things that came before us, and that this exists outside of our minds and we can connect to it with our minds, that's telepathy. And yeah, there's no way to put it otherwise. Well, and psi in general, which we should probably do a podcast on at some point.
Sure, I mean, we've been chipping away at it little by little. Uh. And then the third one that he kind of drilled down on was the idea that successive generations of lab rats can solve their little puzzles and problems faster and easier than the generations before, and that it's because of morphic resonance. Yeah, and there's been data; he's either carried out experiments himself or, you know, pointed out published data before that has shown that. I think back in the thirties, there was a, I guess a biologist or a psychologist, who was training, uh, rats how to, um, run a maze. And he found, to his amazement, that rats of successive generations, over like thirty-six generations, did better initially on these mazes than their predecessors, which would suggest, well, a lot of things, but apparently he controlled for genetics and environment, and said it's possible that this is somehow being passed down from one generation to the next outside of genes. So let's talk about the dog thing, because we have dogs, and we love to think that our dogs are little people, and that they sit by the door waiting on us and look out the window and are just sad until we get home. Uh, and so he did this experiment, and then later on did some more experiments with a partner, um, which we'll talk about here in a second. But he found a lady, a British woman, her name was Pam, and she had a dog named Jaytee, J-A-Y-T-E-E. And she said, hey, use me, because I've got this dog who waits by the window, uh, before I come home, no matter when I come home. So it doesn't matter if I come home at five or ten o'clock at night or three in the afternoon, this dog is by the window. So I think there's some telepathy going on. And Sheldrake said, well, step right up, and let's see what's going on here. Yeah. And it wasn't just that her dog sits by the window the whole time she's gone.
It's that people had noticed that he would suddenly sit up, go to the window, and then within ten minutes or something like that, Pam would come home. Right. And start singing True Colors by Cyndi Lauper. Right. And she would come home at different times of the day, like, this is a pretty, it was a remarkable thing. So, um, apparently, over a hundred different tests, Sheldrake found that eighty-four out of a hundred times this dog accurately predicted when Pam was coming home. And Sheldrake's whole hypothesis, within eleven seconds. Well, people should understand, it's not that this dog hears the car pull in; it's within eleven seconds of her leaving to go home, right, leaving her office, I think, miles away. And again, this is at different times of day. Um, they apparently experimented so that she would come home in different kinds of cars, including taxis, so that the dog couldn't somehow, like, hear this particular hum of Pam's motor or something like that. Um, but he controlled for a lot of stuff. And this is something you've got to understand about Rupert Sheldrake: he carries out scientific experiments, like, under the scientific method. What people disagree with is his interpretation of the data, typically. But he controlled for all this different stuff, and he found that eighty-four times out of a hundred, um, Jaytee accurately predicted roughly when, um, Pam was going to come home, by getting up and going to the window to wait for her. Right. Yeah, I think that first one, though, was not quite so scientific. Wasn't that the deal? This is why they redid it. No, no, they didn't redo it because it wasn't scientific. They redid it because one of his greatest critics, Richard Wiseman, who was a, um, uh, I think a psychologist, but also like a professional skeptic, um, said, this is BS, but let me replicate your experiment and see if I get the same results.
Okay, 588 00:34:31,920 --> 00:34:34,960 Speaker 1: because he had an Austrian documentary crew and they 589 00:34:34,960 --> 00:34:38,200 Speaker 1: said that the test wasn't scientific. Oh, I'm sorry, right, 590 00:34:38,320 --> 00:34:42,239 Speaker 1: they, like, they're, they're filming. What they did was not scientific. 591 00:34:42,280 --> 00:34:47,520 Speaker 1: But he had already previously carried that out in private. Yeah. 592 00:34:47,560 --> 00:34:49,680 Speaker 1: But I mean, you know, scientists do that all 593 00:34:49,680 --> 00:34:52,840 Speaker 1: the time. I don't think that he's, I don't think 594 00:34:52,920 --> 00:34:58,600 Speaker 1: he has been accused of fudging his methodology. I think 595 00:34:58,680 --> 00:35:03,520 Speaker 1: he's just roundly accused of cherry, cherry picking data, or 596 00:35:03,640 --> 00:35:07,720 Speaker 1: misinterpreting the data, or interpreting the data to suit his needs, 597 00:35:07,719 --> 00:35:10,080 Speaker 1: that kind of stuff. But I don't get the impression 598 00:35:10,080 --> 00:35:11,880 Speaker 1: that a lot of people are like, this data at 599 00:35:12,000 --> 00:35:15,680 Speaker 1: its core is hokum. All right. Well, Wiseman, like 600 00:35:15,760 --> 00:35:21,000 Speaker 1: you said, noted skeptic, professional poo-pooer of things, experimental psychologist. 601 00:35:21,560 --> 00:35:23,360 Speaker 1: He comes in and says, all right, let's do this together. 602 00:35:24,040 --> 00:35:28,000 Speaker 1: He said, all right, this dog did not, uh... On 603 00:35:28,040 --> 00:35:31,680 Speaker 1: four different occasions, or four different experiments, this dog failed, 604 00:35:32,640 --> 00:35:35,360 Speaker 1: right. Uh. And this dog is going to the window 605 00:35:35,480 --> 00:35:40,239 Speaker 1: a lot. He's a window hanger-outer. Yeah, this dog 606 00:35:40,360 --> 00:35:43,239 Speaker 1: loves that window. And so they said, all right, let's 607 00:35:43,320 --> 00:35:45,919 Speaker 1: rule out some of these false positives and let's say, 608 00:35:46,640 --> 00:35:50,399 Speaker 1: let's, let's define what the real signal would be. That's 609 00:35:50,440 --> 00:35:52,480 Speaker 1: if this dog stays at the window for two minutes, 610 00:35:52,520 --> 00:35:55,040 Speaker 1: not just pops up to see if it's raining or 611 00:35:55,080 --> 00:35:58,400 Speaker 1: to sing a stanza of True Colors, but really sits 612 00:35:58,440 --> 00:36:02,960 Speaker 1: there for two minutes, um, and then let's see what happens. 613 00:36:03,440 --> 00:36:07,600 Speaker 1: They did that, and Wiseman said, uh, all right, and 614 00:36:07,760 --> 00:36:09,680 Speaker 1: we also got to say it's got to be 615 00:36:09,719 --> 00:36:13,239 Speaker 1: within ten minutes of her leaving for home, not 616 00:36:13,320 --> 00:36:17,359 Speaker 1: eleven minutes, right, shaved off one minute. And in all 617 00:36:17,480 --> 00:36:21,400 Speaker 1: four of these experiments, this dog gives a signal before 618 00:36:21,440 --> 00:36:24,399 Speaker 1: that ten minute period, before she even started for home. 619 00:36:24,640 --> 00:36:28,360 Speaker 1: So Wiseman said failed.
So if you read Sheldrake's 620 00:36:28,920 --> 00:36:32,759 Speaker 1: rebuttal to Wiseman's findings, and Wiseman ran around not just 621 00:36:32,800 --> 00:36:35,720 Speaker 1: saying failed, he, like, gave I think four different talks 622 00:36:35,760 --> 00:36:38,720 Speaker 1: about this experiment, how, like, it didn't amount 623 00:36:38,719 --> 00:36:42,920 Speaker 1: to anything, um. And so Sheldrake responded to it 624 00:36:42,960 --> 00:36:47,040 Speaker 1: and he was like, well, this two minute duration was 625 00:36:47,200 --> 00:36:49,839 Speaker 1: an arbitrary signal that you came up with that wasn't 626 00:36:49,840 --> 00:36:53,200 Speaker 1: part of my original methodology. And then also, in I 627 00:36:53,239 --> 00:36:57,720 Speaker 1: think all four of those experiments, right, maybe all four, um, 628 00:36:57,800 --> 00:37:01,520 Speaker 1: the dog went to the window early, and, um, 629 00:37:01,600 --> 00:37:04,479 Speaker 1: and then afterward if he went to the window again, 630 00:37:04,480 --> 00:37:07,640 Speaker 1: which apparently he did to wait for Pam, that was 631 00:37:07,760 --> 00:37:10,280 Speaker 1: thrown out because he'd already gone to the window before. 632 00:37:10,280 --> 00:37:12,239 Speaker 1: He's like, well, I never said the dog only went 633 00:37:12,280 --> 00:37:15,600 Speaker 1: to the window when Pam was coming home. I just 634 00:37:15,640 --> 00:37:18,720 Speaker 1: said he would go to the window to wait for Pam, um, 635 00:37:18,760 --> 00:37:21,800 Speaker 1: you know, within some certain time frame of her leaving. 636 00:37:21,960 --> 00:37:23,920 Speaker 1: And apparently the dog continued to do this, but it 637 00:37:23,960 --> 00:37:27,200 Speaker 1: wasn't included in these tests because he had already gone 638 00:37:27,239 --> 00:37:30,319 Speaker 1: to the window. So it's really detailed and you can 639 00:37:30,440 --> 00:37:33,080 Speaker 1: read it yourself if you want to. But, um, he 640 00:37:33,160 --> 00:37:36,680 Speaker 1: has a good explanation for why Wiseman's interpretation of the 641 00:37:36,760 --> 00:37:39,960 Speaker 1: data was, you know, or his methodology was flawed. But 642 00:37:40,160 --> 00:37:43,479 Speaker 1: it's all very civil, like you were saying before. He's 643 00:37:43,520 --> 00:37:46,400 Speaker 1: not like, Wiseman's a moron who couldn't do science if 644 00:37:46,440 --> 00:37:49,600 Speaker 1: it sat on him and, and caused him to stop 645 00:37:49,680 --> 00:37:52,759 Speaker 1: respirating or anything like that. Yeah. And I think it's, uh, 646 00:37:52,840 --> 00:37:54,719 Speaker 1: the other thing, because, you know, the thing that 647 00:37:54,760 --> 00:37:57,240 Speaker 1: Wiseman poo-pooed was the fact that the dog started 648 00:37:57,280 --> 00:38:01,879 Speaker 1: this behavior before she started from home. And Sheldrake was like, hey, 649 00:38:02,000 --> 00:38:05,520 Speaker 1: I think that further proves it, actually, because I think 650 00:38:05,920 --> 00:38:09,160 Speaker 1: that Pam is sending signals before she starts for home 651 00:38:09,200 --> 00:38:13,359 Speaker 1: that she didn't even realize. Like, maybe she, she gets 652 00:38:13,360 --> 00:38:15,279 Speaker 1: her coat and goes to the restroom for a few 653 00:38:15,320 --> 00:38:18,200 Speaker 1: minutes or something.
Even... well, she was at the beginning 654 00:38:18,200 --> 00:38:20,680 Speaker 1: of the going home process, yeah, or she was with 655 00:38:20,719 --> 00:38:23,600 Speaker 1: Wiseman's assistant, and Wiseman's assistant was the one who knew 656 00:38:23,600 --> 00:38:26,359 Speaker 1: what time they were going home. So he said, 657 00:38:26,400 --> 00:38:28,360 Speaker 1: maybe Pam was picking up on the guy looking at 658 00:38:28,400 --> 00:38:30,279 Speaker 1: his watch or something like that and knew when she 659 00:38:30,360 --> 00:38:32,520 Speaker 1: was going to go home. Anyway, um, 660 00:38:32,560 --> 00:38:34,920 Speaker 1: he has a lot of explanations for it. It's very 661 00:38:34,960 --> 00:38:37,239 Speaker 1: interesting to kind of read the back and forth. But, 662 00:38:37,360 --> 00:38:40,520 Speaker 1: um, so, Wiseman won that one because 663 00:38:40,880 --> 00:38:43,520 Speaker 1: everyone wanted Wiseman to win that one. And, and I 664 00:38:43,560 --> 00:38:45,600 Speaker 1: think that's kind of par for the course for Sheldrake. 665 00:38:45,680 --> 00:38:48,640 Speaker 1: He's like, well, no, here's all these other explanations for 666 00:38:48,680 --> 00:38:52,040 Speaker 1: this interpretation, and people just kind of ignored it. Unless you 667 00:38:52,120 --> 00:38:55,359 Speaker 1: want to believe what Sheldrake has to say. Um, if 668 00:38:55,400 --> 00:38:57,279 Speaker 1: you don't want to believe what Sheldrake has to say, 669 00:38:57,480 --> 00:39:01,480 Speaker 1: people like Wiseman and other skeptics provide, you know, well, 670 00:39:01,520 --> 00:39:04,600 Speaker 1: here, we carried out this experiment and now this is disproven. Right. 671 00:39:05,160 --> 00:39:08,359 Speaker 1: Another guy who did that, a really, um, big 672 00:39:08,360 --> 00:39:10,840 Speaker 1: critic of Sheldrake, his name is Steven Rose. I 673 00:39:10,880 --> 00:39:13,680 Speaker 1: think he's a... no, I'm sorry, he's, yeah, he's a 674 00:39:13,680 --> 00:39:17,480 Speaker 1: biologist and a neuroscientist, Steven Rose. And, um, he carried 675 00:39:17,520 --> 00:39:20,480 Speaker 1: out another experiment about how chicks might be able to 676 00:39:20,600 --> 00:39:23,080 Speaker 1: learn, kind of like that lab rat experiment, how 677 00:39:23,440 --> 00:39:27,000 Speaker 1: successive generations learned, um, how to do a maze. Well, they 678 00:39:27,000 --> 00:39:29,680 Speaker 1: did this with chicks, and they did this experiment together, 679 00:39:30,080 --> 00:39:32,600 Speaker 1: and they had different interpretations of the data and it 680 00:39:32,640 --> 00:39:35,280 Speaker 1: went back and forth in different journals or whatever. 681 00:39:35,520 --> 00:39:38,320 Speaker 1: But the fact is there are scientists out there, skeptics 682 00:39:38,320 --> 00:39:42,120 Speaker 1: who are critics of Sheldrake and his ideas and methods, 683 00:39:42,160 --> 00:39:45,680 Speaker 1: but, um, still scientists who are willing to engage his ideas. 684 00:39:45,960 --> 00:39:47,799 Speaker 1: And I think that that's healthy. Even if they are 685 00:39:47,840 --> 00:39:50,239 Speaker 1: coming at it from, you know, the standpoint like, this 686 00:39:50,280 --> 00:39:52,719 Speaker 1: is bunk, this is hokum, they're still willing to go 687 00:39:52,800 --> 00:39:57,239 Speaker 1: through with these experiments. And I respect that. You want 688 00:39:57,239 --> 00:40:00,560 Speaker 1: to take another break?
Yes, all right, Chuck. We're going 689 00:40:00,640 --> 00:40:23,640 Speaker 1: to take another break, everybody, in case you didn't hear. 690 00:40:27,480 --> 00:40:31,080 Speaker 1: All right. So Rose, uh, Steven Rose. And this, this 691 00:40:31,160 --> 00:40:34,399 Speaker 1: quote kind of really puts the nail on the head 692 00:40:35,200 --> 00:40:39,440 Speaker 1: for the critics of Sheldrake. Uh. And this actually 693 00:40:39,560 --> 00:40:42,920 Speaker 1: is one that spoke to me because it's not, it's 694 00:40:42,960 --> 00:40:45,719 Speaker 1: not an attack on, on Sheldrake. It's more of 695 00:40:45,760 --> 00:40:49,600 Speaker 1: a sympathetic view, which is this: Sheldrake is so 696 00:40:49,760 --> 00:40:54,080 Speaker 1: committed to his hypothesis that it is very hard to 697 00:40:54,200 --> 00:40:59,000 Speaker 1: envisage the circumstances in which he would accept its disconfirmation. 698 00:40:59,719 --> 00:41:02,480 Speaker 1: So it's a very sweet way of saying, like, this 699 00:41:02,520 --> 00:41:06,600 Speaker 1: guy really, really believes this stuff so much that, like, 700 00:41:06,719 --> 00:41:09,960 Speaker 1: I don't think he is able to look at the 701 00:41:10,040 --> 00:41:15,319 Speaker 1: data in a, in a sort of level-headed, unbiased way. Right. So, 702 00:41:15,520 --> 00:41:17,800 Speaker 1: and that says it all. I mean, that's tough to, 703 00:41:17,800 --> 00:41:24,000 Speaker 1: to... It's a very cutting criticism, because how do you, 704 00:41:25,360 --> 00:41:27,680 Speaker 1: how do you show that's not true, other than to 705 00:41:27,760 --> 00:41:32,240 Speaker 1: say these are wrong? You know, you have to admit 706 00:41:32,280 --> 00:41:37,160 Speaker 1: that your hypothesis is wrong, um, to, to get away, 707 00:41:37,239 --> 00:41:38,880 Speaker 1: to get around that, and then once you've done that, 708 00:41:38,960 --> 00:41:42,920 Speaker 1: you've just lost anyway. So it's a, it's a tough, 709 00:41:43,120 --> 00:41:46,320 Speaker 1: very... it's a very shrewd criticism. Yeah. So Sheldrake 710 00:41:46,480 --> 00:41:49,080 Speaker 1: over the years has, um, he's written a lot of books. 711 00:41:50,280 --> 00:41:55,239 Speaker 1: He has been accused of, um, he's been accused by 712 00:41:55,239 --> 00:41:57,719 Speaker 1: some of, like, hey, this guy is just out there 713 00:41:57,719 --> 00:42:00,440 Speaker 1: writing books to make money, uh, and has sort of 714 00:42:00,480 --> 00:42:05,040 Speaker 1: made himself the superstar of the alternative side of science. 715 00:42:05,480 --> 00:42:08,319 Speaker 1: That seems to be the biggest explanation for 716 00:42:08,480 --> 00:42:11,040 Speaker 1: why he's doing what he's doing: that he found an easier 717 00:42:11,640 --> 00:42:15,040 Speaker 1: and quicker path to fame and recognition and 718 00:42:15,080 --> 00:42:18,640 Speaker 1: probably money writing these books about his own made-up 719 00:42:18,680 --> 00:42:23,120 Speaker 1: ideas rather than writing, you know, academic papers like everybody else. Right. 720 00:42:23,320 --> 00:42:26,160 Speaker 1: And on Sheldrake's side, he's like, listen, this, uh, what 721 00:42:26,239 --> 00:42:31,200 Speaker 1: he calls the default worldview of science and these dogmas, um, 722 00:42:31,239 --> 00:42:35,480 Speaker 1: he said they should be sort of pushed back against. And, um... 723 00:42:35,640 --> 00:42:38,760 Speaker 1: Or questioned. Yeah, questioned. What about the Big Bang?
724 00:42:39,320 --> 00:42:41,319 Speaker 1: He's like, everyone, you know, thinks they have it all 725 00:42:41,360 --> 00:42:44,959 Speaker 1: figured out and all the laws in the universe are constant, well, 726 00:42:45,000 --> 00:42:47,160 Speaker 1: except for the Big Bang, and then, you know, we 727 00:42:47,280 --> 00:42:51,400 Speaker 1: can't fully explain that. And there's another great quote 728 00:42:51,440 --> 00:42:56,560 Speaker 1: from a philosopher named Terence McKenna: give us one 729 00:42:56,600 --> 00:42:59,520 Speaker 1: free miracle and we'll explain the rest. And, you know, 730 00:42:59,560 --> 00:43:01,000 Speaker 1: when it comes to things like the Big Bang, that 731 00:43:01,080 --> 00:43:04,000 Speaker 1: kind of holds true. Yes, specifically with the Big Bang, 732 00:43:04,040 --> 00:43:05,640 Speaker 1: I think, is what he's talking about. That if you 733 00:43:05,640 --> 00:43:12,080 Speaker 1: can just allow for there being nothing, nothing that came before, 734 00:43:12,200 --> 00:43:14,120 Speaker 1: and all of a sudden, all the matter and energy 735 00:43:14,160 --> 00:43:18,440 Speaker 1: in the universe, um, suddenly existed, then we can pretty 736 00:43:18,520 --> 00:43:21,080 Speaker 1: much explain all other physics from that point on, or 737 00:43:21,120 --> 00:43:24,200 Speaker 1: we can use that. And, um, and that's the big question, 738 00:43:24,320 --> 00:43:27,800 Speaker 1: is what happened right before the Big Bang? Yeah, yeah, exactly. 739 00:43:27,880 --> 00:43:30,400 Speaker 1: But we also have other questions about the universe, like, 740 00:43:30,560 --> 00:43:32,759 Speaker 1: is it inflating, you know, is there gonna be a 741 00:43:32,760 --> 00:43:34,439 Speaker 1: big bang? Or is there going to be a big 742 00:43:34,480 --> 00:43:37,799 Speaker 1: crunch? Or, um, we have a lot of questions and 743 00:43:37,840 --> 00:43:41,600 Speaker 1: a lot of misunderstanding about it too. But physics 744 00:43:41,640 --> 00:43:44,799 Speaker 1: needs the Big Bang to have happened the way that 745 00:43:44,880 --> 00:43:47,440 Speaker 1: we think it might have happened. But even still, the 746 00:43:47,480 --> 00:43:50,560 Speaker 1: way we think it might have happened doesn't follow the 747 00:43:50,600 --> 00:43:53,840 Speaker 1: physical laws as we understand them, and so Sheldrake 748 00:43:53,880 --> 00:43:56,200 Speaker 1: and others point to that one and they're like, come on, guys, 749 00:43:56,320 --> 00:44:00,880 Speaker 1: like, this is just one of several examples of science 750 00:44:01,000 --> 00:44:03,960 Speaker 1: just saying this is the way it is, even though 751 00:44:04,040 --> 00:44:06,880 Speaker 1: we don't fully understand it or the data we're getting 752 00:44:06,880 --> 00:44:11,160 Speaker 1: suggests otherwise. And, oh man, how can I say this 753 00:44:11,200 --> 00:44:15,799 Speaker 1: in a way that's not, like, controversial? There isn't one. 754 00:44:15,880 --> 00:44:20,680 Speaker 1: It's not the same as when, let's say, a, a 755 00:44:20,719 --> 00:44:27,000 Speaker 1: creationist says, well, you can't really explain the Big Bang, so, um, 756 00:44:27,239 --> 00:44:31,440 Speaker 1: it's all magic, and that's okay, right, right. You know, 757 00:44:31,560 --> 00:44:34,120 Speaker 1: it's not, it's not along those same lines. There's something 758 00:44:34,160 --> 00:44:37,080 Speaker 1: different about what Sheldrake is saying.
I think what he's 759 00:44:37,080 --> 00:44:40,239 Speaker 1: saying is, yeah, in some cases, he's like, we don't 760 00:44:40,280 --> 00:44:43,279 Speaker 1: understand this, so here's my interpretation. I think in his 761 00:44:43,440 --> 00:44:46,359 Speaker 1: more recent, um, book, The Science Delusion, which came out 762 00:44:46,400 --> 00:44:48,960 Speaker 1: in two thousand twelve and in the US is called Science 763 00:44:48,960 --> 00:44:53,960 Speaker 1: Set Free, um, he's saying, here are some, some essential 764 00:44:54,440 --> 00:44:58,719 Speaker 1: dogmatic beliefs of science that are worth challenging, and that 765 00:44:58,800 --> 00:45:01,960 Speaker 1: if we don't challenge them, we, we might end 766 00:45:02,040 --> 00:45:05,600 Speaker 1: up going down this wrong path of scientific inquiry. Um, 767 00:45:05,640 --> 00:45:07,520 Speaker 1: and we need to be a little more open to 768 00:45:08,000 --> 00:45:12,759 Speaker 1: differing ideas, because we don't understand these things like everybody 769 00:45:12,800 --> 00:45:15,799 Speaker 1: generally believes we do. Yeah, I think he kind of, 770 00:45:16,120 --> 00:45:18,720 Speaker 1: in the book title itself, he said, in America it's Science 771 00:45:18,719 --> 00:45:22,000 Speaker 1: Set Free, that kind of encapsulates it. I think he sees 772 00:45:22,080 --> 00:45:27,160 Speaker 1: himself as, as some kind of emancipator of science rather 773 00:45:27,239 --> 00:45:32,919 Speaker 1: than, um, a kook who believes in telepathy, right, even 774 00:45:32,920 --> 00:45:35,440 Speaker 1: though he kind of... I mean, that stuff. And Dave 775 00:45:35,520 --> 00:45:38,560 Speaker 1: Russ helped us with this, uh, with this research. Yeah, 776 00:45:38,560 --> 00:45:40,399 Speaker 1: did a great job, he did. And, you know, 777 00:45:40,840 --> 00:45:44,319 Speaker 1: he points out, and he's right, morphic resonance sounds very 778 00:45:44,360 --> 00:45:48,120 Speaker 1: strange and weird, you know, and it also sounds like 779 00:45:48,160 --> 00:45:51,080 Speaker 1: something that would sell a book, right, right. But to 780 00:45:51,239 --> 00:45:54,080 Speaker 1: throw him in there... I think it was Rose who said 781 00:45:55,160 --> 00:45:59,160 Speaker 1: he's basically no, no better than someone who endorses crop 782 00:45:59,239 --> 00:46:03,439 Speaker 1: circles and creationism and pseudoscience. Yeah. And I don't 783 00:46:03,440 --> 00:46:07,200 Speaker 1: think that's necessarily fair. No. And even so, if you 784 00:46:07,320 --> 00:46:11,920 Speaker 1: take away things like morphic resonance and things like, um, 785 00:46:12,120 --> 00:46:15,920 Speaker 1: uh, well, morphic resonance basically, um, if you, if you 786 00:46:15,920 --> 00:46:18,920 Speaker 1: take that stuff away and just look at him as, like, uh, 787 00:46:18,960 --> 00:46:23,440 Speaker 1: a challenger to scientific dogma, you can appreciate him on, 788 00:46:23,440 --> 00:46:26,560 Speaker 1: on that level as well. Like, you 789 00:46:26,600 --> 00:46:29,759 Speaker 1: can peel back different layers of this guy and appreciate 790 00:46:29,800 --> 00:46:32,759 Speaker 1: different parts and also disagree with different parts. But so 791 00:46:32,880 --> 00:46:35,880 Speaker 1: in his most recent one, The Science Delusion, he basically says, 792 00:46:36,160 --> 00:46:39,200 Speaker 1: here are some things that science believes that we shouldn't 793 00:46:39,200 --> 00:46:43,160 Speaker 1: necessarily believe, like that matter is unconscious.
And there are people 794 00:46:43,160 --> 00:46:45,839 Speaker 1: out there, including physicists, who are like, you know, if 795 00:46:46,000 --> 00:46:50,759 Speaker 1: consciousness were just a, a property of all matter, and 796 00:46:50,800 --> 00:46:53,960 Speaker 1: the more matter you put together, the more sophisticated 797 00:46:54,040 --> 00:46:57,880 Speaker 1: consciousness you got, that would explain a lot of stuff, 798 00:46:57,920 --> 00:47:01,480 Speaker 1: including human consciousness. That it's just an emergent property of all 799 00:47:01,520 --> 00:47:05,880 Speaker 1: these particles that came together to form human beings. Um, 800 00:47:05,960 --> 00:47:12,200 Speaker 1: but currently the scientific establishment says no, matter is unconscious. That's just 801 00:47:12,280 --> 00:47:15,640 Speaker 1: our understanding of it. That's the way it is, although 802 00:47:15,680 --> 00:47:18,320 Speaker 1: that is being challenged by more people than just Sheldrake. 803 00:47:18,480 --> 00:47:20,360 Speaker 1: And then there was another one. He gave a TED Talk, 804 00:47:20,600 --> 00:47:24,399 Speaker 1: a TEDx Whitechapel talk that was later banned, right? 805 00:47:25,280 --> 00:47:28,480 Speaker 1: Well, they took it from their YouTube channel and then 806 00:47:28,520 --> 00:47:31,200 Speaker 1: inserted it into a blog post. You can still see it, 807 00:47:31,360 --> 00:47:33,279 Speaker 1: but they put it in a blog post because their 808 00:47:33,280 --> 00:47:36,799 Speaker 1: science advisors had been like, uh, this is pretty heretical, 809 00:47:36,920 --> 00:47:38,680 Speaker 1: and I don't think you should just be presenting it 810 00:47:38,760 --> 00:47:41,400 Speaker 1: like it's just... you need to couch it in some language. 811 00:47:41,520 --> 00:47:43,040 Speaker 1: So they did, and they put it in a blog post, 812 00:47:43,080 --> 00:47:44,480 Speaker 1: but you can still see it. It's not like they 813 00:47:44,520 --> 00:47:47,640 Speaker 1: just took it down altogether. But, um, it's a really 814 00:47:47,680 --> 00:47:50,480 Speaker 1: interesting talk. It's only like twenty minutes long, but in 815 00:47:50,560 --> 00:47:53,279 Speaker 1: it he makes a really good case about how the 816 00:47:53,360 --> 00:47:57,200 Speaker 1: laws of nature, like the gravitational constant or the speed 817 00:47:57,239 --> 00:48:02,120 Speaker 1: of light, aren't actually constant, and physics needs those things 818 00:48:02,120 --> 00:48:05,239 Speaker 1: to be constant for it to do its inquiries, for 819 00:48:05,320 --> 00:48:08,880 Speaker 1: it to do its formulas and equations, for the current 820 00:48:08,960 --> 00:48:12,160 Speaker 1: theories to work. And he's saying, no, there's been periods 821 00:48:12,160 --> 00:48:14,880 Speaker 1: in history where we've measured these things and gotten different, 822 00:48:15,360 --> 00:48:18,919 Speaker 1: different measurements, and that during that same period, all these 823 00:48:18,960 --> 00:48:22,000 Speaker 1: different, um, scientists around the world were getting roughly the 824 00:48:22,080 --> 00:48:25,200 Speaker 1: same different measurements from what we thought it was before. 825 00:48:25,520 --> 00:48:28,120 Speaker 1: How do you explain that?
And I think that point 826 00:48:28,200 --> 00:48:30,960 Speaker 1: is really important, because if something like the speed of 827 00:48:31,040 --> 00:48:34,359 Speaker 1: light does change, then understanding that it does and how 828 00:48:34,400 --> 00:48:37,640 Speaker 1: it does could give us an even greater understanding of physics. 829 00:48:37,719 --> 00:48:41,040 Speaker 1: And that right there, I think, is the greatest role 830 00:48:41,080 --> 00:48:44,840 Speaker 1: that Rupert Sheldrake plays, is to say, no, stop looking 831 00:48:44,880 --> 00:48:48,160 Speaker 1: at it through this lens. This lens is possibly incorrect. 832 00:48:48,239 --> 00:48:51,200 Speaker 1: At the very least, don't burn all your old stuff, 833 00:48:51,239 --> 00:48:53,760 Speaker 1: don't throw it away, but just step to the side 834 00:48:53,840 --> 00:48:56,760 Speaker 1: and approach it from a different way, just to see 835 00:48:56,800 --> 00:48:59,279 Speaker 1: if that's the truth. And if it is the truth, 836 00:48:59,280 --> 00:49:01,279 Speaker 1: then we'll have a greater understanding of how things 837 00:49:01,320 --> 00:49:06,080 Speaker 1: actually work. Yeah, because the unexplainable are only unexplainable until 838 00:49:06,120 --> 00:49:09,800 Speaker 1: they can be explained. You're right. You know, at various 839 00:49:09,880 --> 00:49:13,400 Speaker 1: points throughout history there, there were a lot of claims 840 00:49:13,440 --> 00:49:17,040 Speaker 1: that things were unexplainable until they figured it out. And 841 00:49:17,120 --> 00:49:19,719 Speaker 1: again, we're not touting pseudoscience here, because we have a 842 00:49:19,719 --> 00:49:23,520 Speaker 1: pretty good track record of, uh, roundly siding on the 843 00:49:23,560 --> 00:49:27,400 Speaker 1: side of science. Yes, but, and expertise too, like, we 844 00:49:27,480 --> 00:49:31,000 Speaker 1: both have a tremendous amount of respect for expertise. And, 845 00:49:31,080 --> 00:49:32,880 Speaker 1: you know, people who go study things for years and 846 00:49:32,920 --> 00:49:35,640 Speaker 1: years and years and apply themselves to understanding that 847 00:49:35,719 --> 00:49:38,040 Speaker 1: one thing, that's an expert, and they should generally be 848 00:49:38,120 --> 00:49:41,120 Speaker 1: listened to. That's right. And Sheldrake has proven himself 849 00:49:41,200 --> 00:49:44,160 Speaker 1: out enough to be listened to, I think. He's not, uh, 850 00:49:44,880 --> 00:49:50,680 Speaker 1: Aliens Man, whoever that guy was. So yeah, I mean, 851 00:49:50,719 --> 00:49:52,960 Speaker 1: make up your own mind. Go read about him, go 852 00:49:53,080 --> 00:49:55,440 Speaker 1: read both sides about him if you, if you're interested 853 00:49:55,480 --> 00:49:57,680 Speaker 1: in this. Don't just listen to us, like, he's definitely 854 00:49:57,719 --> 00:49:59,680 Speaker 1: one of those people. You should make your own mind up. 855 00:49:59,680 --> 00:50:03,160 Speaker 1: And if you disagree, great. If you agree, fantastic. We're 856 00:50:03,200 --> 00:50:06,279 Speaker 1: just kinda, we just kind of admire thinkers like that. 857 00:50:06,719 --> 00:50:09,560 Speaker 1: That's right. You got anything else? I got nothing else. 858 00:50:10,040 --> 00:50:14,400 Speaker 1: Rupert Sheldrake, that was it. If you want to know 859 00:50:14,440 --> 00:50:17,680 Speaker 1: more about him, go read, go, go read his books. 860 00:50:17,719 --> 00:50:20,000 Speaker 1: Go do what you want. Um.
And in the meantime, 861 00:50:20,040 --> 00:50:21,640 Speaker 1: since I say go do what you want, it's time 862 00:50:21,640 --> 00:50:26,160 Speaker 1: for listener mail. Yeah. This one is a little long, 863 00:50:26,239 --> 00:50:28,840 Speaker 1: but this was a firsthand account from the Iowa Caucus, 864 00:50:29,719 --> 00:50:33,360 Speaker 1: so I thought it bore reading. Uh, this is from Lauren, 865 00:50:33,560 --> 00:50:36,719 Speaker 1: a student at the University of Iowa, who participated in her 866 00:50:36,760 --> 00:50:40,560 Speaker 1: first Iowa Caucus. She said, I went last... Well, she 867 00:50:40,600 --> 00:50:42,920 Speaker 1: did say that. I went because it's such a big 868 00:50:42,920 --> 00:50:44,839 Speaker 1: deal here, and since I'm graduating, I thought it might 869 00:50:44,880 --> 00:50:48,200 Speaker 1: be my only chance. However, with this week's events, it's looking 870 00:50:48,200 --> 00:50:50,799 Speaker 1: like it may be the last time anyone is going to participate. 871 00:50:51,440 --> 00:50:54,040 Speaker 1: There were several logistical issues, in my opinion, that led 872 00:50:54,080 --> 00:50:58,239 Speaker 1: to issues where I was participating. My caucus location was 873 00:50:58,280 --> 00:51:02,520 Speaker 1: downtown Iowa City at the Englert Theater. My 874 00:51:02,600 --> 00:51:05,799 Speaker 1: roommate and I arrived about forty five minutes before to 875 00:51:05,840 --> 00:51:08,200 Speaker 1: make sure we could be in the door before seven, 876 00:51:08,280 --> 00:51:10,800 Speaker 1: since we had been told if you weren't in the door, 877 00:51:11,160 --> 00:51:14,640 Speaker 1: then TS for you. Two lines for the caucus 878 00:51:14,640 --> 00:51:16,719 Speaker 1: wrapped around the block, one for people who registered in 879 00:51:16,760 --> 00:51:19,279 Speaker 1: the correct precinct and one for people who needed to 880 00:51:19,360 --> 00:51:22,279 Speaker 1: change their registration to the correct precinct. There were over 881 00:51:22,320 --> 00:51:25,520 Speaker 1: seven people in our caucus location alone, far more than 882 00:51:25,560 --> 00:51:30,160 Speaker 1: the Democratic Party of Iowa had expected. Since my 883 00:51:30,200 --> 00:51:33,520 Speaker 1: caucus location was a theater, it was almost impossible to distinguish 884 00:51:33,520 --> 00:51:36,320 Speaker 1: where different candidates were in the room. The Bernie and 885 00:51:36,360 --> 00:51:38,560 Speaker 1: Warren groups were so large they had locations on the 886 00:51:38,600 --> 00:51:42,160 Speaker 1: floor and had to have satellite spots in the balcony. 887 00:51:42,600 --> 00:51:44,879 Speaker 1: When it came time to do the head count, the 888 00:51:44,960 --> 00:51:48,319 Speaker 1: overcrowded space led to issues tallying people. All of the 889 00:51:48,360 --> 00:51:52,000 Speaker 1: campaign volunteers had reported numbers. The caucus delegate informed us that 890 00:51:52,040 --> 00:51:54,840 Speaker 1: the total number of people under each candidate was about 891 00:51:54,840 --> 00:51:58,279 Speaker 1: fifty under the amount of total people checked in, so 892 00:51:58,400 --> 00:52:01,720 Speaker 1: they assumed that those fifty had all chosen to leave before 893 00:52:01,760 --> 00:52:04,600 Speaker 1: the votes were all tallied, but the campaign volunteers demanded 894 00:52:04,600 --> 00:52:07,360 Speaker 1: a recount.
All in all, it took about two and 895 00:52:07,360 --> 00:52:09,000 Speaker 1: a half hours to get through the first round of 896 00:52:09,040 --> 00:52:11,960 Speaker 1: caucusing to find out which candidates would be viable. My 897 00:52:12,040 --> 00:52:15,040 Speaker 1: prediction is that because Iowans are so passionate about their 898 00:52:15,080 --> 00:52:18,560 Speaker 1: caucus system, they'll probably happen again next election cycle, but 899 00:52:18,719 --> 00:52:22,640 Speaker 1: since their faux importance was brought to public consciousness this year, 900 00:52:22,960 --> 00:52:26,319 Speaker 1: they'll eventually give way to a primary system soon. That 901 00:52:26,440 --> 00:52:30,120 Speaker 1: is from Lauren Cheshire. Lauren, that was a great account. 902 00:52:30,120 --> 00:52:35,560 Speaker 1: You're basically like the Hunter Thompson correspondent of Stuff You 903 00:52:35,560 --> 00:52:37,839 Speaker 1: Should Know, and we appreciate that big time. That's right. 904 00:52:38,560 --> 00:52:42,720 Speaker 1: And after that, the subject 905 00:52:42,800 --> 00:52:45,280 Speaker 1: line of the email was the Iowa caucus is depraved 906 00:52:45,320 --> 00:52:50,160 Speaker 1: and decadent. Very nice. Um, so if you want to 907 00:52:50,160 --> 00:52:51,840 Speaker 1: get in touch with us to let us know something 908 00:52:51,840 --> 00:52:53,879 Speaker 1: that's going on in your neck of the woods, well, 909 00:52:53,920 --> 00:52:55,480 Speaker 1: we want to hear about it, you can go on 910 00:52:55,520 --> 00:52:57,400 Speaker 1: to Stuff You Should Know dot com. I don't know 911 00:52:57,440 --> 00:53:00,239 Speaker 1: why you would want to these days. Instead, to send 912 00:53:00,320 --> 00:53:02,640 Speaker 1: us an email, wrap it up, spank it on the bottom, 913 00:53:02,640 --> 00:53:05,279 Speaker 1: and send it off to stuff podcast at iHeart 914 00:53:05,360 --> 00:53:10,960 Speaker 1: Radio dot com. Stuff You Should Know is a production 915 00:53:11,000 --> 00:53:13,719 Speaker 1: of iHeart Radio's How Stuff Works. For more podcasts from 916 00:53:13,800 --> 00:53:16,600 Speaker 1: iHeart Radio, visit the iHeart Radio app, Apple Podcasts, 917 00:53:16,640 --> 00:53:21,520 Speaker 1: or wherever you listen to your favorite shows.