Speaker 1: Hey, everybody, it's me, Josh, your old pal, and for this week's SYSK Selects I've chosen How Chaos Theory Changed the Universe. It first came out in July of 2016, and I have to say, I think it's one of the better science-y Stuff You Should Know episodes of all time, because there's just something about this one that grabbed me and Chuck by the collars and said, "I'm interesting, aren't I?" And we said, yes, you definitely are. And this one has everything: it has science, it has philosophy, it has our understanding of the universe. It's just an all-around good episode. So I hope you enjoy it as much as I did listening to it again.

Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks.

Speaker 1: Hey, and welcome to the podcast. I'm Josh Clark, with Charles W. "Chuck" Bryant, and there's Jerry over there. So this is Stuff You Should Know, the podcast about chaos theory. Like, have you ever seen Event Horizon? I did. Not bad? Great movie. Are you crazy? I think it was great. Oh, it was so imaginative. I thought it was okay. It was like a Lovecraftian thing in outer space. Yeah, I loved it. It was all right. I Lovecrafted it. I liked it. That's what I think of when I think of chaos. You know, there's that one part where they kind of give you a glimpse behind, like, the dimension that this action is taking place in, to see the chaos underneath. And you should check that out again. Yeah, I think about Jurassic Park, and Jeff Goldblum as the creep Dr. Malcolm, explaining chaos in the little auto-driving SUV or whatever that was. Yeah, that's what it was called in the script, the auto-driving-SUV scene. Yeah. And you know what, I actually rewatched that scene, and it confirmed two things. One is that he did a pretty decent job, for a Hollywood movie, with a very rudimentary explanation of chaos. And you watched it for this? Yeah, yeah, just that scene.
Speaker 1: And then it also confirmed what a creep that character was. Yeah. If you watch that scene, you know, he was all gross and flirty with her right in front of her ex. He's just talking to her, and I didn't even notice this at first: he just touches her hair out of nowhere, for no reason. He's just talking to her and he just grabs her hair and touches it. And I'm like, what a creep. I know. If you look closely, you can see the hormones emerging through his chest hair. Yeah. And I love Jeff Goldblum; it's not a reflection on him. He was basically doing Jeff Goldblum. Well, that's what... yeah, sure, he's Jeff Goldblum, in the manner in which he speaks, but I don't think he's a creep, do you? Wow. I've got nothing against Jeff Goldblum. I think he's doing Jeff Goldblum. It was also a sign of the times. Like, if that movie were made today, Dr... what was her name in the movie? Think. Yeah, Dr. Sattler would be like, it's very inappropriate to stroke my hair. Yeah, like, don't touch me. But this was the nineties. The free-wheeling, early-to-mid nineties. The book came out in 1990, and in the book, Ian Malcolm, who's a chaotician... yeah, a creep chaotician, right? He goes into even more depth about chaos. But that was the first time I ever heard of chaos theory, from Jurassic Park, and it was really misleading. I think the entire term "chaos" is very misleading as far as the general public goes, from what I researched for this article. Well, yeah, I mean, you hear the word chaos as an English speaker and you think frenetic, crazy, out of control. Yeah, and that's not what it means in terms of science like this, right.
Speaker 1: What it means, I guess we can say up front, is basically the idea that complex systems do not behave in very neat ways that we can easily grasp, understand, or measure. Right. And even simple systems don't, sometimes; it doesn't always have to be complex. But I want to give a shout-out, in addition to our own article, because when it comes to stuff like this, the brain-breaking stuff... for me, man, this is a brain breaker. You know how I always go to, like, blank-blank-for-kids sites, because it always helps if there's a dinosaur mascot on the page? It's a sure thing we can understand it. But the best explanation for all this stuff that I found on the internet was from a website called Abarim, A-B-A-R-I-M, Publications, which turns out to be a website about biblical patterns, and sandwiched in the middle there is a really great, easy-to-understand series of pages on chaos. So I was like, man, I get it now. I mean, in a rudimentary way. Right. Well, yeah. I think even a lot of people who deal with systems that display chaotic behavior, which I guess is to say basically all systems, eventually, under the right conditions, don't necessarily understand chaos. Yeah. And they define a complex system specifically. It doesn't mean just, like, oh, it's complex. I mean, it is, but they define it in a way that helped me understand: it's a system that has so much motion, so many elements that are in motion, moving parts, that it takes, like, a computer to calculate all the possibilities of what it could look like five minutes from now, ten years from now. So before computers came around, before the quantum mechanical revolution, it was a lot more basic. It was like, what goes up must come down, stuff like that.
Speaker 1: Let's talk about that, Chuckers, because when you're talking about chaos theory, it helps to understand how it revolutionized things by getting a clear picture of how we understood the universe leading up to the discovery of chaos. Right. So, prior to the Scientific Revolution, everybody was like, oh, well, it's God. The Earth is at the center of the universe, and God is spinning everything around like a top. Right. It was all a theistic explanation. Then the Scientific Revolution happens, and people start applying things like math, making mathematical discoveries, and figuring out that there's order. They're finding order and patterns and predictability in the universe, if you can apply mathematics to it. Yes. Specifically, if you can apply mathematics to the starting point. Right. So if you can figure out how a system works, mathematically speaking, you can go in and plug in whatever coordinates you want and watch it go. You can predict what the outcome is going to be. And this is based on what at the time was a totally revolutionary idea. Initially, I think Descartes was the first one to kind of say that cause and effect is a pretty big part of our universe, right? Yeah. It was sort of... this is the sixteen hundreds, where early science met philosophy. They kind of complemented one another. We're talking about determinism, right? So those were kind of the seeds of determinism: the Scientific Revolution, and, like you said, where philosophy and science came together in the form of Descartes. Right. And then Newton came along, and we did a whole episode on him. Yeah, January of this year. That was a good one. It was really good. I think you said in that episode that there's possibly no scientist who changed the world more than Newton has. He's got legs.
Speaker 1: People shouted out others in email, but I'll just say he's near the top, for sure, with some other people. The cream. So Newton came along, and Newton said... that was his name, Isaac "the Cream" Newton, right? And anytime he dunked, it'd be like, cream! Yeah, you just got creamed. I thought he was a boxer. He's a basketball player. He was much more well known as a boxer, but he definitely could dunk as a B-baller. Man, that threw me off a little bit. Yeah. So the Cream comes along, and he basically says, watch this, dude: this cause and effect thing you're talking about, I can express it in quantifiable terms. And he comes up with all of these great laws and basically sets the stage, the foundation, for science for the next three centuries or so. Yeah, these laws were so rock solid and powerful that scientists kind of got ahead of themselves a little and said, we're done. Like, with Newton's laws, we can predict everything, if we have an accurate enough beginning value to plug into his equations. And they weren't done, it turns out. I think there was a little hubris, and a little just excitement, like, well, we figured it all out. Right. That you could take Newton's laws, and if you had accurate enough measurements, you could predict what the outcome would be of whatever system you plugged those measurements into, using these equations. And at the time, a lot of this was planetary: well, we know that these planets are here and they're moving and they're orbiting. So if we know these things, we can plug them into an equation and figure out what it's going to look like in a hundred years. Exactly. And they'd figured out the basis of determinism, which is what we just said: that if you have accurate measurements, you can take those measurements and use them to predict how a system is going to change over time, using differential equations.
Speaker 1: Right. So this is what Newton comes along and figures out: that you can describe the universe in these mathematical terms, using differential equations. And, like you said, there was a tremendous amount of hubris. Well, I think you said there was some hubris. I think there was a tremendous amount of hubris, where science basically said, we've mastered the universe, we've uncovered the blueprint of the universe, and now we understand everything. It's just a matter now of getting our scientific measurements more and more and more exact. Because, again, the hallmark of determinism is that if you have exact measurements, you can predict an outcome accurately. Like the pool cue example. Well, the pool table example, right. Right. So if you've got a pool table, let's say you're playing some nine-ball. You have that beautiful little diamond set up, you've got your cue ball, you put down that cue ball and you crack it with the cue. And if you are super accurate with your initial measurements, you should be able to mathematically plot out, via angles, where the balls will end up. Right, exactly. Like, you can say, this is what the table will look like after the break, if you know the force, the angle, all those little variables: temperature, whether there's wind in the room, the felt on the table, everything. The more specific you are, the more accurate your end result will be. Right. And then one of the other hallmarks of determinism is that if you take those exact same initial conditions and do them again, the pool table will look exactly the same after the break. Yeah, which is pretty much impossible for, like, a human to do with their hands.
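That deterministic promise is easy to see in code. Here is a minimal sketch, with a toy system and made-up numbers rather than anything from the episode: a simulated ball integrated forward from some initial conditions lands in exactly the same final state every run.

```python
# A toy deterministic system: a ball moving under gravity, integrated
# step by step. All of the numbers here are arbitrary, for illustration.
def simulate(x0, v0, dt=0.001, steps=2000, g=-9.81):
    x, v = x0, v0
    for _ in range(steps):
        v += g * dt   # constant acceleration updates velocity
        x += v * dt   # velocity updates position
    return x

# Determinism's promise: identical initial conditions, identical outcome,
# every single time you rerun it.
run1 = simulate(x0=0.0, v0=12.0)
run2 = simulate(x0=0.0, v0=12.0)
assert run1 == run2
```

The "perfect machine" the hosts bring up next is exactly this: a way to hand the system the same starting numbers twice.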
Speaker 1: Sure, but the idea at the time in science was that if you could build a perfect machine that could recreate these conditions, it would happen the same way every time. Right. Yeah. And, I mean, they had hubris, but you could understand it, when, literally, in 1846, two people predicted Neptune within months. Not that it would exist, but that it does exist. And this is not by looking up in the sky; they did it with math, and they were right. Yeah. So imagine in 1846 when that happens. They're like, yeah, we've got the math down, so we're pretty much all-knowing. Well, plus, for the most part, and not just with Neptune, they were finding that this stuff really panned out. It held true for everything from the investigation into electricity to new chemical reactions and understanding those, and the Scientific Revolution laid the basis for the Industrial Revolution and all the change that came out of that. So it is understandable how science was kind of like, we've got it all figured out. Well, and like you said, even Galileo was smart enough to know there's uncertainty in these measurements, that precision is key. So they spent, what does the article say, much of the nineteenth and twentieth centuries just trying to build better instrumentation, to get smaller and smaller and more precise measurements. Right, that was basically the goal. Yeah, which was the right direction; that's exactly what they should have been doing. The problem is, like you said, Galileo knew there were going to be some flaws in measurement, that we just didn't have those great scientific instruments yet. Yeah. It's called the uncertainty principle.
Speaker 1: Okay, it's accuracy, really. But the idea is, if you have a good enough instrument, you can overcome that, and the more you shrink the error in measuring the initial conditions, the more you're going to shrink the error in the outcome. It would be proportionate. Right. They were correct. The thing is, they were also aware of, but ignoring in a lot of ways, some outstanding problems. Specifically, something called the N-body problem. You know what, I'm so excited about this, I need to take a break. I think that's a good idea. I need to go check out my N-body in the bathroom, okay? And we'll be back.

Speaker 1: All right, we're back. So there are some issues, right, with determinism. There are some weird problems out there that are saying, like, hey, pay attention to me, because I'm not sure determinism works. Right. And one is the N-body problem. Yeah. How this came about was 1885. There was King Oscar II of Sweden and Norway. Yeah, I don't want to leave out Norway. Both. He said, you know what, let's offer a prize to anyone who can prove the stability of the Solar System, something that had been stable for a long time before that. And a lot of the most brilliant minds on planet Earth got together and tried to do this with mathematical proofs, and no one could do it. And then a dude named Henri... you gotta help me there with that. Oh, say the whole thing? Henri Poincaré. Very nice. He was French, believe it or not, and he was a mathematician. And he said, you know what, I'm not gonna look at this big picture of all the planets and the Sun and all their orbits. You'd have to be a fool to try that. Sure. He said, I'm gonna shrink this down. Like we talked about shrinking that initial value, you know, that initial condition, he shrunk it down.
Speaker 1: He said, I'm gonna look at just a couple of bodies orbiting one another with a common center of gravity, and I'm gonna look at this. And this was called the N-body problem. Yeah, which was smart to do, because the more variables you factor into a nonlinear equation like that, the harder it's gonna be. So he shrunk it down. So the N-body problem has to do with three or more celestial bodies orbiting one another. So Poincaré said, oh, I'll just start with three. Yeah. Smart. And what he found from doing his equations for this King Oscar the Sequel Prize was that shrinking the error in the initial conditions measurement, right, did not really shrink the error in the outcome. Right, which flies in the face of determinism. What he found was that very, very minute differences in the initial conditions fed into a system produced wildly different outcomes after a fairly short time. Yeah. Like, let me just round off the mass of this planet at, like, the eighth decimal point, and, you know, who cares at that point? Let me just round that one to a two. And that would throw everything off at a pretty high rate. And he said, wait a minute, I think this contest is impossible. Right. He said, there is no way to prove the stability of the Solar System, because he had just uncovered the idea that it's impossible for us to predict the rate of change among celestial bodies. Yeah, it's such a complex system. There are far too many variables; it's impossible to start with something so minute and get the sum that you want at the end. Well, not a sum, I guess, but the result. Not only that, and this is what really undermined determinism: he figured out that you would have to have an infinitely precise measurement.
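One way to put a number on what Poincaré ran into, with figures that are illustrative rather than from the episode: in a chaotic system, an initial measurement error does not shrink the outcome error proportionately, because it grows roughly exponentially with time.

```latex
\delta(t) \approx \delta_0 \, e^{\lambda t}, \qquad \lambda > 0
% \delta_0 is the error in the initial measurement; \lambda (the Lyapunov
% exponent) sets how fast nearby trajectories fly apart. If errors double
% once per time unit, \delta_0 = 10^{-3} reaches order 1 after about
% \log_2 10^{3} \approx 10 units, while \delta_0 = 10^{-9} only stretches
% that to \log_2 10^{9} \approx 30. A millionfold better instrument buys
% three times the prediction horizon, not a million times; only
% \delta_0 = 0, an infinitely precise measurement, escapes entirely.
```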
Speaker 1: Yeah. And even if you built a perfect machine that could take a measurement of, like, the movement of one celestial body around another, it's literally impossible to get an infinitely precise measurement, which means that we could never predict, out past a certain degree, the movement of these celestial bodies. Like, he was saying, no, you can't build a machine that gets measurements good enough for us to overcome this. Like, determinism is wrong. You can't just say we have the understanding to predict everything; there's a lot of stuff out there that we're not able to predict. And he uncovered it trying to figure out this N-body problem. Yeah, and King Oscar the Sequel said, you win. Yeah, bring me another rack of lamb, and here's your prize. And he won by proving that it was impossible, which is pretty interesting. And that utterly and completely changed not just math, but our understanding of the universe, and our understanding of our understanding of the universe, which is even more earth-shaking. Yeah. He discovered dynamical instability, or chaos. And they didn't have supercomputers at the time, so it would be a little while, about seventy years, until, at MIT, we could actually feed these things into machines capable of plotting them out in a way that we could see, which was really incredible. So there was this dude seventy years later named Edward Lorenz. Yeah. Well, first of all, we should set the stage for this guy. He was a meteorologist and scientist. Not that those aren't the same thing, right. He's a scientist who dabbled in meteorology. He was a mathematician, yeah, but he was really into meteorology, because there was a weird juxtaposition at the time where we were sending people into outer space, but we couldn't predict the weather.
Speaker 1: Yeah, and it was definitely a blot on the field of meteorology. People were like, do you guys know what you're doing? And meteorologists were like, you have no idea how hard this is. Yeah, like, yeah, we can predict it a couple of days out, but after that it's just totally unpredictable. It drives us mad. And it wasn't just their reputations that were at stake; people were losing their lives because of it. Right. Yeah, in 1962 there were two notorious storms, one on the East Coast and one on the West: the Ash Wednesday Storm in the east and the Big Blow in the west. They killed a lot of people and cost hundreds of millions of dollars in damage, and people were like, you know, we need to be able to see these things coming a little better, because it's a problem. And meteorologists were like, well, why don't you do it, then? So they thought the key was these big supercomputers. Remember the supercomputers, when they came out? The big rooms full of hardware? It was amazing, and they were finally able to do these incredible calculations that we could never do before. I know, they were able to crunch, like, sixty-four bytes a second. Yeah, we had the abacus and then the supercomputer, nothing in between. I looked up the computer that Lorenz was working with, a Royal McBee. The Whopper? What was the Whopper? WarGames. Was it called the Whopper? W-O-P-R. I can't believe they called it that. So the guy just nicknamed it Joshua. No, Joshua was the software. Falken was the old man who designed all this stuff, and his son was Joshua. And that was the password? Oh, that was the password. Yeah. I guess I was too young to understand what a password was. Yeah, okay, there weren't even passwords at the time. You shouted it at the computer and it was like, okay, access granted. Yeah. Still, that movie holds up. Does it really? Totally. Got to check it out.
Speaker 1: Yeah, it's still very, very fun. Young Ally Sheedy. Boy, I had a crush on her from that movie. She was great. Yeah, what else was she in recently? Wasn't she in something? Well, I mean, she kind of went away for a while and then had her big comeback with the indie movie High Art, but that was a while ago. Has she been in anything else recently? Sure, I think I saw her in something recently and didn't realize that was her. She looked familiar, and I was like, oh, that's Ally Sheedy. I don't know. All right, I could look it up, but I won't, as it doesn't matter anyway. I still crushed on her. So the Royal McBee was not quite the Whopper; you could actually sit down at it. The Royal McBee. That name sounds like a hamburger, too. It was by the Royal Typewriter Company, and they got into computers for a second. And this is the kind of computer that Lorenz was working with, and it was a huge deal, like you were saying: abacus, supercomputer. But it was still pretty dumb as far as what we have today is concerned. But it was enough that Lorenz and his ilk were like, finally, we can start running models and actually predict the weather. Yeah, he started doing just that. He did. So he started off with a computational model of twelve meteorological... meteorological calculations, which is very basic, because there are infinite meteorological calculations, probably. Did I say it wrong again? It sounds like you're about to say it wrong and then you pull it out at the last second. Maybe. It's really impressive. So that's very basic, but he wanted to start out with something attainable, so he narrowed it down to twelve conditions, basically twelve calculations that had, you know, temperature, wind speed, pressure, stuff like that, and he started forecasting weather.
Speaker 1: And then he said, you know, it'd be great if you could see this, so I'm gonna spit it into my wonder machine, the McWhopper Royal McBee, and I'm going to get a printout, so you can visualize what this looks like. So things were going well, and he had this printout, and everyone was amazed, because these calculations never seemed to repeat themselves. He was making, like, word art. You remember that? That was the first thing anybody did on a computer, was to make word art, like a butterfly, right, that you would print out. Yeah. I never could do that. I couldn't either. Like, you have to be able to visualize things spatially; you have to have the right kind of brain for that. Right, or you have to be following a guide book. But have you ever seen Me and You and Everyone We Know? Yeah, I love that movie. That's a great movie. Those little kids in there, they were doing that. Oh yeah, yeah. Forever, back and forth. Poop. Well, I haven't seen that since it came out. It's been a while. Oh, you gotta see it again. Yeah, great movie. Ally's not in it. It's Miranda July, right? And she wrote and directed it, right? She did a great job. It's one of those rare movies where there's just the right amount of whimsy, because whimsy so easily overpowers everything else and becomes, like... yeah. This is, like, the most perfectly balanced amount of whimsy you've ever seen in a movie. Yeah, when there's too much whimsy, like Garden State, I just want to punch it in the face. Terrible. Although I like Garden State, but I haven't seen it since it came out. It hasn't aged well. It's just, when you look at it now, it's so cutesy and whimsical. It's like, come on. Yeah. Boy, we're getting to a lot of movies today. Oh yeah. Well, we're stalling, and we haven't even talked about the butterfly effect yet, which is coming, and I'm dreading it. That's why I'm stalling. All right.
Speaker 1: So where were we? He was running his calculations, printing out his values so people could see them, and then he got a little lazy one day. There was this output he noticed was interesting, so he said, you know, I'm gonna repeat this calculation and see it again. But, to save time, I'm just gonna kind of pick up in the middle, and I'm not gonna input as many numbers. I'm still using the same values; I'm just not going out to six decimal points. So the printout he had went to three decimal points, and he was working from the printout and didn't take into account that the computer accepted six decimal points. So he was just putting in three and expecting that the outcome would be the same. Right. Yes, but the outcome was way different. Right. And he went, whoa, whoa, what? Yeah, he's like, what's going on here? It was a big deal. I mean, someone would have come up with this eventually, probably, yeah, but he sort of accidentally came upon it. It's neat that this guy did this, because it changed his career. I think he went from an emphasis on meteorology to an emphasis on chaos math. To stud scientist, basically. I mean, the guy's got an attractor named after him, you know what I mean? Yeah. Well, let's get to that. So Lorenz starts looking at this and he's like, wait a minute, this is weird. This is worth investigating. And, like... what was his name? Poincaré. He said, I need fewer variables. So I'm not going to try to predict weather with these twelve differential equations that you have to take into account. I'm just gonna take one aspect of weather, called the rolling convection current, and I'm going to see how I can write it down in formula form.
Speaker 1: So a rolling convection current, Chuck, is, you know how wind is created: air at the surface is heated and starts to rise, and suddenly cool air from higher above comes in to fill the vacuum that's left, and that creates a rolling, or vertically based, convection current. Okay. I would describe it as an oven, boiling water, a cup of coffee: wherever there's a temperature differential based on a vertical alignment, you're going to have a rolling convection current. Okay, yeah. It sounds complex, but he just picked out one thing, basically one condition, and this is the one he picked out. But had you seen my hands moving, listeners, you would be like, oh yeah, I get it. He made little rolling motions. So he's like, okay, I can figure this out. So he comes up with three formulas that kind of describe a rolling convection current, and he starts trying to figure out how to describe how it behaves. Right, correct. And so, like I said, he got these three formulas, which were basically three variables that he calculated over time, and he plugged them in, and he found three variables that changed over time. And he found that, after a certain point, when you graph these things out, and since there are three you graph them on a three-dimensional graph, so x, y, and z... again, he wanted to be able to visualize this, because it's easier for people to understand; he was a very visual guy... all of a sudden, it made this crazy graph, where the line, as it progressed forward through time, went all over the place. It went from this axis to another axis to the other axis, and it would spend some time over here, and then it would suddenly loop over to the other one, and it followed no rhyme or reason. It never retraced its path. And it was describing how a convection current changes over time. Right. And Lorenz is looking at this.
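The episode doesn't spell the three formulas out, but they are the now-famous Lorenz system from his 1963 paper, and a few lines of Python are enough to replay both the graph and the rounding accident (the step size and run length below are arbitrary choices for the sketch):

```python
# The three equations Lorenz published in 1963 for a rolling convection
# current, in their standard form, with his classic parameter values:
#   dx/dt = sigma * (y - x)      x ~ intensity of the convection roll
#   dy/dt = x * (rho - z) - y    y ~ temperature gap, rising vs. falling air
#   dz/dt = x * y - beta * z     z ~ distortion of the vertical temp profile
def step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return (x + sigma * (y - x) * dt,
            y + (x * (rho - z) - y) * dt,
            z + (x * y - beta * z) * dt)

def run(x, y, z, steps=3000):
    for _ in range(steps):
        x, y, z = step(x, y, z)
    return x, y, z

# Lorenz's accident, replayed: restart from "the same" state, once with
# six decimal places and once rounded to three, as on his printout.
print(run(1.000001, 1.000001, 1.000001))
print(run(1.000, 1.000, 1.000))  # one part in a million apart at the start;
                                 # after 30 time units, nothing alike
```

Plot the full trajectory of either run in three dimensions and you get the looping, never-repeating shape the hosts describe next.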
Speaker 1: He was expecting these three things to equalize and eventually form a line, because that's what determinism says: things are going to fall into a certain amount of equilibrium and just even out over time. That is not what he found. What he discovered was what Poincaré discovered, which was that some systems, even relatively simple systems, exhibit very complex, unpredictable behavior, which you could call chaos. Yeah. And when you say things were going all over... if you look at the graph, it's not just lines bouncing all over the place randomly. There was an order to it, but the lines were never on top of one another. Like, let's say you draw a figure eight with your pencil and then you keep drawing that figure eight. It's gonna slip outside those curves every time, unless you're a robot. And that's what it ended up looking like. Yeah, it never retraced the same path twice, ever. It had a lot of really surprising properties, and, at the time, it just fell completely outside the understanding of science. Right. Yeah. Luckily this happened to Lorenz, who was curious enough to be like, what is going on here? And again, he sat down and started to do the math and think about this, and especially how it applied to the weather. Right. Yeah. And he came up with something very famous. Yes, the butterfly effect. Yes. A, this thing kind of looked like butterfly wings a little bit. And B, when he went to present his findings, he basically had the notion, he's like, I'm gonna wow these people in the crowd. It's 1972, at a conference he's going to, and I'm gonna say something like, you know, the seagull flaps its wings and it starts a small turbulence that can affect weather on the other side of the world. This small little thing will just grow and grow and snowball and affect things.
And he had a colleague goes, like, 537 00:32:34,200 --> 00:32:37,640 Speaker 1: seagull wings, that's nice, and he said, how about this? 538 00:32:37,800 --> 00:32:40,640 Speaker 1: And this is the title they ended up with, predictability 539 00:32:40,760 --> 00:32:44,120 Speaker 1: Colin does the flap of a butterfly's wings in Brazil 540 00:32:44,640 --> 00:32:48,600 Speaker 1: set off a tornado and Texas And everyone was like, 541 00:32:48,960 --> 00:32:54,719 Speaker 1: whoa mind's blown? Should we take a break? All right, 542 00:32:54,760 --> 00:33:16,640 Speaker 1: We'll be right back, all right. So the lawns attractor, 543 00:33:18,800 --> 00:33:22,560 Speaker 1: uh is that picture that he ended up with, The 544 00:33:22,640 --> 00:33:29,520 Speaker 1: lawns attractor? And this biblical pattern website that I found 545 00:33:30,480 --> 00:33:33,960 Speaker 1: described attractors and strange attractors in a way that even 546 00:33:34,040 --> 00:33:37,040 Speaker 1: dumb old me could understand what you got. So if 547 00:33:37,040 --> 00:33:41,200 Speaker 1: I may, he says, all right, here's the cycle of chaos. 548 00:33:41,400 --> 00:33:47,200 Speaker 1: He said, Actually, I don't know who wrote this. Woman 549 00:33:47,240 --> 00:33:49,120 Speaker 1: could have been a small child could have been no 550 00:33:49,400 --> 00:33:53,160 Speaker 1: of undetermined gender. I have no idea but the gender 551 00:33:53,240 --> 00:33:57,920 Speaker 1: neutral narrator. They said, He's sorry. Think about a town 552 00:33:59,000 --> 00:34:01,720 Speaker 1: that has like in thousand people living in it. To 553 00:34:02,080 --> 00:34:04,200 Speaker 1: make that town work, you've got to have like a 554 00:34:04,240 --> 00:34:09,400 Speaker 1: gas station, a grocery store, a library, um, whatever you 555 00:34:09,440 --> 00:34:12,720 Speaker 1: need to sustain that town. So all these things are built, 556 00:34:12,760 --> 00:34:16,680 Speaker 1: everyone's happy. You have equilibrium. He said, So that's great. 557 00:34:16,960 --> 00:34:20,360 Speaker 1: Then let's say you build some Someone comes and builds 558 00:34:20,360 --> 00:34:23,600 Speaker 1: a factory on the outskirts of that town, and there's 559 00:34:23,600 --> 00:34:25,839 Speaker 1: gonna be ten thousand more people living there, and they 560 00:34:25,840 --> 00:34:30,640 Speaker 1: don't go to church. Maybe so, uh did I say church? 561 00:34:30,680 --> 00:34:33,480 Speaker 1: They needed a church? Okay? I was just assuming this 562 00:34:33,520 --> 00:34:37,000 Speaker 1: is what's caum no, no, no, but you just have 563 00:34:37,080 --> 00:34:39,920 Speaker 1: more people. So there's you need another gas station and 564 00:34:39,920 --> 00:34:43,200 Speaker 1: another grocery store. Let's say so they build all these things, 565 00:34:43,280 --> 00:34:47,719 Speaker 1: and then you reach equilibrium. Again, it's maintained because you 566 00:34:47,760 --> 00:34:51,960 Speaker 1: build all these other systems up. That equilibrium is called 567 00:34:52,000 --> 00:34:57,480 Speaker 1: an attractor. Okay, So then he said it said, they 568 00:34:57,520 --> 00:35:05,000 Speaker 1: said he capital he the royal. He said, all right, 569 00:35:05,000 --> 00:35:07,480 Speaker 1: now let's say instead of that that factory being built, 570 00:35:08,160 --> 00:35:10,040 Speaker 1: and you have those original tin thous and let's say 571 00:35:10,040 --> 00:35:12,879 Speaker 1: three thousand. 
Those people just up and leave one day, 572 00:35:13,360 --> 00:35:16,040 Speaker 1: and the grocery store guy says, well, there's only seven 573 00:35:16,040 --> 00:35:18,239 Speaker 1: thousand people here, we need eight thousand people living here 574 00:35:18,239 --> 00:35:21,720 Speaker 1: to make a profit, so I'm shutting down this grocery store. 575 00:35:23,040 --> 00:35:25,600 Speaker 1: Then all of a sudden, you have demand for groceries. 576 00:35:26,520 --> 00:35:28,000 Speaker 1: So things go on for a little while, and someone 577 00:35:28,000 --> 00:35:29,840 Speaker 1: comes in and says, hey, this town needs a grocery store. 578 00:35:30,000 --> 00:35:33,719 Speaker 1: They build a grocery store, they can't sustain it, they shut down. 579 00:35:33,840 --> 00:35:36,640 Speaker 1: Someone else comes along because of the demand, and it is 580 00:35:36,680 --> 00:35:43,280 Speaker 1: this search for equilibrium, this dynamic. Well, you reach equilibrium 581 00:35:43,360 --> 00:35:46,839 Speaker 1: here and there as the store opens: periods of stability, 582 00:35:46,920 --> 00:35:51,480 Speaker 1: periods of instability. And that dynamic equilibrium is called a 583 00:35:51,560 --> 00:35:55,560 Speaker 1: strange attractor. So an attractor is the state which a 584 00:35:55,640 --> 00:36:00,680 Speaker 1: system settles on. A strange attractor is the trajectory on which 585 00:36:00,719 --> 00:36:04,640 Speaker 1: it never settles down but keeps trying to reach equilibrium, 586 00:36:04,680 --> 00:36:07,680 Speaker 1: with periods of stability. Man, does that make sense? That 587 00:36:07,800 --> 00:36:12,759 Speaker 1: Bible-based explanation was dynamite. I understand it better than 588 00:36:12,800 --> 00:36:17,160 Speaker 1: I did before, and I understood it okay before. That's great. 589 00:36:17,280 --> 00:36:22,279 Speaker 1: Surely you can add... yeah, yeah, now you're gonna add to it. No, 590 00:36:22,520 --> 00:36:26,320 Speaker 1: that's it. No, I mean, I like it. Yeah. An attractor 591 00:36:26,400 --> 00:36:30,439 Speaker 1: is where, if you graph something, it eventually reaches equilibrium; 592 00:36:30,440 --> 00:36:33,880 Speaker 1: that's a regular attractor. If it never reaches equilibrium, is 593 00:36:33,920 --> 00:36:36,840 Speaker 1: constantly trying to, and has periods of stability: strange attractor. 594 00:36:37,120 --> 00:36:39,719 Speaker 1: I can't... I can't top that. All right, grocery store, 595 00:36:39,760 --> 00:36:43,240 Speaker 1: small town. That was great. So, um, Lorenz... a strange 596 00:36:43,239 --> 00:36:47,279 Speaker 1: attractor was named the Lorenz attractor, after him. Big deal. 597 00:36:47,360 --> 00:36:50,759 Speaker 1: They weren't using the word chaos yet. No, but he 598 00:36:50,880 --> 00:36:54,800 Speaker 1: published that paper about butterfly wings, right, the butterfly effect, 599 00:36:55,160 --> 00:36:58,880 Speaker 1: and it, coupled with his pictures... the picture of a 600 00:36:58,920 --> 00:37:04,240 Speaker 1: strange attractor, which is, aside from fractals, almost 601 00:37:04,239 --> 00:37:10,080 Speaker 1: the, um, emblem or the logo for chaos theory, 602 00:37:10,120 --> 00:37:14,359 Speaker 1: the Lorenz attractor is... um, it got attention off the bat. 603 00:37:14,480 --> 00:37:17,280 Speaker 1: It wasn't like Poincaré's findings, where he got neglected 604 00:37:17,320 --> 00:37:20,600 Speaker 1: for seventy years.
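The grocery-store story maps onto numbers pretty directly. Here is a tiny Python sketch; both update rules are made up for illustration, not taken from the episode or from Lorenz. The first settles onto a single equilibrium value, a plain attractor; the second is the logistic map in its chaotic regime, which never settles and stands in here for the strange-attractor style of behavior.

    # The town analogy as two toy update rules (both invented for
    # illustration). The first settles onto one equilibrium value, an
    # attractor. The second keeps visiting new values forever, the
    # never-settling behavior a strange attractor describes.

    def settle(x):
        """Damped adjustment: each step closes half the gap to 10."""
        return x + 0.5 * (10.0 - x)

    def wander(x):
        """Logistic map at r = 4, a classic chaotic regime."""
        return 4.0 * x * (1.0 - x)

    a, b = 3.0, 0.3
    for step in range(1, 31):
        a, b = settle(a), wander(b)
        if step % 5 == 0:
            print(f"step {step:2d}: settle -> {a:.6f}   wander -> {b:.6f}")

The first column locks onto 10.000000 within a couple dozen steps and stays put; the second never repeats itself no matter how long you run it, which is the whole distinction.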
Almost immediately everybody was talking about this 605 00:37:20,920 --> 00:37:23,799 Speaker 1: because, again, what Lorenz had uncovered, which is the same 606 00:37:23,840 --> 00:37:27,800 Speaker 1: thing that Poincaré had uncovered, is that determinism is possibly 607 00:37:28,560 --> 00:37:32,640 Speaker 1: based on an illusion; that the universe isn't stable, that 608 00:37:32,719 --> 00:37:35,280 Speaker 1: the universe isn't predictable, and that what we are seeing 609 00:37:35,800 --> 00:37:39,800 Speaker 1: as stable and predictable are these little windows of 610 00:37:39,840 --> 00:37:44,120 Speaker 1: stability that are found in strange attractor graphs. That's 611 00:37:44,160 --> 00:37:46,120 Speaker 1: what we think the order of the universe is, but 612 00:37:46,200 --> 00:37:51,000 Speaker 1: that is actually the abnormal aspect of the universe, 613 00:37:51,200 --> 00:37:55,680 Speaker 1: and instability, unpredictability, as far as we're concerned, is 614 00:37:55,760 --> 00:37:59,319 Speaker 1: the actual state of affairs in nature. And I 615 00:37:59,360 --> 00:38:02,200 Speaker 1: think "as far as we're concerned" is a really important point 616 00:38:02,280 --> 00:38:09,200 Speaker 1: too, Chuck, because it doesn't mean that nature is unstable and chaotic. 617 00:38:09,719 --> 00:38:13,360 Speaker 1: It means that our picture of what we understand as 618 00:38:13,520 --> 00:38:18,600 Speaker 1: order doesn't jibe with how the universe actually functions. It's 619 00:38:18,640 --> 00:38:22,560 Speaker 1: just our understanding of it, and we're just so, um, 620 00:38:22,800 --> 00:38:26,680 Speaker 1: anthropocentric that, you know, we see it as chaos 621 00:38:26,760 --> 00:38:29,279 Speaker 1: and disorder and something to be feared, when really it's 622 00:38:29,320 --> 00:38:33,680 Speaker 1: just complexity that we don't have the capability of predicting 623 00:38:35,000 --> 00:38:37,400 Speaker 1: after a certain degree. Yeah, I think that makes me 624 00:38:37,440 --> 00:38:39,560 Speaker 1: feel a little better, because when you read stuff like this, 625 00:38:39,960 --> 00:38:42,400 Speaker 1: you start to feel like, well, the Earth could just 626 00:38:42,480 --> 00:38:45,320 Speaker 1: throw us all off of its face at any moment 627 00:38:46,000 --> 00:38:49,280 Speaker 1: because it starts spinning so fast that gravity becomes undone, 628 00:38:49,440 --> 00:38:50,960 Speaker 1: and I know that's not right. By the way, I've 629 00:38:50,960 --> 00:38:54,120 Speaker 1: always loved that kind of science that shows we don't 630 00:38:54,160 --> 00:38:57,440 Speaker 1: know anything. Like, uh, David Hume, who I know, I 631 00:38:57,480 --> 00:39:01,080 Speaker 1: understand, was a philosopher, but he was a philosopher-scientist. Um, 632 00:39:01,120 --> 00:39:03,680 Speaker 1: his whole jam was, like, cause and effect is an illusion; 633 00:39:04,320 --> 00:39:07,200 Speaker 1: that, like, it's just an assumption, 634 00:39:07,320 --> 00:39:09,680 Speaker 1: like that if you drop a pencil, it will always 635 00:39:09,680 --> 00:39:12,840 Speaker 1: fall down. It's an illusion. And this is pre-, um, 636 00:39:12,840 --> 00:39:16,720 Speaker 1: pre-understanding gravity. But he makes a good... Pre- 637 00:39:16,719 --> 00:39:20,439 Speaker 1: gravity, when everyone was just floating around. Yeah, going, this pencil's got 638 00:39:20,480 --> 00:39:23,080 Speaker 1: me wacky.
But the point was that, you know, 639 00:39:23,160 --> 00:39:27,000 Speaker 1: a lot of our assumptions, 640 00:39:27,640 --> 00:39:30,040 Speaker 1: um, or a lot of stuff that we take as 641 00:39:30,239 --> 00:39:33,239 Speaker 1: law, are actually based on assumptions that are made from 642 00:39:33,280 --> 00:39:36,600 Speaker 1: observations over time, and that we're just making predictions; that 643 00:39:36,800 --> 00:39:39,319 Speaker 1: cause and effect is an illusion. I love that guy, and 644 00:39:39,400 --> 00:39:45,240 Speaker 1: this definitely supports that idea for sure. Yeah. Sorry, 645 00:39:45,239 --> 00:39:48,440 Speaker 1: I'm excited about chaos theory. Can you believe it? Well, 646 00:39:48,480 --> 00:39:51,560 Speaker 1: I mean, I like that I'm able to understand it 647 00:39:51,719 --> 00:39:54,520 Speaker 1: in enough of a rudimentary way that I can talk 648 00:39:54,560 --> 00:39:59,080 Speaker 1: about it at a dinner party. Well, thank your Bible website. Well, 649 00:39:59,120 --> 00:40:02,839 Speaker 1: once you take the formulas out, yeah, for people like us, 650 00:40:03,280 --> 00:40:06,239 Speaker 1: we're like, okay, we can understand chaos. Yeah. Then when 651 00:40:06,239 --> 00:40:08,839 Speaker 1: somebody says, go do a differential equation, you're just like, 652 00:40:10,320 --> 00:40:14,239 Speaker 1: what's a differential equation? All right. So earlier I said that 653 00:40:14,360 --> 00:40:17,440 Speaker 1: the word chaos had not been used to describe 654 00:40:17,440 --> 00:40:21,440 Speaker 1: all this junk, uh, and that didn't happen until later on, 655 00:40:21,520 --> 00:40:24,880 Speaker 1: well, actually about ten years, you know, but it 656 00:40:24,920 --> 00:40:26,360 Speaker 1: was kind of at the same time this other stuff 657 00:40:26,440 --> 00:40:30,440 Speaker 1: was going on with Lorenz, late sixties, early seventies. 658 00:40:30,800 --> 00:40:35,480 Speaker 1: There was a guy named Stephen Smale, uh, Fields Medal recipient, 659 00:40:35,600 --> 00:40:40,040 Speaker 1: so you know he's good at math, and, um, he 660 00:40:40,760 --> 00:40:44,480 Speaker 1: describes something that we now know as the Smale horseshoe, 661 00:40:45,400 --> 00:40:50,279 Speaker 1: and it goes a little something like this. Uh, so, 662 00:40:50,320 --> 00:40:53,439 Speaker 1: all right, take a piece of dough, like, like bread dough, 663 00:40:54,080 --> 00:40:56,880 Speaker 1: and you smash it out into a big flat rectangle. 664 00:40:57,040 --> 00:40:59,800 Speaker 1: Can do. So you're looking at that thing and you're like, boy, 665 00:41:00,040 --> 00:41:02,080 Speaker 1: I hope this makes some good bread, this is gonna 666 00:41:02,120 --> 00:41:04,719 Speaker 1: be so good. So then you do a little rosemary 667 00:41:04,760 --> 00:41:07,720 Speaker 1: on it. Yeah, maybe so. Uh, well, sea salt. Yeah, 668 00:41:07,800 --> 00:41:10,640 Speaker 1: and then, um, lick it before you bake it, so 669 00:41:10,719 --> 00:41:14,040 Speaker 1: you know it's yours. No one else can have it. Uh, 670 00:41:14,080 --> 00:41:16,680 Speaker 1: so you have that flat rectangle of dough, you 671 00:41:16,800 --> 00:41:21,160 Speaker 1: roll it up into a tube, and then you smash 672 00:41:21,239 --> 00:41:23,560 Speaker 1: that down kind of flat, and then you bend that 673 00:41:23,640 --> 00:41:26,360 Speaker 1: down to where it eventually looks like a horseshoe. 674 00:41:27,000 --> 00:41:29,840 Speaker 1: So now you take that horseshoe.
You take another rectangle 675 00:41:29,880 --> 00:41:33,520 Speaker 1: of dough, and you throw that horseshoe onto that, and 676 00:41:33,520 --> 00:41:36,640 Speaker 1: then you do the same thing. The Smale horseshoe basically 677 00:41:36,680 --> 00:41:40,399 Speaker 1: says you cannot predict where the two points of that 678 00:41:40,480 --> 00:41:44,239 Speaker 1: horseshoe will end up. Yeah, you can roll it 679 00:41:44,560 --> 00:41:47,600 Speaker 1: a million times and they'll end up in a million 680 00:41:47,640 --> 00:41:52,080 Speaker 1: different places, totally random different places. Totally random. You 681 00:41:52,120 --> 00:41:54,440 Speaker 1: never know. It's like a box of chocolates. You never 682 00:41:54,440 --> 00:41:56,719 Speaker 1: know what you're gonna get. You have to say it, 683 00:41:57,040 --> 00:41:59,359 Speaker 1: and that became known... You have to say it. Oh, 684 00:41:59,440 --> 00:42:01,959 Speaker 1: imitate Forrest Gump. And I can't do that. 685 00:42:01,960 --> 00:42:04,040 Speaker 1: That's fine. He's not... he's not in my repertoire. 686 00:42:04,640 --> 00:42:07,239 Speaker 1: That's fine. Although I did see that again, part of 687 00:42:07,239 --> 00:42:10,560 Speaker 1: it, recently. Does it hold up? Well, I mean, take 688 00:42:10,560 --> 00:42:12,960 Speaker 1: out forty minutes of it and it would have been 689 00:42:13,000 --> 00:42:16,960 Speaker 1: a better movie, like all of that coincidence stuff that... 690 00:42:17,680 --> 00:42:20,640 Speaker 1: Oh, I love that. And also the smiley 691 00:42:20,760 --> 00:42:23,600 Speaker 1: t-shirt. Like, it was just too much. Like, he really 692 00:42:23,840 --> 00:42:28,640 Speaker 1: hammered it too much. That was the basis of the movie. 693 00:42:28,760 --> 00:42:30,680 Speaker 1: I know. But see it again, and I guarantee you, 694 00:42:30,760 --> 00:42:32,560 Speaker 1: like an hour and a half into it, you'll be like, 695 00:42:32,640 --> 00:42:36,879 Speaker 1: I get it, you know. It was a good Tom 696 00:42:36,920 --> 00:42:41,479 Speaker 1: Hanks movie that was overlooked. Uh, Road to Perdition. Yeah, 697 00:42:41,640 --> 00:42:43,640 Speaker 1: that one. That was a good one. Great Sam Mendes. 698 00:42:43,719 --> 00:42:47,040 Speaker 1: Oh man, that guy is awesome. Yeah. Oh, what 699 00:42:47,200 --> 00:42:49,879 Speaker 1: is he gonna do? He might do something... he did 700 00:42:49,880 --> 00:42:51,840 Speaker 1: the James Bond... he did Skyfall. Yeah, yeah, I know. 701 00:42:51,920 --> 00:42:54,120 Speaker 1: Also that last one, that wasn't so great. 702 00:42:54,320 --> 00:42:57,319 Speaker 1: He's got a potential project coming up, and he would 703 00:42:57,320 --> 00:42:59,560 Speaker 1: be amazing for it, and I don't remember what it was. 704 00:42:59,680 --> 00:43:04,040 Speaker 1: Did you see Revolutionary Road? Yes. God, it was 705 00:43:04,120 --> 00:43:06,799 Speaker 1: just like, yeah, you want to jump off a bridge? 706 00:43:06,840 --> 00:43:10,840 Speaker 1: Like, every five minutes during that movie. That was hardcore. 707 00:43:12,239 --> 00:43:14,200 Speaker 1: Uh, he did that one too, huh. Yeah. And don't 708 00:43:14,239 --> 00:43:16,200 Speaker 1: see that if you're, like, engaged to be married or 709 00:43:16,239 --> 00:43:19,319 Speaker 1: thinking about it, yeah, or if you're blue already.
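Back to the dough for a second: flatten, stretch, fold it back on itself, repeat, and neighbors get scrambled. You can watch that happen with a one-dimensional stand-in. The Python sketch below uses the tent map, a standard toy version of stretch-and-fold; it illustrates the mechanism, not Smale's actual two-dimensional construction.

    # Stretch-and-fold in one dimension. The tent map stretches the
    # unit interval to double length and folds it back at the middle,
    # so distances roughly double with every fold. Two points that
    # start a hair apart end up anywhere at all. A stand-in for the
    # mechanism of the Smale horseshoe, not his 2-D construction.

    def tent(x):
        """One stretch-and-fold of the unit interval."""
        return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

    p, q = 0.2000000, 0.2000001   # two neighbors in the dough
    for fold in range(1, 31):
        p, q = tent(p), tent(q)
        if fold % 10 == 0:
            print(f"after {fold:2d} folds: {p:.4f} vs {q:.4f}")

A gap of one ten-millionth roughly doubles with every fold, so by fold thirty the two points have nothing to do with each other anymore, which is the horseshoe's whole point.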
Yeah, 710 00:43:19,480 --> 00:43:22,319 Speaker 1: if you want to take a really good mood and 711 00:43:22,360 --> 00:43:24,160 Speaker 1: be like, I'm sick of being in a good mood, 712 00:43:24,440 --> 00:43:27,520 Speaker 1: sit down and watch Revolutionary Road. Watch Joe Versus the 713 00:43:27,560 --> 00:43:32,880 Speaker 1: Volcano instead. Uh, where was I? Smale horseshoe is what 714 00:43:33,000 --> 00:43:37,800 Speaker 1: that's called. And, um, that was... he was the first 715 00:43:37,800 --> 00:43:40,560 Speaker 1: person to actually use the word chaos. Oh, he was? 716 00:43:40,920 --> 00:43:44,719 Speaker 1: I think so. No? No, no, Yorke was. Thom Yorke's dad. Yeah, 717 00:43:44,800 --> 00:43:46,640 Speaker 1: you're right, he wasn't the first person. Yorke, correct. 718 00:43:46,760 --> 00:43:49,880 Speaker 1: But Smale's horseshoe illustrates a really good point, Chuck. 719 00:43:50,040 --> 00:43:54,799 Speaker 1: Is it Thom Yorke's dad? Okay. No, but they're both British. Sure. 720 00:43:54,960 --> 00:44:01,080 Speaker 1: Yorke is... actually, one's Australian. No, they're British. Um. So, uh, 721 00:44:01,200 --> 00:44:04,680 Speaker 1: those two points, which started out right by 722 00:44:04,680 --> 00:44:07,120 Speaker 1: each other, end up in two totally different places. 723 00:44:07,680 --> 00:44:10,640 Speaker 1: That applies not just to bread dough, but also to 724 00:44:10,760 --> 00:44:13,960 Speaker 1: things like water molecules that are right next to each 725 00:44:14,000 --> 00:44:17,640 Speaker 1: other at some point, and then, uh, a month later they're 726 00:44:17,640 --> 00:44:20,680 Speaker 1: in two different oceans. Even though you would assume that 727 00:44:20,719 --> 00:44:22,960 Speaker 1: they would go through all the same motions and everything, 728 00:44:23,560 --> 00:44:25,839 Speaker 1: they don't. There's so many different variables with things 729 00:44:25,920 --> 00:44:29,640 Speaker 1: like ocean currents that two water molecules that were once 730 00:44:29,680 --> 00:44:33,320 Speaker 1: side by side end up in totally random different places. 731 00:44:34,040 --> 00:44:38,000 Speaker 1: And that's part of chaos. It's basically chaos personified, or 732 00:44:38,120 --> 00:44:44,279 Speaker 1: chaos molecule-fied. So we mentioned Yorke. Where I was 733 00:44:44,320 --> 00:44:47,520 Speaker 1: going with that was, um, there was an Australian named 734 00:44:47,640 --> 00:44:52,200 Speaker 1: Robert May, and he was a population biologist. So he 735 00:44:52,280 --> 00:44:56,080 Speaker 1: was using math to model how animal populations would change 736 00:44:56,080 --> 00:45:01,000 Speaker 1: over time, given certain starting conditions. Uh, so he started 737 00:45:01,080 --> 00:45:05,879 Speaker 1: using, uh, these equations, difference equations, and he came 738 00:45:05,960 --> 00:45:08,600 Speaker 1: up with a formula known as the logistic difference equation 739 00:45:09,360 --> 00:45:14,760 Speaker 1: that basically enabled him to predict these animal populations pretty well. Yeah, 740 00:45:14,800 --> 00:45:17,279 Speaker 1: and it was working pretty well for a while, but 741 00:45:17,400 --> 00:45:22,239 Speaker 1: he noticed something really, really weird, right? He had this formula, um, 742 00:45:22,960 --> 00:45:26,800 Speaker 1: the logistic difference equation is the name of it.
Sure. Okay, 743 00:45:26,840 --> 00:45:30,719 Speaker 1: so he had that formula, and he figured out that 744 00:45:30,760 --> 00:45:33,560 Speaker 1: if you took R, which in this case was the 745 00:45:33,600 --> 00:45:38,040 Speaker 1: reproductive rate of an animal population, and you pushed it 746 00:45:38,160 --> 00:45:41,560 Speaker 1: past three, the number three, so that meant that the 747 00:45:41,680 --> 00:45:48,080 Speaker 1: average animal in this population of animals had three offspring 748 00:45:48,239 --> 00:45:51,600 Speaker 1: in its lifetime, or in a season, whatever. If you 749 00:45:51,680 --> 00:45:54,480 Speaker 1: pushed it past three, all of a sudden the number 750 00:45:55,480 --> 00:45:59,800 Speaker 1: of the population would diverge. If you pushed it equal 751 00:45:59,800 --> 00:46:03,680 Speaker 1: to three, actually, or more, right, it would diverge, which 752 00:46:03,719 --> 00:46:07,239 Speaker 1: is weird, because a population of animals can't be two 753 00:46:07,239 --> 00:46:11,000 Speaker 1: different numbers, you know. Like, that herd of antelope, 754 00:46:11,000 --> 00:46:14,000 Speaker 1: there's not thirty but also forty-five of 755 00:46:14,000 --> 00:46:17,040 Speaker 1: them at the same time. That's called a superposition, 756 00:46:17,360 --> 00:46:20,960 Speaker 1: and that has to do with quantum states, not herds 757 00:46:21,000 --> 00:46:24,640 Speaker 1: of antelope. So that was kind of weird. And then he 758 00:46:24,640 --> 00:46:26,440 Speaker 1: found, if you pushed it a little further, if you 759 00:46:26,520 --> 00:46:30,560 Speaker 1: made the reproductive rate like three point oh five seven 760 00:46:30,640 --> 00:46:33,680 Speaker 1: or something like that, I think it was a different number, 761 00:46:34,120 --> 00:46:35,799 Speaker 1: but you just tweaked it a little bit, not even 762 00:46:35,840 --> 00:46:39,120 Speaker 1: to four, we're talking like millionths of a, of a, 763 00:46:39,360 --> 00:46:43,120 Speaker 1: um, of a degree, um... all of a sudden it 764 00:46:43,160 --> 00:46:46,040 Speaker 1: would turn into four, so there'd be four different numbers 765 00:46:46,120 --> 00:46:48,480 Speaker 1: for what the animal population was, and then it would turn 766 00:46:48,520 --> 00:46:50,480 Speaker 1: into eight, then sixteen. And then all of a sudden, after a 767 00:46:50,480 --> 00:46:53,239 Speaker 1: certain point, it would turn into chaos. The number would 768 00:46:53,280 --> 00:46:55,680 Speaker 1: be everything at once, all over the place, just totally 769 00:46:55,800 --> 00:47:01,040 Speaker 1: random numbers that it oscillated between. But in all that chaos, 770 00:47:01,080 --> 00:47:03,640 Speaker 1: there would be periods of stability. Right, you push it 771 00:47:03,680 --> 00:47:05,319 Speaker 1: a little further and all of a sudden it would 772 00:47:05,360 --> 00:47:08,080 Speaker 1: just go to two again. But it didn't 773 00:47:08,120 --> 00:47:10,719 Speaker 1: go back to the original two numbers; it went to 774 00:47:10,760 --> 00:47:12,840 Speaker 1: another two. So if you looked at it on a graph, 775 00:47:13,200 --> 00:47:16,640 Speaker 1: it went: line, divided into two, divided into four, eight, 776 00:47:16,800 --> 00:47:23,400 Speaker 1: sixteen, chaos; two, four, eight, sixteen, chaos, all before you 777 00:47:23,440 --> 00:47:27,359 Speaker 1: even got to the number four of the reproductive rate.
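What May was seeing is easy to reproduce. Below is a minimal Python sketch using the usual textbook form of his logistic difference equation, x_next = r * x * (1 - x); the r values are the standard approximate bifurcation points from the literature, not numbers quoted in the episode.

    # Robert May's experiment in miniature: iterate the logistic map
    # past its transient, then count how many distinct population
    # values it keeps cycling through at each reproductive rate r.
    # The r values below are standard approximate bifurcation points.

    def long_run_values(r, x=0.5, settle=2000, sample=64):
        """Iterate past the transient, then collect the values visited."""
        for _ in range(settle):
            x = r * x * (1.0 - x)
        seen = set()
        for _ in range(sample):
            x = r * x * (1.0 - x)
            seen.add(round(x, 6))
        return sorted(seen)

    for r in (2.8, 3.2, 3.5, 3.56, 3.7, 3.83):
        vals = long_run_values(r)
        label = f"{len(vals)} value(s)" if len(vals) <= 16 else "chaos"
        print(f"r = {r:4.2f}: {label}")

You get one value at 2.8, then two, four, and eight as r creeps up, a smear of values by 3.7, and then, inside the chaos, a window of stability near 3.83 where the population settles onto just three values: the same order-inside-disorder just described.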
Yeah, 778 00:47:27,360 --> 00:47:30,239 Speaker 1: and he was working with Mr. Yorke, because he was 779 00:47:30,280 --> 00:47:32,880 Speaker 1: a little confounded. So he was a mathematician buddy of his, 780 00:47:33,719 --> 00:47:36,319 Speaker 1: James Yorke from the University of Maryland, so they worked 781 00:47:36,320 --> 00:47:39,239 Speaker 1: together on this. In nineteen seventy-five they 782 00:47:39,400 --> 00:47:44,760 Speaker 1: co-authored a paper called Period Three Implies Chaos. And man, 783 00:47:45,200 --> 00:47:48,920 Speaker 1: finally somebody said the word. I kept thinking it was 784 00:47:48,920 --> 00:47:51,759 Speaker 1: all these other people. Yeah, and this paper, 785 00:47:51,800 --> 00:47:57,080 Speaker 1: where they first debuted the name chaos, um, they 786 00:47:57,200 --> 00:48:01,640 Speaker 1: based it, um... Yorke based it on Edward 787 00:48:01,719 --> 00:48:04,440 Speaker 1: Lorenz's paper. He was like, you know what, I have 788 00:48:04,520 --> 00:48:07,040 Speaker 1: a feeling this has something to do with the Lorenz attractor. 789 00:48:07,440 --> 00:48:11,680 Speaker 1: So that, um, that provided chaos to the world, 790 00:48:11,719 --> 00:48:15,720 Speaker 1: and it was basically the third 791 00:48:15,719 --> 00:48:20,920 Speaker 1: time a scientist had said, we don't understand the universe 792 00:48:21,040 --> 00:48:24,200 Speaker 1: like we think we do, and determinism is based on 793 00:48:24,239 --> 00:48:31,239 Speaker 1: an illusion of order in a really chaotic universe. And this, uh, 794 00:48:31,360 --> 00:48:33,759 Speaker 1: this established chaos. It took off like a rocket in 795 00:48:33,800 --> 00:48:36,000 Speaker 1: the eighties and the nineties. You know, as you know 796 00:48:36,040 --> 00:48:39,759 Speaker 1: from Jurassic Park, chaos was everything. Everybody's like, chaos, this 797 00:48:39,840 --> 00:48:42,920 Speaker 1: is totally awesome, it's the new frontier of science. And 798 00:48:42,920 --> 00:48:45,120 Speaker 1: then it just went... it just went away. And a 799 00:48:45,120 --> 00:48:48,359 Speaker 1: lot of people said, well, it was a little overhyped. 800 00:48:48,960 --> 00:48:51,600 Speaker 1: But I think more than anything, and I think this 801 00:48:51,640 --> 00:48:53,600 Speaker 1: is kind of the current understanding of chaos, because it 802 00:48:53,640 --> 00:48:56,200 Speaker 1: didn't actually go away, it became a deeper and deeper field, 803 00:48:56,400 --> 00:49:01,759 Speaker 1: as you'll see, um, people mistook what chaos meant. 804 00:49:02,000 --> 00:49:06,480 Speaker 1: It wasn't a new type of science. 805 00:49:06,719 --> 00:49:09,040 Speaker 1: It was a new understanding of the universe. It was 806 00:49:09,080 --> 00:49:12,279 Speaker 1: saying, like, yes, you can still use Newtonian physics, like, 807 00:49:12,320 --> 00:49:14,960 Speaker 1: don't throw everything out the window. You can still try 808 00:49:15,000 --> 00:49:17,040 Speaker 1: and predict weather and still try and build more accurate 809 00:49:17,080 --> 00:49:21,240 Speaker 1: instruments and get, you know, decent results, but you can't, 810 00:49:21,680 --> 00:49:28,400 Speaker 1: with absolute perfection, predict complex systems. Like, determinism, the ultimate 811 00:49:28,480 --> 00:49:32,160 Speaker 1: goal of determinism, is false.
It can never be... it 812 00:49:32,200 --> 00:49:34,719 Speaker 1: can never be done, because we can't have an infinitely 813 00:49:34,760 --> 00:49:38,239 Speaker 1: precise measurement for every variable, or any variable. Therefore, we 814 00:49:38,239 --> 00:49:41,400 Speaker 1: can't predict these outcomes. Right. So you would expect science 815 00:49:41,440 --> 00:49:44,879 Speaker 1: to be like, what's the point? What's the point of anything? No, 816 00:49:45,000 --> 00:49:49,200 Speaker 1: not science. Well, some chaos people have said, no, 817 00:49:49,320 --> 00:49:53,080 Speaker 1: this is great, this is good. We'll take this. 818 00:49:53,400 --> 00:49:56,520 Speaker 1: We'll take the universe as it is, rather than trying 819 00:49:56,560 --> 00:49:59,760 Speaker 1: to force it into our pretty little equations and saying, 820 00:50:00,400 --> 00:50:04,080 Speaker 1: if the ocean temperature is this at this time of year, uh, 821 00:50:04,120 --> 00:50:06,840 Speaker 1: and the fish population is this at that time, then 822 00:50:07,160 --> 00:50:10,160 Speaker 1: this is how many offspring this fish 823 00:50:10,200 --> 00:50:15,040 Speaker 1: population is going to have. Instead, say, okay, here is 824 00:50:15,080 --> 00:50:18,640 Speaker 1: the fish population, here is the ocean temperature, here are 825 00:50:18,640 --> 00:50:21,279 Speaker 1: all these other variables. Let's feed it into a model and 826 00:50:21,320 --> 00:50:25,319 Speaker 1: see what happens. Not: this is going to happen. What 827 00:50:25,560 --> 00:50:29,080 Speaker 1: happens instead, and this is kind of the understanding of 828 00:50:29,160 --> 00:50:33,520 Speaker 1: chaos theory now, is taking raw data, as much data 829 00:50:33,520 --> 00:50:36,120 Speaker 1: as you can possibly get your hands on, as precise 830 00:50:36,239 --> 00:50:38,000 Speaker 1: data as you can possibly get your hands on, and 831 00:50:38,040 --> 00:50:41,520 Speaker 1: just feeding it into a model and seeing what patterns emerge. 832 00:50:41,880 --> 00:50:45,600 Speaker 1: Rather than making assumptions, it's saying, what's the outcome? What 833 00:50:45,680 --> 00:50:48,120 Speaker 1: comes out of this model? Yeah, and that's why, like, 834 00:50:48,800 --> 00:50:50,960 Speaker 1: when you see some things like, you know, fifty years 835 00:50:51,000 --> 00:50:55,640 Speaker 1: ago they predicted this animal would be extinct by now and it's not, well, 836 00:50:55,719 --> 00:51:00,640 Speaker 1: it's because the variations they tried to predict were too complex. Uh, 837 00:51:00,680 --> 00:51:06,239 Speaker 1: and that's why, if you look at a ten-day forecast, you, sir, 838 00:51:06,280 --> 00:51:10,319 Speaker 1: are a fool. All right, it's true. Well, ten days 839 00:51:10,320 --> 00:51:12,320 Speaker 1: from now, it says it's going to rain in the afternoon. 840 00:51:12,840 --> 00:51:15,520 Speaker 1: Come on. But if you took enough 841 00:51:15,600 --> 00:51:19,319 Speaker 1: variables for weather, for like a city, and fed it 842 00:51:19,360 --> 00:51:22,799 Speaker 1: into a model of the weather for that city, you 843 00:51:22,840 --> 00:51:28,000 Speaker 1: could find, uh, you could find a time when it 844 00:51:28,040 --> 00:51:30,560 Speaker 1: was similar to what it is now, and you could 845 00:51:30,600 --> 00:51:34,279 Speaker 1: conceivably make some assumptions based on that.
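That trick of finding a similar past state and reading off what followed is known as analog forecasting, an idea Lorenz himself studied, and the conversation picks the thread right back up below. Here is a minimal Python sketch of the mechanism only; the toy temperature series and the function names are invented for illustration, and real versions match a whole recent window of conditions across many variables, not a single number.

    # Analog forecasting in miniature: find the past day most similar
    # to today, then use what happened after that day as the forecast.
    # The "weather history" is a made-up toy series; only the
    # mechanism matters here.

    import math

    def toy_history(n=500):
        """Fake daily temperatures: a seasonal cycle plus a wobble."""
        return [20 + 10 * math.sin(i / 58.0) + 3 * math.sin(i / 7.3)
                for i in range(n)]

    def analog_forecast(history, today, horizon=3):
        """Forecast by copying what followed the closest past match."""
        candidates = range(len(history) - horizon)
        best = min(candidates, key=lambda i: abs(history[i] - today))
        return history[best + 1 : best + 1 + horizon]

    temps = toy_history()
    print(analog_forecast(temps, today=24.7))

There is no equation for the weather anywhere in it; the forecast is just whatever followed the best historical match, which is exactly the let-the-data-speak posture being described.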
You can say, well, 846 00:51:34,360 --> 00:51:37,160 Speaker 1: actually we can predict a little further out 847 00:51:37,200 --> 00:51:41,040 Speaker 1: than we think. But, um, it's based on this 848 00:51:41,040 --> 00:51:45,879 Speaker 1: theory, this understanding of chaos, of unpredictability, of not 849 00:51:46,000 --> 00:51:51,000 Speaker 1: forcing nature into our formulas, but putting data 850 00:51:51,040 --> 00:51:53,560 Speaker 1: into a model and seeing what comes out of it. Yeah. 851 00:51:53,600 --> 00:51:55,400 Speaker 1: And then at the end of that, you learn, like, 852 00:51:55,440 --> 00:51:58,560 Speaker 1: when that animal is not extinct like you thought it 853 00:51:58,600 --> 00:52:00,279 Speaker 1: would be, you go back and look at the general 854 00:52:00,360 --> 00:52:03,399 Speaker 1: thing and you have a more accurate picture of how 855 00:52:03,440 --> 00:52:06,120 Speaker 1: the, you know, data could have been off slightly on this 856 00:52:06,280 --> 00:52:12,440 Speaker 1: one value, and then you have more buffalo than you think. Yeah, 857 00:52:12,719 --> 00:52:15,680 Speaker 1: sure, you got buffaloed by chaos. And we're not even 858 00:52:15,680 --> 00:52:18,239 Speaker 1: getting into fractals. It's a whole other thing. And we 859 00:52:18,280 --> 00:52:23,319 Speaker 1: did a whole other podcast in June about fractals and 860 00:52:23,680 --> 00:52:28,279 Speaker 1: Mandel... Benoit... Mandelbrot. Mandelbrot. Mandelbrot. And 861 00:52:28,640 --> 00:52:31,480 Speaker 1: go listen to that one and hear me clinging to 862 00:52:31,480 --> 00:52:35,240 Speaker 1: the edge of a cliff. Cliff, man. We should 863 00:52:35,320 --> 00:52:38,239 Speaker 1: end this, but first, um, I want to say there 864 00:52:38,320 --> 00:52:42,000 Speaker 1: is a really interesting article, it's pretty understandable, on Quanta 865 00:52:42,239 --> 00:52:49,120 Speaker 1: Magazine about a guy named George Sugihara, and he is 866 00:52:49,200 --> 00:52:54,040 Speaker 1: a chaos theory dude who's got a whole lab and 867 00:52:54,160 --> 00:52:57,160 Speaker 1: is applying it to real life. So it's a really 868 00:52:57,200 --> 00:53:04,239 Speaker 1: good picture of chaos theory in action. Go check it out. Okay, uh, 869 00:53:04,280 --> 00:53:07,680 Speaker 1: if you want to know more about chaos theory, I 870 00:53:07,680 --> 00:53:09,799 Speaker 1: hope your brain is not broken. Yeah, go take some 871 00:53:09,920 --> 00:53:14,120 Speaker 1: LSD and look at... don't do that. Um, you can 872 00:53:14,200 --> 00:53:17,879 Speaker 1: type those words into HowStuffWorks in the search bar, 873 00:53:18,000 --> 00:53:21,360 Speaker 1: any of those: fractals, LSD, chaos. It'll bring up some 874 00:53:21,440 --> 00:53:24,320 Speaker 1: good stuff. And since I said good stuff, it's time 875 00:53:24,320 --> 00:53:29,080 Speaker 1: for Listener Mail. Now, I'm gonna call this Rare Shout-Out. 876 00:53:30,120 --> 00:53:32,000 Speaker 1: We get requests all the time. I bet I know 877 00:53:32,080 --> 00:53:36,520 Speaker 1: which one it is. Really? Yeah, dude and his girlfriend. Yeah. No. 878 00:53:37,400 --> 00:53:40,239 Speaker 1: So far, so good. Hey, guys, just want to say 879 00:53:40,239 --> 00:53:41,959 Speaker 1: I think you're doing a wonderful job with the show 880 00:53:42,360 --> 00:53:45,120 Speaker 1: to date. My first time listening was during my 881 00:53:45,160 --> 00:53:49,719 Speaker 1: first deployment.
Uh, yeah. When I listened to your list 882 00:53:49,960 --> 00:53:53,520 Speaker 1: on famous and influential films, I was hooked after that. 883 00:53:53,840 --> 00:53:55,719 Speaker 1: Since I came back stateside, I've spent many hours 884 00:53:55,760 --> 00:53:59,840 Speaker 1: driving to and fro, uh, to see my girlfriend, to my barracks, 885 00:54:00,280 --> 00:54:02,759 Speaker 1: and I can happily say that they've been made all 886 00:54:02,800 --> 00:54:06,520 Speaker 1: the more enjoyable by listening to you guys. Even my 887 00:54:06,560 --> 00:54:09,200 Speaker 1: girlfriend Rachel has warmed up to you dudes, which was 888 00:54:09,239 --> 00:54:11,600 Speaker 1: not a pleasant... I'm sorry, which was a pleasant shock 889 00:54:11,640 --> 00:54:14,200 Speaker 1: to me, as she has told me repeatedly that she 890 00:54:14,840 --> 00:54:18,600 Speaker 1: cannot listen to audiobooks because, quote, hearing people talk on 891 00:54:18,600 --> 00:54:21,800 Speaker 1: the radio gives me a headache, end quote. Anyway, I 892 00:54:21,800 --> 00:54:24,319 Speaker 1: hope you guys continue to make awesome podcasts as I'm 893 00:54:24,360 --> 00:54:26,839 Speaker 1: headed out on my next deployment. And if you could 894 00:54:26,840 --> 00:54:28,479 Speaker 1: give a shout out to Rachel, I'm sure it would 895 00:54:28,480 --> 00:54:30,560 Speaker 1: make her feel a little better that I got the 896 00:54:30,600 --> 00:54:33,799 Speaker 1: pleasant people on the podcast to reaffirm how much I 897 00:54:33,880 --> 00:54:38,880 Speaker 1: love her. That is John. Rachel, hang in there. John, be 898 00:54:39,000 --> 00:54:42,920 Speaker 1: safe, and, uh, thanks for listening. Yeah, man, thank you. 899 00:54:42,960 --> 00:54:44,840 Speaker 1: That's a great email. I love that one. Glad we 900 00:54:44,840 --> 00:54:47,120 Speaker 1: don't give you a headache, Rachel. Yeah, if she listened 901 00:54:47,120 --> 00:54:49,400 Speaker 1: to this one, she's like, okay... oh yeah, everybody's 902 00:54:49,440 --> 00:54:51,520 Speaker 1: gonna get a headache from this one. Like, I 903 00:54:51,680 --> 00:54:53,600 Speaker 1: came to hate the sound of my own voice from 904 00:54:53,600 --> 00:54:57,600 Speaker 1: this one. Aw. You'll be all right. If you want 905 00:54:57,600 --> 00:54:59,120 Speaker 1: to get in touch with us, you can hang out 906 00:54:59,160 --> 00:55:02,040 Speaker 1: with us on Twitter at SYSK Podcast. Same 907 00:55:02,120 --> 00:55:04,680 Speaker 1: goes for Instagram. You can hang out with us at Facebook 908 00:55:04,719 --> 00:55:06,799 Speaker 1: dot com slash Stuff You Should Know. You can send 909 00:55:06,880 --> 00:55:09,360 Speaker 1: us an email to stuffpodcast at howstuffworks 910 00:55:09,400 --> 00:55:11,440 Speaker 1: dot com, and as always, join us at home on 911 00:55:11,480 --> 00:55:17,640 Speaker 1: the web, StuffYouShouldKnow dot com. Stuff You 912 00:55:17,640 --> 00:55:19,880 Speaker 1: Should Know is a production of iHeartRadio's How Stuff 913 00:55:19,920 --> 00:55:22,359 Speaker 1: Works. For more podcasts from iHeartRadio, visit 914 00:55:22,360 --> 00:55:25,040 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen 915 00:55:25,080 --> 00:55:25,960 Speaker 1: to your favorite shows.