This episode of Stuff You Should Know is sponsored by Squarespace. Whether you need a landing page, a beautiful gallery, a professional blog, or an online store, it's all possible with a Squarespace website. Go to squarespace dot com and set your website apart.

Welcome to Stuff You Should Know from HowStuffWorks dot com.

Hey, and welcome to the podcast. I'm Josh Clark, with Charles W. "Chuck" Bryant, and there's Jerry over there. So this is Stuff You Should Know, the podcast about chaos theory. Like, have you ever seen Event Horizon? I did. Not bad. Great movie. Are you crazy? I don't think it was great. Oh. I mean, I thought it was okay. It was like a Lovecraftian thing in outer space. Loved it. All right, I Lovecrafted it. I liked it. That's what I think of when I think of chaos. You know, there's that one part where they kind of give you a glimpse behind the dimension that the action is taking place in, to see the chaos underneath. You should check that out again. I think about Jurassic Park and Jeff Goldblum as the creep Dr. Malcolm, explaining chaos in the little auto-driving SUV or whatever that was. Yeah, that's what it was called in the script, the auto-driving SUV scene. Yeah, and you know what, I actually rewatched that scene, and it confirmed two things. One is that he actually did a pretty decent job, for a Hollywood movie, with a very rudimentary explanation of chaos. And you watched it for this? Yeah, yeah, just that scene. And then it also confirmed what a creep that character was. Yeah, if you watch that scene, he's all gross and flirty with her right in front of her ex. He's just talking to her, and, I didn't even notice this at first, he just touches her hair out of nowhere for no reason. He's just talking to her, and he just grabs her hair and touches it. And I'm like, what a creep.
I know. If you look closely, you can see the pheromones emerging through his chest hair. Yeah, and I love Jeff Goldblum. It's not a reflection on him. He was basically doing Jeff Goldblum. Well, that's what... yeah, sure, he's Jeff Goldblum in the manner in which he speaks. But I don't think he's a creep, do you? Wow. I've got nothing against Jeff Goldblum. Okay, I think he's doing Jeff Goldblum. It was also a sign of the times. Like, if that movie were made today, Doctor, what was her name in the movie? Yeah, Dr. Sattler would be like, it's very inappropriate to stroke my hair. Like, don't touch me. But this was the nineties. Was it ninety-eight? No, it was the early-to-mid nineties. Other thing: the book came out, and in the book, Ian Malcolm, who's a chaotician... A creep chaotician, right. He goes into even more depth about chaos. But that was, I mean, that was the first time I ever heard of chaos theory, was from Jurassic Park, and it was really misleading. I think the entire term chaos is very misleading as far as the general public goes, from what I researched for this article. Well, yeah, I mean, you hear the word chaos as an English speaker and you think frenetic, crazy, out of control. Yeah, and that's not what it means in terms of science, right. What it means, I guess we can say up front, is basically the idea that complex systems do not behave in very neat ways that we can easily grasp, understand, and measure. Right, and not even... even simple systems don't, sometimes. It doesn't always have to be complex. But I want to give a shout-out, in addition to our own article, to... you know, when it comes to stuff like this, the brain-breaking stuff, and for me this is a brain-breaker, you know how I always go to, like, blank-blank-for-kids sites, because it always helps.
If there's a dinosaur mascot on the page, it's a sure thing we can understand it. But the best explanation for all this stuff that I found on the internet was from a website called Abarim, A-B-A-R-I-M, Publications, which turns out to be a website about biblical patterns. And sandwiched in the middle, there is a really great, easy-to-understand series of pages on chaos. So I was like, man, I get it now, in a rudimentary way. Right. Well, yeah, I think even a lot of people who deal with systems that display chaotic behavior, which I guess is to say basically all systems, eventually, under the right conditions, don't necessarily understand chaos. Yeah. And they defined a complex system specifically. It doesn't mean just, oh, it's complex. I mean, it is, but specifically, they define it in a way that helped me understand: it's a system that has so much motion, so many elements that are in motion... Moving parts. Yeah, that it takes, like, a computer to calculate all the possibilities of what it could look like five minutes from now, ten years from now. So before computers came around, before the quantum mechanical revolution, it was a lot more basic. It was, like, what goes up must come down, stuff like that. Let's talk about that, Chuckers, because when you're talking about chaos theory, it helps to understand how it revolutionized science by getting a clear picture of how we understood the universe leading up to the discovery of chaos. Right. So, prior to the scientific revolution, everybody was like, oh, well, it's God. The Earth is at the center of the universe and God is spinning everything around like a top. Right, it was all a theistic explanation. Then the scientific revolution happens, and people start applying things like math, making mathematical discoveries, and figuring out that there's order.
They're finding order and patterns and predictability in the universe if you can apply mathematics to it, specifically if you can apply mathematics to the starting point. Right, right. So if you can figure out how a system works, mathematically speaking, you can go in, plug in whatever coordinates you want, and watch it go. You can predict what the outcomes will be. And this is based on what at the time was a totally revolutionary idea. Initially, I think Descartes was the first one to kind of say cause and effect is a pretty big part of our universe, right. Yeah. It was sort of... this is the sixteen hundreds, where early science met philosophy. They kind of complemented one another, as far as what we're talking about: determinism, right. So those were kind of the seeds of determinism, the scientific revolution, and, like you said, where philosophy and science came together in the form of Descartes. Right, and then Newton came along, and we did a whole episode on him. Yeah, January of this year. That was a good one. It was really good. Like, I think you said in that episode that there's possibly no scientist that changed the world more than Newton has. He's got legs. People shouted out others in email, but I'll just say he's at or near the top for sure, with some other people. The Cream. Yeah, so Newton came along, and Newton said... That was his name, Isaac "The Cream" Newton. Anytime he dunked, it'd be like, creamed! Yeah, you just got creamed. I thought he was a boxer. He's a basketball player. He was much more well known as a boxer, but he definitely could dunk as a B-baller. So... man, that threw me off a little bit. Yeah, the Cream comes along and he basically says, watch this, dude: this cause-and-effect thing you're talking about, I can express it in quantifiable terms.
And he comes up with all of these great laws and basically sets the stage, the foundation, for science for the next three centuries or so. Yeah, these laws were so rock solid and powerful that scientists kind of got ahead of themselves a little and said, we're done. Like, with Newton's laws, we can predict everything, if we have a good enough, accurate beginning value to plug into his equations. And they weren't done. I think there was a little hubris, and a little just excitement, like, well, we figured it all out. Right, that you could take Newton's laws, and if you had accurate enough measurements, you could predict what the outcome would be of the system that you plug those measurements into, using these formulas. And at the time, a lot of this was planetary. Like, well, we know that these planets are here and they're moving and they're orbiting, so if we know these things, we can plug them into an equation and figure out what it's going to be like in a hundred years. Exactly. And they figured out the basis of determinism, which is what we just said: that if you have accurate measurements, you can take those measurements and use them to predict how a system is going to change over time, using differential equations. Right. Yeah, so this is what Newton comes along and figures out: that you can describe the universe in these mathematical terms, using differential equations. And like you said, there was a tremendous amount of hubris. Well, I think you said there was some hubris. I think there was a tremendous amount of hubris, where science basically said, we've mastered the universe, we've uncovered the blueprint of the universe, and now we understand everything. It's just a matter now of getting our scientific measurements more and more and more exact.
Because, again, the hallmark of determinism is that if you have exact measurements, you can predict an outcome accurately, like the pool cue example, or the pool table example. Right, right. So if you've got a pool table, let's say you're playing some nine-ball. You have that beautiful little diamond set up, you've got your cue ball, you place that cue ball and you crack it with the cue, and if you are super accurate with your initial measurements, you should be able to mathematically plot out, via angles, where the balls will end up. Right, exactly. Like, you can say, this is what the table will look like after the break, if you know the force, the angle, all those little variables: temperature, whether there's wind in the room, the felt on the table, everything. The more specific you are, the more accurate your end result will be. Right. And then one of the other hallmarks of determinism is that if you take those exact same initial conditions and do them again, the pool table will look exactly the same after the break. Which is pretty much impossible for a human to do with their hands. Sure, but the idea at the time in science was that if you could build a perfect machine that could recreate these conditions, it would happen the same way every time. Right. And this, I mean, they had hubris, but you could understand it, when, literally, in 1846, two people predicted, within months of each other, that Neptune would exist. And this is not by looking up in the sky. Like, they did it with math, and they were right. So imagine in 1846, when that happens, they're like, yeah, we've kind of got the math down, so we're pretty much all-knowing. Well, plus also, for the most part, not just with Neptune, they were finding that this stuff really panned out.
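To make that determinist picture concrete, here's a minimal sketch, not from the episode, of a Newtonian system in Python. The projectile, step size, and all numbers are toy assumptions for illustration; the point is that identical initial conditions reproduce the identical outcome, and a small measurement error stays proportionally small, which is exactly the behavior classical science expected everywhere.

```python
# Minimal sketch (not from the episode): determinism in a simple system.
# The projectile and all numbers are toy values chosen for illustration.

def simulate_projectile(x0, v0, g=-9.81, dt=0.001, steps=2000):
    """Crude Euler integration of Newton's second law for 1-D motion."""
    x, v = x0, v0
    for _ in range(steps):
        x += v * dt
        v += g * dt
    return x

exact = simulate_projectile(x0=0.0, v0=20.0)
repeat = simulate_projectile(x0=0.0, v0=20.0)
slightly_off = simulate_projectile(x0=0.0, v0=20.001)  # tiny measurement error

print(exact == repeat)            # True: same inputs, same outcome, every time
print(abs(slightly_off - exact))  # ~0.002: the error stays small, proportional
```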
It held true for everything from the investigation into electricity to new chemical reactions and understanding those. And the scientific revolution laid the basis for the industrial revolution, and just the change that came out of it for the world. So it is understandable how science kind of was like, we've got it all figured out. Well, and like you said, even Galileo was smart enough to know there's uncertainty in these measurements, that precision is key. So they spent, what does the article say, much of the nineteenth and twentieth centuries just trying to build better instrumentation, to get smaller and smaller and more precise measurements. Right, that was basically the goal. Yeah, which was the right direction. That's exactly what they should have been doing. The problem is, like you said, Galileo knew there were going to be some flaws in measurement, that we just didn't have those great scientific instruments yet. It's called the uncertainty principle. It's accuracy, right. But the idea is that if you have good enough instruments, you can overcome that, and that the more you shrink the error in measuring the initial conditions, the more you're going to shrink the error in the outcome. It would be proportionate. Right. They were correct. The thing is, they were also aware of, but ignoring, in a lot of ways, some outstanding problems, specifically something called the n-body problem. You know what? I'm so excited about this, I need to take a break. I think that's a good idea. I need to go check out my n-body in the bathroom. Okay, and we'll be back.

All right, Chuck, we're back. So there are some issues, right, with determinism. There are some weird problems out there that are saying, hey, pay attention to me, because I'm not sure determinism works. And one of them is the n-body problem. Yeah.
How this came about was King Oscar II of Sweden and Norway. Yeah, I don't want to leave out Norway. Both. He said, you know what, let's offer a prize to anyone who can prove the stability of the solar system, something that had been stable for a long time before that. And a lot of the most brilliant minds on planet Earth got together and tried to do this with mathematical proofs, and no one could do it. And then a dude named Henri... you've got to help me there with that. Poincaré. Oh, say the whole thing. Henri Poincaré. Very nice. He was French, believe it or not, and he was a mathematician. And he said, you know what, I'm not going to look at this big picture of all the planets and the sun and all their orbits. You'd have to be a fool to try that. Sure. He said, I'm going to shrink this down. Like we talked about shrinking that initial value, that initial condition, he shrunk it down. He said, I'm going to look at just a couple of bodies orbiting one another with a common center of gravity, and I'm going to look at this. And this was called the n-body problem. Yeah, which was smart to do, because the more variables you factor into a nonlinear equation like that, the harder it's going to be. So he shrunk it down. So the n-body problem has to do with three or more celestial bodies orbiting one another, so Poincaré said, I'll just start with three. Smart. And what he found from doing his equations for this King Oscar the Sequel prize was that shrinking the error in measuring the initial conditions did not really shrink the error in the outcome, which flies in the face of determinism. What he found was that very, very minute differences in the initial conditions fed into a system produced wildly different outcomes after a fairly short time. Yeah, like, let me just round off the mass of this planet at, like, the eighth decimal point, and, you know, who cares?
Who cares at that point? Let me just round that one to a two. And that would throw everything off at a pretty high rate. And he said, wait a minute, I think this contest is impossible. Right. He said there is no way to prove the stability of the solar system, because he had just uncovered the idea that it's impossible for us to predict the rate of change among celestial bodies. Yeah, it's such a complex system. There are far too many variables. It's impossible to start with something so minute and get the outcome that you want from the equation. Well, not only that, and this is what really undermined determinism: he figured out that you would have to have an infinitely precise measurement. Even if you built a perfect machine that could take a measurement of, like, the movement of one celestial body around another, it's literally impossible to get an infinitely precise measurement, which means that we could never predict, beyond a certain degree, the movement of the celestial bodies. He was saying, like, no, you can't build a machine that gets measurements precise enough that we can overcome this. Determinism is wrong. You can't just say we have the understanding to predict everything. There's a lot of stuff out there that we're not able to predict. And he uncovered it trying to figure out this n-body problem. Yeah, and King Oscar the Sequel said, you win. Yeah, bring me another rack of lamb, and here's your prize. And he won by proving that it was impossible, which is pretty interesting. And it utterly and completely changed not just math, but our understanding of the universe, and our understanding of our understanding of the universe, which is even more earth-shaking.
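A minimal sketch, not from the episode, of the kind of thing Poincaré ran into, in Python. The masses, positions, velocities, and integrator here are made-up toy choices, not real astronomy; the takeaway is that a nudge in the eighth decimal of one starting coordinate typically leaves the two runs in wildly different places.

```python
# Minimal sketch (not from the episode): sensitivity in a three-body system.
# All masses, positions, and velocities are toy values, with G = 1.
import math

def three_body(positions, velocities, masses, dt=0.001, steps=100_000):
    """Crude Euler integration of planar Newtonian gravity (softened)."""
    pos = [list(p) for p in positions]
    vel = [list(v) for v in velocities]
    for _ in range(steps):
        acc = [[0.0, 0.0] for _ in pos]
        for i in range(3):
            for j in range(3):
                if i == j:
                    continue
                dx = pos[j][0] - pos[i][0]
                dy = pos[j][1] - pos[i][1]
                r3 = (dx * dx + dy * dy + 0.01) ** 1.5  # softened, avoids blow-ups
                acc[i][0] += masses[j] * dx / r3
                acc[i][1] += masses[j] * dy / r3
        for i in range(3):
            vel[i][0] += acc[i][0] * dt
            vel[i][1] += acc[i][1] * dt
            pos[i][0] += vel[i][0] * dt
            pos[i][1] += vel[i][1] * dt
    return pos

masses = [1.0, 1.0, 1.0]
vels = [[0.0, -0.5], [0.0, 0.5], [0.5, 0.0]]
a = three_body([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]], vels, masses)
b = three_body([[-1.0, 0.0], [1.0, 0.0], [1e-8, 0.5]], vels, masses)  # 8th-decimal nudge

for pa, pb in zip(a, b):
    print(math.dist(pa, pb))  # typically grows far beyond the 1e-8 nudge
```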
Yeah, he discovered dynamical instability, or chaos. And they didn't have supercomputers at the time, so it would be a little while, about seventy years, until, at MIT, we could actually feed these things into machines capable of plotting them out in a way that we could see, which was really incredible. So there was this dude, seventy years later, named Edward Lorenz. Yeah. Well, first of all, we should set the stage. This guy was a meteorologist and a scientist, right. Not that those are not the same thing. Right, he's a scientist who dabbled in meteorology. He was a mathematician, yeah, but he was really into meteorology, because there was a weird juxtaposition at the time, where we were sending people into outer space but we couldn't predict the weather. Yeah, and it was definitely a blot on the field of meteorology. People were like, do you guys know what you're doing? And meteorologists were like, you have no idea how hard this is. Like, yeah, we can predict it a couple of days out, but after that it's just totally unpredictable. It drives us mad. And it wasn't just their reputations that were at stake; people were losing their lives because of it. Right. Yeah, in 1962 there were two notorious storms, one on the East Coast and one on the West: the Ash Wednesday storm in the East and the Big Blow in the West. They killed a lot of people and cost hundreds of millions of dollars in damage. And people were like, you know, we need to be able to see these things coming a little more, because it's a problem. And meteorologists were like, why don't you do it, then? So they thought the key was these big supercomputers. Remember the supercomputers, when they came out, the big rooms full of hardware? It was amazing, and they were finally able to do these incredible calculations that we could never do before.
I know, they were able to crunch, like, sixty-four bytes a second. Yeah, we had the abacus and then the supercomputer. There was nothing in between. I looked up the computer that Lorenz was working with, the Whopper... the Royal McBee. What was the WOPR from? WarGames, was it? It was called the WOPR, W-O-P-R. I can't believe they called it that. So the guy just nicknamed it Joshua. No, Joshua was the software. Falken was the old man who designed all the stuff, and his son was Joshua. And that was the password. Oh, that was the password. Yeah, I guess I was too young to understand what a password was. Okay. There weren't even passwords at the time. You shouted it at the computer, and it was like, okay, access granted. Yeah. Still, that movie holds up. Does it really? Check it out. Yeah, it's still very, very fun. Young Ally Sheedy. Boy, I had a crush on her from that movie. She was great. Yeah. What else was she in recently? Wasn't she in something? Well, I mean, she kind of went away for a while and then had her big comeback with the indie movie High Art, but that was a while ago. Has she been in anything else recently? Sure, I think I saw her in something recently and didn't realize it was her. She looked familiar, and I was like, oh, that's Ally Sheedy. I don't know. All right, I could look it up, but I won't. It doesn't matter. Anyway, I still crushed on her. So the Royal McBee was not quite the WOPR. You could actually sit down at it. The Royal McBee... that name sounds like a hamburger too. It was by the Royal Typewriter Company, and they got into computers for a second. And this is the kind of computer that Lorenz was working with, and it was a huge deal. Like you were saying: abacus, supercomputer. But it was still pretty dumb as far as what we have today is concerned.
But it was enough that Lorenz and his ilk were like, finally, we can start running models and actually predict the weather. Yeah, he started doing just that. He did. So he started off with a computational model of twelve meteorological... Meteorological. I like how you... calculations. Which is very basic, because there are infinite meteorological calculations, probably. It sounds like you're about to say it wrong, and then you pull it out at the last second. Maybe. It's really impressive. But, so, that's very basic, but he wanted to start out, you know, with something attainable. So he narrowed it down to twelve conditions, basically twelve calculations that had, you know, temperature, wind speed, pressure, stuff like that, and started forecasting weather. And then he said, you know, it'd be great if you could see this, so I'm going to feed it into my wonder machine, the McWhopper Royal McBee, and I'm going to get a printout so you can visualize what this looks like. So things were going well, and he had this printout, and everyone was amazed, because these calculations never seemed to repeat themselves. He was making, like, word art. You remember that? That was the first thing anybody did on a computer. Oh yeah, it was to make word art, like a butterfly, that you would print out. Yeah. I never could do that. I couldn't either. Like, you have to be able to visualize things spatially. You have to have the right kind of brain for that. Right, or you have to be following a guide book. Have you ever seen Me and You and Everyone We Know? Yeah, I love that movie. That's a great movie. Those little kids in there, they were doing that. Oh yeah, yeah. Forever, back and forth. Poop. Well, I haven't seen that since it came out. It's been a while. Oh, you've got to see it again. Yeah, great movie. Ally's not in it. It's Miranda July, right? And she, like, wrote and directed it, right?
She did a great job. It's one of those rare movies where there's just the right amount of whimsy, because whimsy so easily overpowers everything else. Yeah, this is, like, the most perfectly balanced amount of whimsy you've ever seen in a movie. Yeah, when there's too much whimsy, I just... like, terrible. Garden State. I just want to punch it in the face. Terrible. Although I liked Garden State, but I haven't seen it since it came out. It hasn't aged well. It's just, when you look at it now, it's so cute and whimsical. Oh yeah, it's like, come on. Yeah, boy, we're getting to a lot of movies today. Oh yeah, well, we're stalling. We haven't even talked about The Butterfly Effect yet, which is coming, and I'm dreading it. That's why I'm stalling.

All right, so where were we? He was running his calculations, printing out his values so people could see them, and then he got a little lazy one day. There was this output he noticed was interesting, so he said, you know, I'm going to repeat this calculation and see it again, but I'm going to save time. I'm just going to kind of pick up in the middle, and I'm not going to input as many numbers, but I'm still using the same values; I'm just not going out to six decimal points. So the printout he had went to three decimal points. So he was working from the printout and didn't take into account that the computer accepted six decimal points, so he was just putting in three. Correct. And expecting that the outcome would be the same. Right, yes. But the outcome was way different. He went, whoa, whoa, what? Yeah, he's like, what's going on here? It was a big deal. I mean, someone would have come up with this eventually, probably, yeah, but he sort of accidentally came upon it. It's neat that this guy did this, because it changed his career. I think he went from an emphasis on meteorology to an emphasis on chaos math. To stud scientist, basically.
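A minimal sketch, not from the episode, of Lorenz's rounding accident. His real model had twelve variables; here the logistic map stands in, since it shows the same effect, and the two starting numbers are the ones usually quoted in the Lorenz anecdote. Feeding back a state truncated to three decimals instead of six is enough to make the runs disagree completely within a few dozen steps.

```python
# Minimal sketch (not from the episode): the three-versus-six-decimal rerun,
# replayed on the logistic map as a stand-in for Lorenz's 12-variable model.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)  # r = 4 puts the map in its chaotic regime

full = 0.506127   # the state to six decimal places
short = 0.506     # the same state typed back in from a 3-decimal printout

for step in range(1, 41):
    full, short = logistic(full), logistic(short)
    if step % 10 == 0:
        print(f"step {step}: full={full:.6f}  short={short:.6f}")
# The runs agree at first, then separate until they're completely unrelated.
```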
So, look, I mean, the guy's got an attractor named after him, you know what I mean? Yeah. Well, let's get to that. So Lorenz starts looking at this, and he's like, wait a minute, this is weird, this is worth investigating. And like, what was his name, Poincaré, he said, I need fewer variables. So I'm not going to try to predict weather with these twelve differential equations that you have to take into account. I'm just going to take one aspect of weather, called the rolling convection current, and see how I can write it down in formula form. So a rolling convection current, Chuck, is, you know how the wind is created: air at the surface is heated and starts to rise, and cooler air from higher above comes in to fill the vacuum that's left, and that creates a rolling, or vertically based, convection current. Okay. I would describe it as an oven, boiling water, a cup of coffee. Wherever there's a temperature differential based on a vertical alignment, you're going to have a rolling convection current. Okay, yeah. It sounds complex, but he just picked out one thing, basically, one condition, and this is the one he picked out. But had you seen my hands moving, listeners, you would be like, oh yeah, I know. So he's like, okay, I can figure this out. So he comes up with three formulas that kind of describe a rolling convection current, and he starts trying to figure out how to describe it. Right. And so, like I said, he got these three formulas, which were basically three variables that he calculated over time, and he plugged them in, and he found three variables that changed over time. And he found that after a certain point, when you graph these things out, and since there are three, you graph them out on a three-dimensional graph, so X, Y, and Z. Again, he wanted to just be able to visualize this, because it's easier for people to understand.
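Those three formulas survive today as the Lorenz equations, and they're compact enough to step forward in a few lines. A minimal sketch, not from the episode, using his classic published parameter values; the starting point is an arbitrary pick. Plot the saved points in 3-D and you get the trajectory the hosts are about to describe.

```python
# Minimal sketch (not from the episode): the Lorenz equations stepped in time.
# sigma = 10, rho = 28, beta = 8/3 are his classic published values.

def lorenz_step(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.01):
    """One Euler step of the Lorenz system."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

x, y, z = 1.0, 1.0, 1.0          # arbitrary starting point
points = []
for _ in range(10_000):
    x, y, z = lorenz_step(x, y, z)
    points.append((x, y, z))

print(points[-1])  # the state keeps wandering between two lobes, never settling
```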
He was a very visual guy. All of a sudden, it made this crazy graph, where the line, as it progressed forward through time, went all over the place. It went from one axis to another axis to the other axis, and it would spend some time over here, and then it would suddenly loop over to the other one, and it followed no rhyme or reason. It never retraced its path. And it was describing how a convection current changes over time. Right. And Lorenz, looking at this, was expecting these three things to equalize and eventually form a line, because that's what determinism says: things are going to fall into a certain amount of equilibrium and just even out over time. That is not what he found. No. And what he discovered was what Poincaré discovered, which was that some systems, even relatively simple systems, exhibit very complex, unpredictable behavior, which you could call chaos. Yeah. And when you say things were going all over, if you look at the graph, it's not just lines bouncing all over the place randomly. Like, there was an order to it, but the lines were not on top of one another. Let's say you draw a figure eight with your pencil, and then you continue drawing that figure eight: it's going to slip outside those curves every time, unless you're a robot. And that's what it ended up looking like. Yeah, it never retraced the same path twice, ever. It had a lot of really surprising properties, and at the time it just fell completely outside the understanding of science. Right. Yeah. Luckily, this happened to Lorenz, who was curious enough to be like, what is going on here? And again, he sat down and started to do the math and think about this, and especially how it applied to the weather. Right. And he came up with something very famous. Yes, the butterfly effect.
Yes. A, this thing kind of looked like butterfly wings a little bit, and B, when he went to present his findings, he basically had the notion, he's like, I'm going to wow these people in the crowd at this conference that I'm going to, and I'm going to say something like, you know, a seagull flaps its wings and it starts a small turbulence that can affect weather on the other side of the world. The small little thing will just grow and grow and snowball and affect things. And he had a colleague who was like, seagull wings? That's nice. How about this? And this is the title they ended up with: Predictability: Does the Flap of a Butterfly's Wings in Brazil Set Off a Tornado in Texas? And everyone was like, whoa, minds blown. Should we take a break? All right, we'll be right back.

All right. So the Lorenz attractor is that picture that he ended up with, the Lorenz attractor. And this biblical-pattern website that I found described attractors and strange attractors in a way that even dumb old me could understand. So, if I may. He says, all right, here's the cycle of chaos. He said... actually, I don't know who wrote this. Could have been a woman, could have been a small child, could have been someone of undetermined gender. I have no idea. So the gender-neutral narrator. They said, sorry. Think about a town that has, like, ten thousand people living in it. To make that town work, you've got to have, like, a gas station, a grocery store, a library, whatever you need to sustain that town. So all these things are built, everyone's happy, you have equilibrium, he said. So that's great. Then let's say someone comes and builds a factory on the outskirts of that town, and there are going to be ten thousand more people living there, and they don't go to church, maybe. So... did I say church? They needed a church? Okay, I was just assuming. This is what's called... No.
But you just have more people. So you need another gas station and another grocery store, let's say. So they build all these things, and then you reach equilibrium again. It's maintained because you build all these other systems up. That equilibrium is called an attractor. Okay. So then he said... it said... they said... he, capital H, the royal he, said: all right, now, let's say instead of that factory being built, you have those original ten thousand people, and three thousand of those people just up and leave one day. And the grocery store guy says, well, there are only seven thousand people here, and we need eight thousand people living here to make a profit, so I'm shutting down this grocery store. Then all of a sudden, you have demand for groceries. So things go on a little while, and someone comes in and says, hey, this town needs a grocery store. They build a grocery store, they can't sustain it, they shut down. Someone else comes along because of the demand. And it is this search for equilibrium, this dynamic... well, you reach equilibrium here and there as the stores open, periods of stability, periods of stability, and that dynamic equilibrium is called a strange attractor. So, an attractor is the state on which a system settles. A strange attractor is the trajectory on which it never settles down but keeps trying to reach equilibrium, with periods of stability. Does that make sense? That Bible-based explanation was dynamite. I understand it better than I did before, and I understood it okay before. That's great. Surely you can add... yeah, now you're going to add to it? No, that's it. No, I mean, like... yeah. An attractor is where, if you graph something, it eventually reaches equilibrium; that's a regular attractor. If it never reaches equilibrium, but is constantly trying to, and has periods of stability: strange attractor. I can't top that. All right, grocery store, small town, that was great. So a strange attractor was named after Lorenz, the Lorenz attractor. Big deal.
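A minimal sketch, not from the episode, of the two kinds of attractor using the logistic map; the growth-rate values are just illustrative picks. A low rate settles onto one repeating value, the plain attractor; a high rate stays bounded but never settles, the strange-attractor style of behavior the grocery-store town was illustrating.

```python
# Minimal sketch (not from the episode): attractor versus strange attractor
# on the logistic map. The r values are illustrative, not from the show.

def tail_values(r, x=0.5, warmup=1000, tail=5):
    """Run the map past its transient, then report the last few values."""
    for _ in range(warmup):
        x = r * x * (1.0 - x)
    out = []
    for _ in range(tail):
        x = r * x * (1.0 - x)
        out.append(round(x, 4))
    return out

print(tail_values(2.8))  # one value repeated: the system settled (attractor)
print(tail_values(3.9))  # bounded but never repeating (strange-attractor-like)
```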
Speaker 1: They weren't using the word chaos yet. No. But he published that paper about butterfly wings, right, the butterfly effect, and it coupled with his pictures, the picture of a strange attractor, which is almost, aside from fractals, the emblem or the logo for chaos theory, the Lorenz attractor. It got attention off the bat. It wasn't like Poincaré's findings, where he got neglected for seventy years. Almost immediately everybody was talking about this, because again, what Lorenz had uncovered, which is the same thing that Poincaré had uncovered, is that determinism is possibly based on an illusion: that the universe isn't stable, that the universe isn't predictable, and that what we are seeing as stable and predictable are these little windows of stability that are found in strange attractor graphs. That's what we think the order of the universe is, but that is actually the abnormal aspect of the universe, and instability and unpredictability, as far as we're concerned, is the actual state of affairs in nature. And I think "as far as we're concerned" is a really important point too, Chuck, because it doesn't mean that nature is unstable, chaotic. It means that our picture of what we understand as order doesn't jibe with how the universe actually functions. It's just our understanding of it, and we're just so anthropocentric that we see it as chaos and disorder and something to be feared, when really it's just complexity that we don't have the capability of predicting past a certain degree. Yeah, I think that makes me feel a little better, because when you read stuff like this, you start to feel like, well, the Earth could just throw us all off of its face at any moment because it starts spinning so fast that gravity becomes undone. And I know that's not right. By the way, I've always loved that kind of science that shows we don't know anything.
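The sensitivity that spooked Lorenz is easy to reproduce. A rough sketch, using the standard Lorenz equations with his classic parameters and a crude Euler integrator (the integrator and step sizes are our choices, not anything from his paper): start two runs a hundred-millionth apart and watch the gap.

```python
# Sensitive dependence in one loop: run the Lorenz system from two
# starting points that differ by one part in a hundred million, and
# track the distance between them over time.

def lorenz(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations, classic parameters."""
    return sigma * (y - x), x * (rho - z) - y, x * y - beta * z

def run(x, y, z, steps, dt=0.001):
    """Crude Euler integration: good enough for illustration."""
    for _ in range(steps):
        dx, dy, dz = lorenz(x, y, z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    return x, y, z

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # the "butterfly flap": a 0.00000001 nudge
for t in range(0, 50001, 10000):
    xa, ya, za = run(*a, steps=t)
    xb, yb, zb = run(*b, steps=t)
    gap = ((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2) ** 0.5
    print(f"t={t * 0.001:5.1f}  separation={gap:.2e}")
# Early on the separation stays microscopic; after a while the two
# runs are on completely different parts of the attractor, which is
# why a tiny rounding of the inputs ruins a long-range forecast.
```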
Speaker 1: Like David Hume, who, I know, I understand was a philosopher, but he was a philosopher-scientist. His whole jam was that cause and effect is an illusion, that it's just an assumption, like that if you drop a pencil it will always fall down, and that's an illusion. And this was before we really understood gravity. But he makes a good point. Imagine gravity when everyone's just floating around, going, this pencil's got me wacky. But the point was that a lot of stuff we take as law is actually based on assumptions made from observations over time, and that we're just making predictions; cause and effect is an illusion. I love that guy, and this definitely supports that idea, for sure. Yeah. Sorry, I'm excited about chaos theory. Believe it. Well, I mean, I like that I'm able to understand it in enough of a rudimentary way that I can talk about it at a dinner party. Well, thank your Bible website. Well, once you take the formulas out for people like us, we're like, okay, we can understand chaos. Then when somebody says, good, do a differential equation, we're just like, what's a differential equation? Right. All right, so earlier I said that the word chaos had not yet been used to describe all this junk, and that didn't happen until later on, about ten years, you know, but it was kind of at the same time this other stuff was going on with Lorenz. Yeah, late sixties, early seventies. There was a guy named Stephen Smale, a Fields Medal recipient, so you know, he's good at math, and he described something that we now know as the Smale horseshoe, and it goes a little something like this. All right, take a piece of dough, like bread dough, and you smash it out into a big flat rectangle. So you're looking at that thing and you're like, boy, I hope this makes some good bread. This is gonna be so good.
Speaker 1: So then you do a little rosemary on it. Yeah, maybe so, yeah. And then lick it before you bake it, so you know it's yours; no one else can have it. So you have that flat rectangle of dough, you roll it up into a tube, and then you smash that down kind of flat, and then you bend it to where it eventually looks like a horseshoe. So now you take that horseshoe, you take another rectangle of dough, you throw that horseshoe onto it, and then you do the same thing. The Smale horseshoe basically says you cannot predict where the two points of that horseshoe will end up. You can roll it a million times and they'll end up in a million different places. Totally random different places? Totally random. You never know. It's like a box of chocolates, you never know what you're gonna get. You have to say it. And that became known... you have to say it. Oh, what, imitate Forrest Gump? Nah, I can't do that. That's fine. He's not in my repertoire. That's fine. Although I did see that again, part of it, recently. Does it hold up? Well, I mean, take out forty minutes of it and it would have been a better movie, like all of that coincidence stuff, and he also did the smiley-face t-shirt, like it was just too much, like he really hammered it. That was the basis of the movie. I know, but see it again, and I guarantee you, like an hour and a half into it, you'll be like, I get it, you know. It was a good Tom Hanks movie that was overlooked. Road to Perdition? Yeah, that was a good one. Great. Sam Mendes. Oh man, that guy is awesome. Yeah. Oh, what is he gonna do? He might do something... he did the James Bond, he did Skyfall. Yeah, yeah, and also that last one, which wasn't so great. He's got a potential project coming up that he would be amazing for, and I don't remember what it was. Did you see Revolutionary Road? Yes.
Speaker 1: God, it was just like, yeah, you want to jump off a bridge like every five minutes during that movie. That was hardcore. He did that one too? Huh, yeah. And don't see that if you're, like, engaged to be married, or thinking about it, yeah, or if you're blue already. Yeah, just take a really good mood and be like, I'm sick of being in a good mood, sit down and watch Revolutionary Road. Watch Joe Versus the Volcano instead. Where was I? Smale horseshoe, is what that's called. And he was the first person to actually use the word chaos. Oh, he was? I think so. No, no, no, it was Yorke. Thom Yorke's dad. Yeah, you're right, he wasn't the first person; Yorke, correct. But Smale's horseshoe illustrates a really good point, Chuck. Is it Thom Yorke's dad? No. But they're both British, sure. Yorke... actually, one's Australian. No, they're British. So those two points, which started out right by each other, end up in two totally different places. That applies not just to bread dough, but also to things like water molecules that are right next to each other at some point and then months later are in two different oceans, even though you would assume that they would go through all the same motions and everything. But they don't. There are so many different variables with things like ocean currents that two water molecules that were once side by side end up in totally random different places. And that's part of chaos. It's basically chaos personified, or chaos molecule-ified (see the stretch-and-fold sketch below). So, we mentioned Yorke. Where I was going with that was, there was an Australian named Robert May, and he was a population biologist. So he was using math to model how animal populations would change over time, given certain starting conditions.
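Backing up to the dough for a second: the stretch-and-fold is easy to play with in code. This sketch uses the simpler baker's map as a stand-in for Smale's horseshoe (same idea: stretch the square, cut, and stack) and follows two crumbs that start a ten-millionth of a unit apart:

```python
# The stretch-and-fold idea, via the baker's map: stretch the unit
# square to double width, squash it to half height, cut it in the
# middle, and stack the halves. Iterate, and two neighboring crumbs
# of dough end up in unrelated places.

def bakers_map(x, y):
    """One stretch-cut-stack of the unit square."""
    if x < 0.5:
        return 2 * x, y / 2
    return 2 * x - 1, (y + 1) / 2

a = (0.3000000, 0.40)
b = (0.3000001, 0.40)   # a neighbor, one ten-millionth away
for n in range(1, 26):
    a = bakers_map(*a)
    b = bakers_map(*b)
    if n % 5 == 0:
        gap = abs(a[0] - b[0]) + abs(a[1] - b[1])
        print(f"fold {n:2d}: a=({a[0]:.4f}, {a[1]:.4f})  "
              f"b=({b[0]:.4f}, {b[1]:.4f})  gap={gap:.2e}")
# Each fold doubles the horizontal gap, so within ~25 folds the tiny
# initial difference has grown to the full size of the square, and
# where each crumb sits is effectively unpredictable.
```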
Speaker 1: So he started using these equations, difference equations, and he came up with a formula known as the logistic difference equation that basically enabled him to predict these animal populations pretty well. Yeah, and it was working pretty well for a while, but he noticed something really, really weird. Right. Yeah, he had this formula, the logistic difference equation is the name of it. Sure. Okay. So he had that formula, and he figured out that if you took r, which in this case was the reproductive rate of an animal population, and you pushed it past three, the number three, meaning the average animal in this population had three offspring in its lifetime, or in a season, whatever, if you pushed it past three, or equal to three actually, or more, all of a sudden the long-run population number would diverge into two values. Which is weird, because a population of animals can't be two different numbers, you know. Like, that herd of antelope is not... there's not thirty but also forty-five of them at the same time. That's called superposition, and that has to do with quantum states, not herds of antelope. That was kind of weird. And then he found that if you pushed it a little further, if you made the reproductive rate like three point five seven or something like that, I think it was a different number, but you just tweaked it a little bit, not even to four, we're talking like millionths of a degree, all of a sudden it would turn into four, so there'd be four different numbers for what the animal population was, and then it would turn into sixteen, and then all of a sudden, after a certain point, it would turn into chaos. The number would be everything at once, all over the place, just totally random numbers that it oscillated between. But in all that chaos, there would be periods of stability.
Speaker 1: Right, you push it a little further, and all of a sudden it would just go to two again. But beyond that it didn't go back to the original two numbers; it went to another two. So if you looked at it on a graph, it went: line, divided into two, divided into four, eight, sixteen, chaos; then two, four, eight, sixteen, chaos, all before you even got to the number four for the reproductive rate. And he was working with Mr. Yorke, because he was a little confounded. Yorke was a mathematician buddy of his, James Yorke from the University of Maryland, so they worked together on this. In the nineteen-seventies they co-authored a paper called Period Three Implies Chaos, and man, finally somebody said the word. I kept thinking it was all these other people. Yeah, and this paper is where they first debuted the name chaos. They based it... Thom Yorke's dad based it on Edward Lorenz's paper. He was like, you know what, I have a feeling this has something to do with the Lorenz attractor. So that provided chaos to the world. And it was basically the third time a scientist had said, we don't understand the universe like we think we do, and determinism is based on an illusion of order in a really chaotic universe. And this established chaos. It took off like a rocket. In the eighties and the nineties, as you know from Jurassic Park, chaos was everything. Everybody's like, chaos, this is totally awesome, it's the new frontier of science. And then it just went away. And a lot of people said, well, it was a little overhyped. But I think more than anything, and I think this is kind of the current understanding of chaos, because it didn't actually go away, it became a deeper and deeper field, as you'll see, people mistook what chaos meant. It wasn't a new type of science. It was a new understanding of the universe.
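The logistic difference equation May was iterating is short enough to try yourself: x_next = r * x * (1 - x), with x the population as a fraction of its maximum and r the reproductive rate. A minimal sketch of the cascade described above; the sample r values are our picks:

```python
# May's logistic difference equation, run at a few reproductive
# rates. Iterate past the transient, then look at the values the
# population keeps visiting: one value, then two, then four, then
# apparent randomness.

def long_run_values(r, x=0.5, warmup=1000, keep=8):
    """Burn off the transient, then collect the visited values."""
    for _ in range(warmup):
        x = r * x * (1 - x)
    seen = []
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.append(round(x, 4))
    return sorted(set(seen))

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r={r}: settles on {long_run_values(r)}")
# r=2.8 -> a single fixed value (a plain attractor)
# r=3.2 -> flips between two values
# r=3.5 -> a four-value cycle
# r=3.9 -> no repeating set: chaos, with the odd stable window
#          (try r=3.83 and a three-cycle appears: "period three")
```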
Speaker 1: It was saying, like, yes, you can still use Newtonian physics, don't throw everything out the window. You can still try and predict weather and still try and build more accurate instruments and get, you know, decent results. But with complex systems, you can't do it with absolute perfection. The ultimate goal of determinism is false. It can never be done, because we can't have an infinitely precise measurement for every variable, or any variable; therefore, we can't predict these outcomes. Right. So you would expect science to be like, what's the point? What's the point of anything? Well, some chaos people have said, no, this is great, this is good, we'll take this. We'll take the universe as it is, rather than trying to force it into our pretty little equations and saying, like, if the ocean temperature is this at this time of year, and the fish population is this at that time, then this is how many offspring this fish population is going to have. Instead, say, okay, here is the fish population, here is the ocean temperature, here are all these other variables, let's feed it into a model and see what happens. Not "this is going to happen," but "what happens instead?" And this is kind of the understanding of chaos theory now. It's taking raw data, as much data as you can possibly get your hands on, as precise data as you can possibly get your hands on, and just feeding it into a model and seeing what patterns emerge. Rather than making assumptions, it's saying, what's the outcome? What comes out of this model? Yeah. And that's why, like, when you see some things, like, you know, fifty years ago they predicted this animal would be extinct and it's not. Well, it's because the variables were too complex for what they tried to predict. And that's why, if you look at a ten-day forecast, you, sir, are a fool. All right. Well, ten days from now it says it's going to rain in the afternoon. Come on.
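That "feed it into a model and see what happens" attitude can be sketched too. Below is an ensemble run in miniature: the same toy model started from two hundred slightly different readings, since the measurement was never infinitely precise anyway. The logistic map stands in for a real weather or fish-population model, and the noise size and step counts are made up:

```python
# An ensemble forecast in miniature: instead of one run treated as
# gospel, run the same toy model from many slightly perturbed
# starting states and report the spread of outcomes at each lead.

import random

def model(x, r=3.9, steps=1):
    """Toy stand-in for a real simulation (chaotic logistic map)."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

measured = 0.3722                      # the "observation" (made up)
ensemble = [measured + random.uniform(-1e-4, 1e-4) for _ in range(200)]

for lead in (1, 5, 10, 20, 40):
    outcomes = [model(x0, steps=lead) for x0 in ensemble]
    print(f"{lead:2d} steps out: outcomes span "
          f"{min(outcomes):.3f} .. {max(outcomes):.3f}")
# Short leads: the runs agree, so the forecast means something.
# Long leads: the span covers nearly everything the model can do,
# which is the honest answer to "what's the weather in ten days?"
```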
Speaker 1: But if you took enough variables for weather for, like, a city, and fed them into a model of the weather for that city, you could find a time when it was similar to what it is now, and you could conceivably make some assumptions based on that. You could say, well, actually, we can predict a little further out than we think. But it's based on this theory, this understanding of chaos, of unpredictability, of not forcing nature into our formulas, but putting data into a model and seeing what comes out of it. Yeah. And then at the end of that you learn, like, when that animal is not extinct like you thought it would be, you go back and look at the original thing, and you have a more accurate picture of how the data could have been off slightly on this one value, and then you have more buffalo than you think. Yeah, sure, you got buffaloed by chaos. And we're not even getting into fractals; it's a whole other thing. And we did a whole other podcast in June about fractals and Mandel... Bena? Mandelbret? Mendelbret? Mandelbrot. Yeah. Go listen to that one and hear me clinging to the edge of a cliff. Man, we should end this. But first, I want to say there is a really interesting article, it's pretty understandable, on Quanta Magazine about a guy named George, and he is a chaos theory dude who's got a whole lab and is applying it to real life. So it's a really good picture of chaos theory in action. Go check it out. Okay. If you want to know more about chaos theory, I hope your brain is not broken. Yeah, go take some LSD and look at fractals. You can type those words into HowStuffWorks in the search bar, any of those: fractals, LSD, chaos. It'll bring up some good stuff.
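One more sketch before the mail: that "find a time when the weather was similar to now" idea is nearest-neighbor forecasting by analogue. Everything here, the archive, today's reading, the model, is invented for illustration:

```python
# Forecasting by analogue: search a historical archive for the state
# closest to today's, and use what happened next as the forecast.
# The archive is generated by the chaotic logistic map as a stand-in
# for a real weather record.

def step(x, r=3.9):
    return r * x * (1 - x)

# Build a fake 10,000-day archive of daily conditions.
archive = []
x = 0.42
for _ in range(10_000):
    archive.append(x)
    x = step(x)

today = 0.6137          # today's (hypothetical) measurement
# Most similar past day that still has a recorded "tomorrow".
best = min(range(len(archive) - 1), key=lambda i: abs(archive[i] - today))
print(f"closest analogue: day {best}, condition {archive[best]:.4f}")
print(f"analogue forecast for tomorrow: {step(archive[best]):.4f}")
print(f"what actually happens tomorrow: {step(today):.4f}")
# With a dense archive the analogue's next day lands close to the
# true next day, for one step. Chain it ten steps out and the same
# sensitivity that wrecked Lorenz's rerun wrecks this too.
```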
Speaker 1: And since I said good stuff, it's time for Listener Mail. Now, I'm gonna call this Rare Shout-Out. We get requests all the time. I bet I know which one it is. Really? The dude and his girlfriend? Yeah. So far, so good. Hey, guys, just want to say I think you're doing a wonderful job with the show to this date. My first time listening was during my first deployment, when I listened to your list of famous and influential films, and I was hooked after that. Since I came back stateside, I have spent many hours driving to and fro, to see my girlfriend, to my barracks, and I can happily say that they've been made all the more enjoyable by listening to you guys. Even my girlfriend Rachel has warmed up to you dudes, which was not a pleasant... I'm sorry, which was a pleasant shock to me, as she has told me repeatedly that she cannot listen to audiobooks because, quote, hearing people talk on the radio gives me a headache, end quote. Anyway, I hope you guys continue to make awesome podcasts as I'm headed out on my next deployment. And if you could give a shout-out to Rachel, I'm sure it would make her feel a little better that I got the pleasant people on the podcast to reaffirm how much I love her. That is John. Rachel, hang in there. John, be safe, and thanks for listening. Yeah, man, thank you. That is a great email, I love that one. Glad we don't give you a headache, Rachel. Yeah, wait till she listens to this one, and she's like, okay... oh yeah, everybody's gonna get a headache from this one. Like, I came to hate the sound of my own voice from this one. You'll be all right. If you want to get in touch with us, you can hang out with us on Twitter at SYSK Podcast; same thing goes for Instagram. You can hang out with us on Facebook dot com slash Stuff You Should Know. You can send us an email to stuffpodcast at how stuff works dot com, and as always, join us at home on the web, Stuff You Should Know dot com. For more on this and thousands of other topics, visit how stuff works dot com.