Hey, March is TryPod month, my friend, and you know what that means. Yes, that means it's time to let people know about your favorite podcasts, just to share the sheer joy of podcast listening. That's right, it's T-R-Y pod. It's still a nascent industry; a lot of people don't know what podcasts are, and it helps everybody out if you would go out and just say, hey, family member who I see at Thanksgiving once a year, you should try out this thing called a podcast. Here's what they are, here's a cool show you should try, and here's how to get it. Yeah, and it doesn't have to be our show, just any podcast you like in general that you think someone else would like. Just share it. Yeah, so get on board the TryPod train.

Welcome to Stuff You Should Know, from HowStuffWorks.com.

Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant, and there is, well, Jerry just disappeared, Chuck. Did she? She did. Oh, there, she's back. David Copperfield is in here with us as well. Today he made the Statue of Liberty disappear, and now Jerry. Jerry as drawn by M.C. Escher. Oh, that's nice. How do you feel about optical illusions? I feel happy about optical illusions. I'm not asking Josh from the third grade. But I feel sad about articles on optical illusions in general. It's a really difficult thing to write about, and as we're about to demonstrate, it's an even more difficult thing to talk about. It's just, I think, the idea that every article has to inherently describe an optical illusion and then basically follow that description up with "and scientists don't really know what's going on, here's a couple of guesses that will be fully discredited in twenty years." It's just dissatisfying. Yeah, I mean, because we're the kind of dudes who like concrete answers, or at least really solid hypotheses. Some of these are flimsy to me.
Yeah. So we would encourage folks, if you are listening at home or at work, because you can blow off work, let's be honest, to look some of these up. We'll describe them as best we can, and most of them you've probably seen before, because, as you will learn, many, many optical illusions were drawn and conceived many years ago and have just been sort of played upon over the years in different ways. Right. Yeah, the nineteenth century was the foundation of the classic optical illusions, which not coincidentally coincided with the foundation of psychology and brain research, and optical illusions were created to kind of test or explore this stuff. But yeah, most of the stuff today is just variations on those themes. Yeah. So, like I was saying, if you're able to just google this junk as we say it, you'll go, oh, that thing. And Chuck, actually, there's a website called michaelbach.de, which is Deutschland, Germany, in English, but it's M-I-C-H-A-E-L-B-A-C-H dot D-E, and this guy has links to every optical illusion you could possibly imagine. So that'd be a good place to go: just sit there and click around his site while we're talking about these things. Yeah. And what I found is that I get a bit of optical illusion fatigue when I look at too many of these things in a row. Oh, well, that should be studied. Well, I think, I mean, we know so little about optical illusions that that's kind of groundbreaking. Well, I don't mean fatigue as in, like, scientifically. I just mean I'm tired of looking at this junk. Oh, I see what you mean. Yeah, it just bores me after a bit. Plus a lot of them require ugly color combinations, or unpleasant color combinations, so I think that probably contributes to it too.
And we should do a... we don't talk a lot about Escher in this one, but we should. He deserves his own show. Sure. You know, Escher and Giger, maybe we'll do a combo show with those two. Oh yeah, oh yeah, man, that guy's brain is beautiful. Yeah. There are a lot of cultural icon biographies floating out there, Mr. Rogers and Dr. Seuss. I know we've talked about those, so maybe we'll go on a kick. Okay, I'm ready for some kicking.

All right, so let's go back a little bit to the history of thinking about, or studying, optical illusions. Right. As with most things in the West, the basis of optical illusions, the first mention of them in the literature, comes from the Greeks, and Aristotle in particular. Yeah, he probably munched on some weird root and stared at a waterfall for a little while, and he said, hey, dudes, if you stare at that waterfall long enough, man, and then you quickly look at that rock, it looks like the rock is moving. Right, and then the rock's like, "I'm not moving, Aristotle." But that actually has a name, correct? Yeah, it's called the waterfall illusion, appropriately, or, what's the other word for it, the motion aftereffect. Yeah, that's what I was looking for. This is, like, if this is the true explanation for it, then I'm just disappointed with our brains. The explanation is that when we're staring at the waterfall, the neurons tracking the movement of the water become tired out, exhausted, yeah, overwhelmed. So when we stop looking at it and they take a break, all the other ones that weren't at work are suddenly working overtime and making things move that aren't actually moving. Right. That's a stupid explanation. I don't know how much I buy that. I mean, it makes sense, but I think it's stupid. It's boring, you know, just worn-out neurons. Yeah, I'm tired, I'm going to sit down over here. Yeah.
And then, if we go forward a bit to the nineteenth century, like you were talking about, that was when people got really interested in studying these things and what was going on in the brain, because it coincided with studying perception and how our eyes work, and how our eyes work in relation to our brain. Right. And I guess what some of the earliest optical illusions kind of tested was this longstanding idea that our perception of vision, our visual experience, was based in how the eyes interpreted objects. What these early optical illusions started to prove was, no, it's actually the brain that's getting messed up here. And now we're starting to get into, at this point, some theories that make sense to me, that I think are cool. What this early study started to reveal was that the brain is extremely lazy and it likes to take shortcuts. Right. Yeah, I thought this was actually pretty interesting. Are you talking about the lag time? The lag time, but also, yeah, there's plenty of other stuff. The lag time seems to me to be one specific slice of the general tricks of the trade that the brain uses to cut corners. Yeah, and the lag time is basically, you know, everything seems to happen instantaneously: when you look at something, your eyeballs pick it up, the neurons start firing, and the brain tells you, you know, that's a coffee cup. But there's just the slightest little lag in the time it takes for that to happen. And one of the theories with optical illusions is that the brain is trying to predict, and that slight, slight, slight... you know, I'm not good with small
units of time. Is it a nanosecond? Yeah, nanosecond, sure, but I think we're talking a tenth of a second. Okay. So the brain basically tries to predict what should come next based on what we're used to seeing in real life. Is that a good way to say it? Yeah. And the reason it would do this is because, in a tenth of a second, something can change, like a tiger can suddenly appear. So the brain is constantly looking for clues in the environment to predict what a tenth of a second in the future is going to be like. Right. I think things move slowly enough for us humans that it usually works pretty well. But what this researcher Mark Changizi says is that some of these optical illusions are actually reliable ways to trick the brain into making the wrong decision about what the future is going to hold. One of the ones that classically falls into this example is, what's the one that he talks about? It's the one where you've got two parallel lines running horizontally, separated by a little bit of space, and then in the background there are radial lines all going toward a point, the vanishing point on the horizon. Right. Yes. I can't remember the name of this one, but the point that Changizi makes is that radial lines, lines that radiate from a center point, our brains use as a shortcut indicator of motion. The Hering illusion. Thank you. So these radial lines that we see tell our brain, oh, we're moving, and we're moving toward this vanishing point in the distance. So these horizontal lines that are in the foreground actually appear to be bent in the center, bent outward from one another.
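If you'd rather generate the figure than google it, here is a minimal sketch of a Hering-style figure in Python with matplotlib. The spacing, line counts, and colors are illustrative guesses, not anything specified on the show; the only thing that matters is that the two horizontal lines are drawn perfectly straight.

```python
# Illustrative sketch (not from the show): a Hering-style figure in matplotlib.
# The two horizontal lines are straight and parallel, but against the radial
# background they tend to look bowed outward.
import numpy as np
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(6, 6))

# Radial lines fanning out from a central "vanishing point"
for angle in np.linspace(0, np.pi, 36):
    ax.plot([np.cos(angle), -np.cos(angle)],
            [np.sin(angle), -np.sin(angle)],
            color="gray", linewidth=0.8)

# Two genuinely straight, parallel horizontal lines
ax.axhline(0.25, color="red", linewidth=2)
ax.axhline(-0.25, color="red", linewidth=2)

ax.set_xlim(-1, 1)
ax.set_ylim(-1, 1)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```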
So what Changizi is saying is that the brain is predicting, since it thinks we're moving forward toward this point and toward these lines, that as we get closer they have to bend, basically, to allow us to enter. But the thing is, they're not moving, because it's a static image. It's the brain being tricked into thinking we're moving forward, changing our perspective unnecessarily. Yeah, because the brain is used to the way we move forward in real life. IRL, right, for you kids out there. And so, you know, a lot of this seemed like the brain almost kind of negotiating with itself. Yeah, you know. Yes. So that lag time one makes sense, right. Another one that makes sense to me, as far as why the brain makes shortcuts, is that the physical world is in at least three dimensions that we interact with it in, right, but our eyes are giving us two-dimensional representations that the brain then has to reconstruct into three dimensions, and it's learned to take all sorts of neat little clues to put together a pretty good prediction of what it's looking at. Yeah, and it can also flip-flop between two different views, like the, is it the Necker cube? I love it. N-E-C-K-E-R. And it's sort of that classic cube that you learned to draw, the one that's slightly more advanced than the basic cube that you first learned to draw. It's the second cube that you learned to draw, right on your... what were those things that you put on your books? Oh, yeah, like homemade book covers. Right, exactly. Yeah, basically a brown grocery sack is what I used. Yeah, same here. That's because we were poor. Well, plus those things held up. Oh, sure, yeah. So you look at the Necker cube, and
the fun thing about the Necker cube is you look at it and your brain is able to flip back and forth between the cube basically having two different positions. Is that the best way to say it? I keep saying that, but you know, again, these things are kind of hard to describe. Well, yeah, it's kind of like the cube is transparent and you can see all the corners of it. Yeah, so your brain is saying, okay, is that corner closest to me or furthest away from me? It changes perspective. Yeah. And so, thanks to the wonder machine, we can put people in these things and see the neurons responsible for the different perspectives flipping back and forth, depending on how we're looking at it. Yeah, exactly. Pretty helpful at this point, because you had the nineteenth century, where they started to suss out the idea that the brain was responsible for this, that it was the brain messing up. And then not a lot happened in between then and the two thousands, when fMRI came into widespread use. And now we're starting to see that a lot of these early theories are actually correct, because we can see the neurons responsible for them. All right, well, let's take a little break here, and then we're going to come back and talk about the Hermann illusion and what the MRI said about that one.

Okay, so the Hermann, I'm not sure how to pronounce that, H-E-R-M-A-N-N. The Hermann grid, conceived by Ludimar Hermann. You nailed that first name. Oh yeah. Well, it's one of those classic illusions that we've all seen, and it's really simple. It's just a black and white grid of squares, and that's the one where, if you're just looking at it, it looks like there are these little gray circles, little gray dots, in between where these things intersect. There's really nothing there, though, and of course, when you focus on one, it goes away.
And the MRI showed that when you're looking at an illusion like this, and others like it, the neurons are competing with one another to see the light and the dark, and basically one set of neurons wins out over the other and then influences the message, for what you end up perceiving. Right, fairly interesting, I think. It is, it is. And this one kind of stands on its own, in its own class, in that it's not really the brain that's being duped; it's because of the physiology of the eyes and the light receptors in the eyes. Right. So they're arranged so that they sense distinction, like contrast between light and dark, right, and if they're sensing both, they create this blob. There's spillover, where some receptors in a single cell are getting dark and some are getting light, and that can create these blobs at the intersections. But then when you focus your attention on the white part, the intersection between the black squares, you're using your foveal receptors, which have far less inhibition or spillover, so the gray blob disappears and what you see is white. I read probably four different explanations of it before it started to sink in. Yeah, it's straightforward, but it's tough to explain, in other words. Yeah, I totally agree. And one of the reasons we know that these neurons are sort of individually picking things up is because of these two dudes, David Hubel and Torsten Wiesel, great name, you knew I was going to say that. In 1981 they won the Nobel Prize in Physiology or Medicine because they found out that there's actually a process in how the brain picks up what the eye sees. And they found that each neuron is actually responsible for one little part, one little detail of that pattern in the retinal image. And so that explains how these neurons can duke it out, basically, over what is seen. Yeah.
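For anyone playing along at a keyboard, here is a similarly minimal sketch of a Hermann grid in Python with matplotlib. The grid size and gap width are arbitrary illustrative choices rather than Hermann's original proportions.

```python
# Illustrative sketch (not from the show): a Hermann grid rendered with
# matplotlib. Black squares on a white background; ghostly gray blobs tend
# to appear at the white intersections you are not looking at directly.
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

n, size, gap = 5, 1.0, 0.25   # squares per side, square size, white gap width

fig, ax = plt.subplots(figsize=(6, 6))
for row in range(n):
    for col in range(n):
        x = col * (size + gap)
        y = row * (size + gap)
        ax.add_patch(Rectangle((x, y), size, size, color="black"))

ax.set_xlim(-gap, n * (size + gap))
ax.set_ylim(-gap, n * (size + gap))
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```

Glance around the figure rather than at any single intersection and the gray blobs should show up in your peripheral vision, then vanish wherever you look directly.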
So it's not just, like, neurons competing to see light and dark. From what I understand, the understanding of our brain and vision is that an individual neuron is responsible for, say, a circle. It sees circles, and it's transmitting any circular information to the brain. Another neuron is responsible for seeing dark, another is responsible for seeing light, another is responsible for seeing red, another is responsible for seeing texture, and all of this sensory information, this visual information, is coming to the brain all at once, and the various brain regions responsible for vision are putting it together the best way they can. You see a red ball. And there are a lot of cues that the brain uses that just fascinate me, for basically what's called monocular vision. Right. So when you are using both of your eyes, especially when something's up close, you're getting two separate pictures of the same thing, and the differences between those pictures the brain can use to easily translate into three dimensions, right, to handle things like perspective and stuff like that. But when something's further away, the brain has to use other little tricks of the trade. Right. So you've got things like interposition. That's a pretty straight-up one, where if one object is in front of another object, your brain says, well, the object that's behind is further away. Yeah. Is that what explains, like, forced perspective? Yes, in art, yes, right. I do like forced perspective stuff, I do. It's kind of cool, it's neat stuff. I guess that's probably part of the op art movement, right? Uh, yeah, that was like the sixties and seventies. Yeah, it seems like, yeah. And then, you know, it kind of coincided with drugs, right, not surprisingly. And then there's another one that I hadn't heard of, called atmospheric perspective. Had you heard of that one? I had not. So, atmospheric perspective is basically the dust particles and water vapor in the air.
The further something is away, the more of an effect those things have on the detail you see of it. So your brain says, well, that's a little blurry, that's a faraway object. And then there are plenty of other ones, but the gold standard is object size. Right. That's where you know roughly the size of an object, and you can use it to compare, to see whether it's far away or close, depending on whether it appears small or large. Or, if you don't know the size of an object but you know two objects are identical and one is smaller than the other, well, then you know the smaller one is further away. So the brain is constantly using all of these little cues and tricks to put together a conception of what it's seeing at any given point in time. And then what optical illusions are, again, is these things you can produce to reliably trick the brain into making those wrong decisions, so that it shows its hand. It reveals how the brain functions, the shortcuts it takes and the tricks that it uses. Right. Like, brain, you think you're so smart, you're really dumb, look at this. And the brain says, oh, stop looking at those things, look at normal things. I kind of like the apparent motion ones, although I can't look at a lot of them. Those are the ones where something is drawn in such a way that it looks like it's moving when it's not, right. The very famous snake illusion is a great example. And you know, this is another one of those theories that to me is a little weak, but one of the theories is that there are these almost unnoticeable rapid eye movements that we make. How do you pronounce that? S-A-C-C-A-D-E-S. Saccades? Suh-cahds? I think you could probably get away with either one. All right, well, that's what they're called. And it's like Pruitt Taylor Vince syndrome. You remember him?
Yeah, he's a great actor. Yeah, he is. So those little movements usually get smoothed out by the brain, so we get, you know, a static picture. But what it's causing in this case is perceiving motion where there is no motion. And then the other theory on this one, for apparent motion illusions, is that there's just so much information going on that, you know, there's just confusion. Right. I saw one that actually combined the two, that basically said the saccades are creating the illusion of motion, but what's really going on is that the brain is being hit with all this visual information that just totally doesn't make sense, that would never happen in nature except maybe in motion, and that each time your eye makes one of these tiny movements, it refreshes this overwhelming overload of information onto the brain, which creates the sensation of movement. Oh yeah, pretty cool. Well, one of the cool aspects of all of this to me is the fact that once you've seen the illusion, and the trick to it, you can't undo that, right. So the brain is like, aha, you know, I got this one. Like the famous one, the old lady or the young woman, the black and white, it's a classic illusion. You can stare at it and be like, I just see the young lady, or I just see the old lady. But once you've seen both, then your brain, like I said, it says, ah, and it files that away as prior knowledge in a little folder in the brain, and you can't undo that. So once you've seen it, and you've seen the trick, you can always look at it and kind of make that flip in your mind. Right, exactly. And it's the same thing too with the contourless figures. Where, is it a wine goblet, or is it two people's faces facing one another, kind of thing? Right? Oh yeah, yeah, yeah, you know, the negative space.
Yeah. And apparently the trick to those is you focus on the black or the white, and you see whichever one appears to be in the foreground, because what your brain is doing is saying, I need a foreground and I need a background, and then I've got something to work with. And depending on which one it's looking at, it decides this is the foreground or this is the background. So it's either a wine goblet in the foreground or it's two people's faces in the foreground. You know, I wonder if they know anything about, because I didn't see anything in the research, whether this is like a brain exercise and helps you out, like, you know, playing sudoku or doing word puzzles, or if the brain is like, stop looking at these, you know, I don't like this, I can't take any more. Or, like, literally maybe, if it causes stress on the brain by taxing it in a way that it is not accustomed to, or, say, doesn't like. Obviously the brain doesn't have a, you know, it's not a little person, but you know what I'm saying. Yeah, no, I know what you mean. But the brain, even if it's not a little person, could still not like things. Right. So let's take another break, and then I want to tell everybody what my favorite optical illusion of all time is.

All right, Chuck. Yeah, I'm ready. Well, there are two, one I like slightly less than the other. Okay, so start with the second-place one. Okay, I knew you were going to say that. I think that's a great way to do it too. So you've got, I don't know the name of it, I'm sure there is a name, but actually I think it's a contourless figure as well.
You take four circles, and you cut a pie slice out of all of them, like a Pac-Man, and you orient those pie slices so that each one forms what appears to be the corner of a coherent square. And you look at it, and you're like, well, there's a square right there that's overlaying four circles. But if you stop and think about it, there's no line whatsoever that makes that square. It's your brain exclusively filling in some suggestible information to say, well, there's a square over a field of four circles. It's pretty neat to me. I like that one.
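This figure is usually called a Kanizsa square, and it's easy to reproduce. Below is a minimal Python/matplotlib sketch; the disc radius and spacing are illustrative assumptions, and the bright square you end up seeing has no drawn edges at all.

```python
# Illustrative sketch (not from the show): a Kanizsa-style illusory square.
# Four "Pac-Man" discs (circles with a quarter wedge removed) are oriented so
# the missing wedges line up, and the brain fills in a square with no edges.
import matplotlib.pyplot as plt
from matplotlib.patches import Wedge

fig, ax = plt.subplots(figsize=(6, 6))

# (center, start angle, end angle) of the *drawn* part of each disc;
# the missing 90-degree wedge in each disc faces the middle of the figure.
corners = [
    ((0, 0), 90, 360),    # bottom-left: gap opens up and to the right
    ((2, 0), 180, 90),    # bottom-right: gap opens up and to the left
    ((2, 2), 270, 180),   # top-right: gap opens down and to the left
    ((0, 2), 0, 270),     # top-left: gap opens down and to the right
]
for center, theta1, theta2 in corners:
    ax.add_patch(Wedge(center, 0.6, theta1, theta2, color="black"))

ax.set_xlim(-1, 3)
ax.set_ylim(-1, 3)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```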
So what's number one? Yeah, it's called the Adelson checkerboard. Okay. Surely you've seen this one, right? So I'm looking it up as we speak. It's from the nineties. There was an MIT vision researcher named Edward Adelson, and he created this checkerboard where, on the checkerboard, there are dark and light squares like a normal checkerboard, and then there's, I think, a cylinder on the checkerboard, and it's casting a shadow. And so he says, look at this white square and then look at this black square: which is lighter, which is darker? And you say, well, that's easy, the darker square, figure B, say, is obviously darker than figure A. And he says, that's wrong, that's absolutely wrong, figure A and figure B are exactly the same color and shade. I'm looking at it. I've seen that one for sure. The whole thing really works because it takes advantage of two different tricks that you can play on the brain, or two different shortcuts the brain makes. Right. One is that the cylinder is casting a shadow, which appears to be putting figure A into a shadow. So your brain automatically makes the assumption that if something is in a shadow, it would normally be lighter, which is, in this case, an incorrect assumption; it's actually the same shade as the other one. And then the other assumption it's making is that, because that square is surrounded by squares of a darker color, and it's in a shadow, it seems to contrast with them, where the other figure, figure B, is a dark square surrounded by light, so it seems to be darker because of the context of the squares it's surrounded by. So your brain is using two different things: the presence of a shadow, and then the context, where if something is surrounded by lighter stuff, it seems darker, and if something is surrounded by darker stuff, it seems lighter. And that's just not always the case, obviously, because Edward Adelson proved it's not.

So, you want to know my favorite? The classic Ebbinghaus illusion, E-B-B-I-N-G-H-A-U-S. This one is sort of similar, but it's not so much about color. It uses adjacent objects, and a lot of these do too; they use other things surrounding something to trick your brain. And in this case, it's the classic one, go look it up. You have two orange dots. The one on the left, let's say, is surrounded by six larger gray dots, and the other one, on the right, is surrounded by eight smaller dots. It's very simple, that's why I love it. And the orange dots are the same size, but they look like completely different sizes. It's just so simple. And I think this is one of the ones that... also, they have this contest every year, I think it's been going on for at least ten or twelve years, right, for new illusions. And like we said earlier, a lot of these new illusions are still just sort of riffs on the classics. But the one that won a couple of years ago was a new version of the Ebbinghaus illusion where, it's actually a video that you have to play, so it moves.
The outer dots, it looks like it pulsates, and, well, it is pulsating: they get bigger and smaller, and the orange dot stays the same, but it looks like it's shrinking and expanding. Right. So it's kind of cool. It's just a play on the Ebbinghaus illusion. But that's what we were saying earlier, too. It's almost like they invented all of them in the nineteenth century, and now we're just able to perfect them a little more. Yeah, pretty cool.
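Here's a minimal Python/matplotlib sketch of the static Ebbinghaus figure described above. The exact sizes and distances are illustrative guesses, but the two orange discs really are drawn with an identical radius.

```python
# Illustrative sketch (not from the show): the classic Ebbinghaus figure.
# Both orange discs have exactly the same radius, but the one ringed by large
# gray discs tends to look smaller than the one ringed by small gray discs.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Circle

def ebbinghaus_group(ax, center, n_ring, ring_radius, ring_dist):
    """One orange target disc surrounded by n_ring gray discs."""
    cx, cy = center
    ax.add_patch(Circle((cx, cy), 0.5, color="orange"))  # same-size target in both groups
    for angle in np.linspace(0, 2 * np.pi, n_ring, endpoint=False):
        x = cx + ring_dist * np.cos(angle)
        y = cy + ring_dist * np.sin(angle)
        ax.add_patch(Circle((x, y), ring_radius, color="gray"))

fig, ax = plt.subplots(figsize=(8, 4))
ebbinghaus_group(ax, (-3, 0), n_ring=6, ring_radius=0.9, ring_dist=1.8)  # large surround
ebbinghaus_group(ax, (3, 0), n_ring=8, ring_radius=0.3, ring_dist=1.0)   # small surround

ax.set_xlim(-6, 6)
ax.set_ylim(-3, 3)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```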
Another thing I thought was really neat was that there's this biological basis that is the same for everyone on planet Earth, obviously, but they did find that, across different cultures, people didn't take the same visual cues necessarily. There's the classic Müller-Lyer illusion that everyone has seen, and that's just the really simple one of two straight horizontal lines with arrows on the ends. On one of them, the arrows are pointing out; on the other, the arrows are pointing in. And those two horizontal lines appear to be different lengths. So they did a study in South Africa, and they found that most of the European South Africans thought, yeah, look at them, they're different lengths. Then they showed it to, you know, the bushmen of South Africa, and they were like, no, dummies, they're the same length, can't you see that? And the researchers were like, what? Yeah, and they had some theories about it that kind of make sense, that Western societies may be a little more used to things that are built in straight lines and a little more geometrical, where the other culture might be just more attuned to nature, where there aren't so many straight lines. Right. And the explanation for the, what was it, the Meyer, the Mula... the Müller-Lyer effect, or optical illusion, is that, depending on which way the arrow is pointing, whether toward the end of the line or away from it, the brain is used to seeing corners, right: two walls coming together at a ceiling make that same kind of arrow. One that's pointing away means the point of it is further away, so it would make the line look longer, whereas one that's pointing inward would make it look like the corner is closest to us, so it would seem like the line is shorter. But the explanation was that, well, bushmen have never seen two walls come together at a ceiling, so that's why it didn't happen to them. And the thing that kind of disproved that is that they trained a computer to look at this stuff, and they didn't train it on three-dimensional objects, so it wasn't familiar with walls coming together at a ceiling, and it was fooled by it as well. So they were like, well, we have no idea what's going on, then. Bushmen are magic, is what they said.
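And here is a minimal Python/matplotlib sketch of the Müller-Lyer figure itself; the fin length and angle are arbitrary illustrative choices, and both shafts are drawn at exactly the same length.

```python
# Illustrative sketch (not from the show): the Müller-Lyer figure. The two
# horizontal shafts are identical in length; fins angled outward tend to make
# the top one look longer, fins angled inward make the bottom one look shorter.
import matplotlib.pyplot as plt

def mueller_lyer(ax, y, fins_out):
    """Draw one shaft of length 4 at height y, with fins angled out or in."""
    x0, x1, fin = -2.0, 2.0, 0.5
    ax.plot([x0, x1], [y, y], color="black", linewidth=2)
    # -1: fins extend beyond the ends; +1: fins fold back over the shaft
    direction = -1 if fins_out else 1
    for x_end, sign in ((x0, 1), (x1, -1)):
        dx = sign * direction * fin
        ax.plot([x_end, x_end + dx], [y, y + fin], color="black", linewidth=2)
        ax.plot([x_end, x_end + dx], [y, y - fin], color="black", linewidth=2)

fig, ax = plt.subplots(figsize=(6, 4))
mueller_lyer(ax, 1.0, fins_out=True)    # usually judged longer
mueller_lyer(ax, -1.0, fins_out=False)  # usually judged shorter

ax.set_xlim(-3.5, 3.5)
ax.set_ylim(-2.5, 2.5)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```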
I wonder why so many of these illusion enthusiasts seem to be, like, German and Austrian. I think it had to do with that being largely where psychology took off. Yeah, I guess that makes sense. I guess Escher was Dutch, bucking the trend, but yeah, it seems like a lot of these are German and Austrian. Yeah, I think it has to do with that being where the hot seat of psychology and brain research was at the time. Interesting. You got anything else? Actually, I do have one more. There was a guy named Hermann von Helmholtz. Oh, he wasn't German, right? No, nice Irish guy who's from Indiana. Von Helmholtz came up with these squares, right, that don't actually have confining lines or defining lines. They're just lines, equally spaced apart, that form, to the brain, a square. But the ones where the lines are horizontal seem smaller and shorter than the ones where they're vertical, which is weird, because if you're wearing, like, a horizontally striped shirt, everybody's like, you look fat in that shirt. Well, according to von Helmholtz, you don't. You should actually look slimmer, which surprised me. So I started wearing horizontal stripes as a result. You got your Charlie Brown shirt out. Yeah. Because that was sort of the old, I don't know if it's true or not, but they said that the New York Yankees designed their pinstripes to make Babe Ruth look thinner. I could totally buy that, but I don't know if that's true. I thought they had pinstripes before then. But Babe Ruth was eating a steak while they were fitting him for it, and he said, thanks for thinking of me. And he wasn't even using silverware. He's eating it with his hands? Yeah. And he also blended a steak into a milkshake and drank that along with his regular steak, right. And he didn't take the cigar out while he drank it; he just put that in the corner of his mouth. Yeah. And his after-dinner cognac. That's why we love Babe Ruth. Yeah. You know what we didn't get into at all, and I don't know if they even count as illusions or if they're something else, but boy, they were all the rage in the early nineties: those Magic Eye things. Yeah, where you stare at the thing and all of a sudden a ship pops out at you, if you're, you know, lucky enough to be able to see it. I know a lot of people who would just endlessly not be able to see them, and it would frustrate them to no end. I think, if I remember correctly, they advised that you stare into the middle ground. Yeah, and sort of unfocus your eyes. Yeah. I was looking those up. There's a Mental Floss article on it that was pretty brief, and it made sense.
581 00:35:06,600 --> 00:35:10,319 Speaker 1: I think they were machine vision researchers who were like, hey, 582 00:35:10,360 --> 00:35:12,640 Speaker 1: let's make some money on the side. They start 583 00:35:12,640 --> 00:35:15,600 Speaker 1: with like a depth map of something and put it 584 00:35:15,600 --> 00:35:18,600 Speaker 1: in grayscale. And I think they make two of them, 585 00:35:18,880 --> 00:35:22,719 Speaker 1: so your eyes are getting the two different versions of it, 586 00:35:22,760 --> 00:35:25,160 Speaker 1: but one smaller than the other, so it really makes 587 00:35:25,160 --> 00:35:28,320 Speaker 1: it pop as far as depth goes. And then somehow 588 00:35:28,360 --> 00:35:34,279 Speaker 1: the random repeating pattern that overlays it transmits that information 589 00:35:34,320 --> 00:35:38,279 Speaker 1: to your brain unconsciously. Well, so you did look it up, 590 00:35:38,320 --> 00:35:41,160 Speaker 1: then? I did. I don't know if I got it 591 00:35:41,239 --> 00:35:44,319 Speaker 1: fully right because it's actually kind of complex, but 592 00:35:44,440 --> 00:35:46,200 Speaker 1: I thought they did a pretty good job of describing it. 593 00:35:46,320 --> 00:35:50,360 Speaker 1: Could you see those? Yeah, sometimes. Sometimes, yeah. I always 594 00:35:50,400 --> 00:35:52,200 Speaker 1: could see them. And that's another one of those where 595 00:35:52,200 --> 00:35:55,600 Speaker 1: once you see it you can just immediately like draw 596 00:35:55,640 --> 00:35:58,680 Speaker 1: it out. Um, and of course there's the one 597 00:35:59,600 --> 00:36:03,919 Speaker 1: Ethan Suplee in Mallrats... sort of the one joke 598 00:36:04,000 --> 00:36:05,640 Speaker 1: through that movie. He just stares at this 599 00:36:05,680 --> 00:36:08,160 Speaker 1: thing like through the whole movie and he couldn't see it. 600 00:36:08,200 --> 00:36:10,879 Speaker 1: He couldn't see it, poor guy. What a great joke. 601 00:36:11,280 --> 00:36:13,680 Speaker 1: You know, speaking of that, something that's always bothered me: 602 00:36:14,040 --> 00:36:16,840 Speaker 1: Stephen King said, in one of his books or something 603 00:36:16,880 --> 00:36:19,360 Speaker 1: like that, he was talking about how you can't unsee something. 604 00:36:19,480 --> 00:36:21,920 Speaker 1: I thought you said he was talking about Mallrats. He was, 605 00:36:22,480 --> 00:36:24,400 Speaker 1: and he used the man in the moon as 606 00:36:24,440 --> 00:36:26,399 Speaker 1: an example. He's like, it's like the man in the moon. 607 00:36:26,440 --> 00:36:29,560 Speaker 1: Once you see it, you can't unsee it, right? 608 00:36:30,680 --> 00:36:32,640 Speaker 1: Can I... Like, I've seen the man in the moon before 609 00:36:32,760 --> 00:36:36,960 Speaker 1: and I totally can't find him again, so you can 610 00:36:37,040 --> 00:36:39,040 Speaker 1: unsee it. Stephen King is wrong. What is the 611 00:36:39,040 --> 00:36:41,359 Speaker 1: man in the moon? What are you talking about? You've 612 00:36:41,360 --> 00:36:44,400 Speaker 1: never seen the man in the moon? No, so I 613 00:36:44,440 --> 00:36:46,759 Speaker 1: guess you should probably look it up. I think it would help 614 00:36:47,000 --> 00:36:50,080 Speaker 1: to see somebody else pointing it out. And then when 615 00:36:50,080 --> 00:36:51,879 Speaker 1: you look at the full moon, 616 00:36:51,960 --> 00:36:53,959 Speaker 1: you should be able to see it.
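Going back to the Magic Eye description above: the following is one plausible, simplified way to generate that kind of image in Python with NumPy, not necessarily how the commercial posters were made. It starts from a made-up grayscale depth map (a square "popping out" of a flat background) and a row of random noise, then repeats the noise across each row with a repeat distance that shrinks slightly wherever the depth map is closer, which is the cue the two eyes can fuse into depth. The image size, period, and shift amount are arbitrary illustration values.

```python
# Minimal random-dot autostereogram sketch (one plausible reading of the
# "depth map plus repeating random pattern" idea): pixels repeat with a base
# period, and the repeat distance is nudged by a grayscale depth map, which
# the brain can fuse into depth when viewed wall-eyed.
import numpy as np
import matplotlib.pyplot as plt

HEIGHT, WIDTH = 300, 400
PERIOD = 60          # base repeat distance of the random pattern, in pixels
MAX_SHIFT = 12       # extra shift (pixels) for the nearest depth

# Hypothetical depth map: 0 = background, 1 = nearest. A square in the middle.
depth = np.zeros((HEIGHT, WIDTH))
depth[100:200, 150:250] = 1.0

rng = np.random.default_rng(0)
image = rng.random((HEIGHT, WIDTH))  # start from random noise everywhere

# For every pixel past the first period, copy the pixel one (depth-adjusted)
# period to the left. Closer depth -> shorter repeat -> perceived as nearer.
for y in range(HEIGHT):
    for x in range(PERIOD, WIDTH):
        shift = int(MAX_SHIFT * depth[y, x])
        image[y, x] = image[y, x - PERIOD + shift]

plt.imshow(image, cmap="gray")
plt.axis("off")
plt.show()
```

Viewed on screen with relaxed, "middle ground" focus as the hosts describe, the central square should appear to float above the noise; real Magic Eye images use a decorative repeating tile instead of raw noise, but the shifting trick is the same idea.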
But there's 617 00:36:54,120 --> 00:36:58,160 Speaker 1: a man looking down. It's Jackie Gleason. I don't think... 618 00:37:00,680 --> 00:37:04,520 Speaker 1: are you looking it up right now? Yeah. I never 619 00:37:04,600 --> 00:37:06,640 Speaker 1: knew that was a thing. That's weird. And then the 620 00:37:06,719 --> 00:37:09,279 Speaker 1: Japanese think it's a rabbit and that the rabbit is 621 00:37:09,360 --> 00:37:13,000 Speaker 1: up there making mochi. Really? I don't know what other 622 00:37:13,040 --> 00:37:18,359 Speaker 1: cultures think. Those are the two I'm familiar with. Huh. Yeah, 623 00:37:18,719 --> 00:37:22,520 Speaker 1: so, mochi. All right, if you want to know more 624 00:37:22,560 --> 00:37:25,920 Speaker 1: about optical illusions, type those words into the search bar 625 00:37:25,960 --> 00:37:28,920 Speaker 1: at how stuff works. Better yet, go to Michael Bach 626 00:37:29,080 --> 00:37:32,239 Speaker 1: dot d E and just have some fun. Uh, and 627 00:37:32,280 --> 00:37:34,960 Speaker 1: since I said d E, it's time for listener mail. 628 00:37:37,440 --> 00:37:41,640 Speaker 1: I'm gonna call this Aussie slang. We love our, and 629 00:37:41,680 --> 00:37:44,640 Speaker 1: I said Aussie, I meant Aussie, right, we love our 630 00:37:44,640 --> 00:37:47,960 Speaker 1: Australian listeners. We've got a lot of them. They've long 631 00:37:48,000 --> 00:37:51,600 Speaker 1: supported the show, so we'd like to shout them out. Yeah, Australia. Uh, 632 00:37:51,719 --> 00:37:55,799 Speaker 1: he said, g'day, fellas. That's pretty good. I'm not 633 00:37:55,800 --> 00:37:57,120 Speaker 1: gonna read the whole thing like that, but I will. 634 00:37:57,239 --> 00:38:00,879 Speaker 1: You just nailed Canberra. I'm a new listener from down 635 00:38:00,880 --> 00:38:03,960 Speaker 1: under and I'm doing my best to get through your podcast. 636 00:38:04,000 --> 00:38:05,480 Speaker 1: I love the show and finish every show with a 637 00:38:05,520 --> 00:38:08,560 Speaker 1: smile and some new fact to tell my mates about. Anyway, 638 00:38:08,560 --> 00:38:09,880 Speaker 1: I got a quick story for you to have a 639 00:38:09,920 --> 00:38:13,279 Speaker 1: laugh at and possibly be very confused by. The other night, 640 00:38:13,320 --> 00:38:15,760 Speaker 1: my mate and I were going on a, uh, Maccas 641 00:38:15,880 --> 00:38:18,480 Speaker 1: run, M A C C A S. I think we've 642 00:38:18,520 --> 00:38:22,120 Speaker 1: talked about that before, right? Isn't that a beer? I don't know. 643 00:38:22,760 --> 00:38:27,800 Speaker 1: Uh, Fosters was Australian. And he goes, oi mate, 644 00:38:28,360 --> 00:38:30,640 Speaker 1: after we've been to Maccas, we can drop by the 645 00:38:30,640 --> 00:38:33,759 Speaker 1: servo, grab a packet of dairies and then the bottle-o. I'll 646 00:38:33,760 --> 00:38:36,400 Speaker 1: grab a slab of VB stubbies and head back to 647 00:38:36,480 --> 00:38:38,800 Speaker 1: yours and get pissed. Okay, so let me, let me... 648 00:38:40,160 --> 00:38:45,560 Speaker 1: oi mate: hello, friend. After we've been to Maccas... 649 00:38:45,960 --> 00:38:48,880 Speaker 1: I don't know what that was. Next, we can drop 650 00:38:48,880 --> 00:38:52,440 Speaker 1: by the servo: we can go hang out with Tom Servo. 651 00:38:53,120 --> 00:38:55,160 Speaker 1: I bet you anything a servo is like a gas station. 652 00:38:56,520 --> 00:39:01,360 Speaker 1: Grab a packet of dairies: uh, get some milk. And 653 00:39:01,400 --> 00:39:06,400 Speaker 1: then the bottle-o.
Oh, uh, get a bottle. Grab a 654 00:39:06,400 --> 00:39:11,440 Speaker 1: slab of VB stubbies: get some ribs, I think. 655 00:39:11,480 --> 00:39:15,040 Speaker 1: I think that's it. Uh, and head back to yours 656 00:39:15,040 --> 00:39:18,800 Speaker 1: and get pissed, uh, and then go to sleep. I 657 00:39:18,840 --> 00:39:22,120 Speaker 1: think you're right on the money. Yeah. I know you 658 00:39:22,160 --> 00:39:24,360 Speaker 1: guys don't often do requests, but it'd be rad if you guys 659 00:39:24,360 --> 00:39:27,959 Speaker 1: did a podcast on Aussie slang's history and meanings, mostly 660 00:39:28,000 --> 00:39:30,399 Speaker 1: because I would love to hear Chuck's Aussie accent. Oh, 661 00:39:30,400 --> 00:39:33,880 Speaker 1: well, yeah. Granted, he didn't... wait, he didn't translate it 662 00:39:34,000 --> 00:39:37,719 Speaker 1: himself, though, so we'll never know whether I was completely right. 663 00:39:37,960 --> 00:39:40,879 Speaker 1: Someone, someone will. Uh. And I'd love to hear both 664 00:39:40,880 --> 00:39:43,400 Speaker 1: of you pronounce as much Aussie slang as possible, but 665 00:39:43,480 --> 00:39:46,360 Speaker 1: also because I'd like to have facts about why I 666 00:39:46,440 --> 00:39:49,759 Speaker 1: speak the way I do. Stay rad. And that is 667 00:39:49,800 --> 00:39:53,120 Speaker 1: from Liam, and he said, P.S., we swear a lot 668 00:39:53,360 --> 00:39:56,400 Speaker 1: down here. Uh, and if that's why you can't do 669 00:39:56,440 --> 00:40:00,160 Speaker 1: an Aussie slang podcast, I don't blame you. Well, I 670 00:40:00,200 --> 00:40:03,719 Speaker 1: swear a lot IRL, Liam, but we just 671 00:40:03,800 --> 00:40:08,279 Speaker 1: keep it clean for the show. That's right. Nice. Yeah, 672 00:40:08,719 --> 00:40:11,520 Speaker 1: well, thanks Liam. I'm not gonna do an Australian accent 673 00:40:11,560 --> 00:40:15,040 Speaker 1: because it would hurt everyone's ears. Uh, if you want 674 00:40:15,040 --> 00:40:16,839 Speaker 1: to get in touch with us, like Liam did, you 675 00:40:16,880 --> 00:40:19,560 Speaker 1: can tweet to us. I'm at josh um Clark and 676 00:40:19,760 --> 00:40:22,879 Speaker 1: at S Y S K Podcast, Chuck's on Facebook dot 677 00:40:22,880 --> 00:40:25,319 Speaker 1: com slash Stuff you Should Know and at Charles W. Chuck 678 00:40:25,400 --> 00:40:28,000 Speaker 1: Bryant, and you can send us an email to Stuff 679 00:40:28,040 --> 00:40:30,640 Speaker 1: Podcasts at how stuff works dot com, and as always, 680 00:40:30,719 --> 00:40:32,200 Speaker 1: hang out with us at our home on the web, 681 00:40:32,440 --> 00:40:39,200 Speaker 1: Stuff you Should Know dot com. For more on this 682 00:40:39,320 --> 00:40:41,839 Speaker 1: and thousands of other topics, visit how stuff works 683 00:40:41,880 --> 00:40:52,759 Speaker 1: dot com.