1 00:00:00,320 --> 00:00:02,960 Speaker 1: Brought to you by the reinvented two thousand twelve Camry. 2 00:00:03,200 --> 00:00:07,560 Speaker 1: It's ready. Are you? Welcome to Stuff You Should Know 3 00:00:08,200 --> 00:00:17,239 Speaker 1: from HowStuffWorks dot com. Hey, welcome to the podcast. 4 00:00:17,320 --> 00:00:19,919 Speaker 1: I'm Josh Clark. With me as always is Charles W. 5 00:00:20,160 --> 00:00:25,160 Speaker 1: "Chuck" Bryant, and uh, that makes this Stuff You Should Know. 6 00:00:25,560 --> 00:00:28,159 Speaker 1: Is your seat okay? Frank, the chair is letting me 7 00:00:28,200 --> 00:00:33,480 Speaker 1: down again. Yeah, he'll do that. The chair recently fell 8 00:00:33,520 --> 00:00:38,159 Speaker 1: in with a bad crowd, and I did too, from 9 00:00:38,159 --> 00:00:40,720 Speaker 1: the way he's making me sit. So I'll just, I'll 10 00:00:40,800 --> 00:00:44,960 Speaker 1: lean forward. He's become unreliable. Who is it that messes 11 00:00:45,000 --> 00:00:48,360 Speaker 1: with him? I don't know. Somebody who has no idea 12 00:00:48,360 --> 00:00:50,199 Speaker 1: how to sit in a chair properly. That's what you 13 00:00:50,240 --> 00:00:52,120 Speaker 1: need to get next, not just your own mic cover, 14 00:00:52,200 --> 00:00:54,800 Speaker 1: but your own chair. Yeah. I think it's the Strickland 15 00:00:54,840 --> 00:00:57,560 Speaker 1: guy from TechStuff. It would be very cool 16 00:00:57,600 --> 00:01:00,120 Speaker 1: if you had it lower down from the ceiling like 17 00:01:00,160 --> 00:01:03,320 Speaker 1: it was stored up there, and then hang like the 18 00:01:03,400 --> 00:01:06,640 Speaker 1: sword of Damocles over everybody else's head while they were recording. 19 00:01:08,000 --> 00:01:12,640 Speaker 1: We'll look into that, Chuck. Have you ever heard of 20 00:01:12,680 --> 00:01:19,360 Speaker 1: a little place called the Massachusetts Institute of Technology?
Yeah, okay, 21 00:01:19,840 --> 00:01:25,200 Speaker 1: go eggheads. Um. So MIT is like the hotbed, 22 00:01:25,480 --> 00:01:31,120 Speaker 1: the center of the linguistics field, among many other fields, 23 00:01:31,680 --> 00:01:37,640 Speaker 1: I didn't know. Yeah. Um, Noam Chomsky is there. Um, 24 00:01:37,720 --> 00:01:40,319 Speaker 1: and there's another guy whose name escapes me right now, 25 00:01:40,360 --> 00:01:44,320 Speaker 1: but he recently made some headlines because he, I guess, 26 00:01:44,319 --> 00:01:47,199 Speaker 1: got a grant and had his house wired with fisheye 27 00:01:47,240 --> 00:01:52,559 Speaker 1: cameras in every room, with really high tech audio 28 00:01:52,600 --> 00:01:56,880 Speaker 1: equipment too, and from the moment his newborn son came 29 00:01:56,920 --> 00:02:00,360 Speaker 1: home to the age of five, this guy recorded ninety 30 00:02:00,520 --> 00:02:06,320 Speaker 1: thousand hours, the whole five years of this kid's life, 31 00:02:06,360 --> 00:02:10,600 Speaker 1: in an effort to see how language acquisition develops in children, 32 00:02:10,639 --> 00:02:13,399 Speaker 1: and this child specifically. That's pretty cool. It is very cool. 33 00:02:13,440 --> 00:02:19,240 Speaker 1: There's a really clumsily titled, um, Fast Company article called 34 00:02:19,720 --> 00:02:23,239 Speaker 1: MIT Scientist Captures Ninety Thousand Hours of Video 35 00:02:23,400 --> 00:02:27,760 Speaker 1: of His Son's First Words, comma, Graphs It, comma. What, 36 00:02:28,120 --> 00:02:33,560 Speaker 1: graphs it? And then he graphs it. Yeah. The 37 00:02:33,680 --> 00:02:37,120 Speaker 1: editor was like, I'm going home. Yeah, exactly. So, um, 38 00:02:37,160 --> 00:02:39,360 Speaker 1: but anyway, there's some video and some audio clips in 39 00:02:39,360 --> 00:02:41,400 Speaker 1: there where you can hear, like, this condensed, like, over 40 00:02:41,400 --> 00:02:43,200 Speaker 1: five years or over like a six month period.
Or 41 00:02:43,200 --> 00:02:45,520 Speaker 1: something like that. That's like the kid going from like 42 00:02:45,600 --> 00:02:49,600 Speaker 1: gaga to water, and you can hear it, like, evolve. Interesting. 43 00:02:50,280 --> 00:02:52,120 Speaker 1: Did they learn anything from that? I don't know if 44 00:02:52,120 --> 00:02:54,799 Speaker 1: they have quite yet. And plus, I mean, like, this 45 00:02:54,840 --> 00:02:58,480 Speaker 1: is one child. But it's at the very least very, 46 00:02:58,560 --> 00:03:03,519 Speaker 1: very interesting. Um. But the, the idea that you can 47 00:03:03,720 --> 00:03:07,040 Speaker 1: learn something about the evolution of language in human beings 48 00:03:07,080 --> 00:03:13,480 Speaker 1: from language acquisition in children is a hotly contested idea. Um, 49 00:03:13,840 --> 00:03:17,000 Speaker 1: you wrote what I think is a very fine article. Thanks. 50 00:03:17,120 --> 00:03:19,280 Speaker 1: You did a good job with this. How Did Language Evolve? 51 00:03:19,360 --> 00:03:22,320 Speaker 1: It was a shorty. You were talking about, though, not how 52 00:03:22,560 --> 00:03:26,120 Speaker 1: we acquire language skills as, as um, kids, but as 53 00:03:26,200 --> 00:03:30,880 Speaker 1: a species. How did humans acquire language? Because we're the 54 00:03:30,880 --> 00:03:33,480 Speaker 1: only ones that can say things like this. But you 55 00:03:33,560 --> 00:03:35,080 Speaker 1: go to great lengths to point out that we're not 56 00:03:35,080 --> 00:03:38,600 Speaker 1: the only ones that communicate. True. I wouldn't say great 57 00:03:38,640 --> 00:03:41,320 Speaker 1: lengths. It was like two or 58 00:03:41,320 --> 00:03:46,440 Speaker 1: three sentences. Sure, animals communicate. Well, no, it is.
It 59 00:03:46,560 --> 00:03:47,920 Speaker 1: is very true, and I think it was a good 60 00:03:47,920 --> 00:03:50,320 Speaker 1: thing to start off with, because humans can often be 61 00:03:50,440 --> 00:03:58,120 Speaker 1: very, um, homocentric. You know, so you say birds chirp, 62 00:04:00,040 --> 00:04:05,400 Speaker 1: porpoises go, right? Yeah. Uh, they are communicating. We're the 63 00:04:05,480 --> 00:04:10,040 Speaker 1: only ones who verbalize. That's right. Right. Yes, we talk words. 64 00:04:10,200 --> 00:04:16,599 Speaker 1: And we don't know exactly how this evolved for sure, because, um, 65 00:04:16,640 --> 00:04:18,839 Speaker 1: there's a problem when it comes to things like evolution. 66 00:04:18,880 --> 00:04:20,920 Speaker 1: There's not a ton of evidence a lot of times, 67 00:04:21,640 --> 00:04:26,360 Speaker 1: like hard evidence. Um, I read this one guy's paper. 68 00:04:26,400 --> 00:04:28,520 Speaker 1: There's a lot of papers on this. Yeah, this is 69 00:04:28,560 --> 00:04:30,840 Speaker 1: a really... First of all, I want you to just 70 00:04:30,880 --> 00:04:34,279 Speaker 1: be very quiet. Do you hear that, off in the 71 00:04:34,360 --> 00:04:38,200 Speaker 1: distance? The explosions? Yes, we're standing in the midst of 72 00:04:38,279 --> 00:04:43,479 Speaker 1: a minefield. Linguistics is a minefield, and they 73 00:04:43,560 --> 00:04:48,719 Speaker 1: love, like, linguistics people really love language and talking about 74 00:04:48,760 --> 00:04:51,560 Speaker 1: it, like, and putting down people who disagree with them. 75 00:04:51,960 --> 00:04:55,680 Speaker 1: So we should tread lightly here. We should. Um. But 76 00:04:55,760 --> 00:04:58,560 Speaker 1: one guy's paper that I read today, some university paper.
77 00:04:59,000 --> 00:05:01,680 Speaker 1: He said that ideally, if we're going to study something 78 00:05:01,720 --> 00:05:04,640 Speaker 1: like these neurological changes that happened in the brain, we 79 00:05:04,640 --> 00:05:09,840 Speaker 1: would have, um, a large number of petrified whole brains 80 00:05:10,680 --> 00:05:14,560 Speaker 1: representing lots of species over lots of time. But we 81 00:05:14,600 --> 00:05:20,080 Speaker 1: don't have that. Unfortunately, there are big gaps. Uh. And even, 82 00:05:20,200 --> 00:05:23,360 Speaker 1: even not taking into account gaps, we don't have 83 00:05:23,520 --> 00:05:27,640 Speaker 1: fossilized brains. Yeah, the closest thing we have is a 84 00:05:27,680 --> 00:05:30,680 Speaker 1: fossilized skull, which we can analyze and be like, well, 85 00:05:30,720 --> 00:05:33,159 Speaker 1: there is kind of room for a big enough brain, 86 00:05:33,240 --> 00:05:37,719 Speaker 1: maybe, for language. Yeah. What's that called? Cranial endocasting, I 87 00:05:37,760 --> 00:05:41,080 Speaker 1: think. So, um, that's a, that's a good term for it, 88 00:05:41,200 --> 00:05:44,919 Speaker 1: and people won't even know if I'm wrong. Um. What 89 00:05:45,040 --> 00:05:47,680 Speaker 1: they do have a little bit of evidence on is 90 00:05:47,800 --> 00:05:53,159 Speaker 1: that the shape of our vocal tract, um, until 91 00:05:53,160 --> 00:05:55,640 Speaker 1: about a hundred thousand years ago, wasn't even 92 00:05:55,680 --> 00:06:01,239 Speaker 1: able to make, um, the vocalizations of modern speech sounds. 93 00:06:01,920 --> 00:06:05,320 Speaker 1: So it wasn't even possible. Although that doesn't necessarily mean 94 00:06:05,320 --> 00:06:07,800 Speaker 1: there wasn't language, because it just could have been a 95 00:06:07,960 --> 00:06:12,720 Speaker 1: much more primitive version of what evolved, like grunts. Yeah, exactly, 96 00:06:12,920 --> 00:06:15,000 Speaker 1: and I did see someone.
Even though all this 97 00:06:15,080 --> 00:06:17,800 Speaker 1: was poo-pooed by most people, uh, some people think 98 00:06:17,800 --> 00:06:22,320 Speaker 1: that spoken language evolved from sign language, and that 99 00:06:22,440 --> 00:06:26,440 Speaker 1: our modern gestures are a holdover from that, which I thought 100 00:06:26,480 --> 00:06:29,320 Speaker 1: was interesting, but most people go, nah, that's not true. Um, 101 00:06:29,360 --> 00:06:31,760 Speaker 1: I ran into another one. And what you're talking about, um, 102 00:06:31,800 --> 00:06:36,680 Speaker 1: for everybody listening, is called a proto-language, and evidence 103 00:06:36,680 --> 00:06:39,120 Speaker 1: of a proto-language supports one theory, which we'll talk 104 00:06:39,160 --> 00:06:41,960 Speaker 1: about in a minute. But, um, one idea for a 105 00:06:42,000 --> 00:06:45,360 Speaker 1: proto-language is that we started talking using onomatopoeia, 106 00:06:46,200 --> 00:06:49,159 Speaker 1: which would make snap, crackle, and pop, like, the oldest words 107 00:06:49,440 --> 00:06:53,520 Speaker 1: on earth, you know. Um. But, well, I guess we 108 00:06:53,560 --> 00:06:55,760 Speaker 1: should probably get into it now. Like, what, are there 109 00:06:55,800 --> 00:07:00,800 Speaker 1: basically two competing theories for how we acquired language 110 00:07:00,800 --> 00:07:04,240 Speaker 1: as a species? Right? Yeah. And I like both of them. 111 00:07:04,360 --> 00:07:06,320 Speaker 1: I noticed at the end you're like, why can't we 112 00:07:06,360 --> 00:07:09,560 Speaker 1: all just get along? As far as linguistics go, because 113 00:07:09,560 --> 00:07:11,200 Speaker 1: I'm not a linguist, I'm not gonna sit 114 00:07:11,200 --> 00:07:14,080 Speaker 1: here and pooh-pooh and argue, um, because I'm not smart enough. 115 00:07:14,120 --> 00:07:18,800 Speaker 1: I don't know enough about it.
But the first, Josh, 116 00:07:19,000 --> 00:07:24,600 Speaker 1: is, um, that we adapted to survive, so we learned 117 00:07:24,600 --> 00:07:27,400 Speaker 1: how to speak. That's kind of the simplest way 118 00:07:27,440 --> 00:07:29,400 Speaker 1: to say it. Um. The example I gave in here 119 00:07:29,560 --> 00:07:32,880 Speaker 1: is, uh, and then we'll talk about who, you know, 120 00:07:33,320 --> 00:07:38,120 Speaker 1: who are the leaders in this whole category that believe this. 121 00:07:38,240 --> 00:07:41,480 Speaker 1: But, uh, Took Took's hunting on the range, on the 122 00:07:41,480 --> 00:07:46,400 Speaker 1: plains, in the savannah, and Took Took, um, thunder scares 123 00:07:46,400 --> 00:07:49,920 Speaker 1: away the deer, so Took Took goes hungry. So then 124 00:07:50,080 --> 00:07:53,560 Speaker 1: later on, Took Took has already maybe learned to grunt 125 00:07:53,600 --> 00:07:56,280 Speaker 1: about, like, the deer being, being nearby to his buddy. 126 00:07:56,560 --> 00:07:58,680 Speaker 1: Who's his friend? Did we ever name my friend? 127 00:07:58,880 --> 00:08:04,360 Speaker 1: Oh, Morty. Morty. Yeah, so he's already learned how to 128 00:08:04,400 --> 00:08:08,080 Speaker 1: tell Morty that deer are nearby, so shut up, right, because 129 00:08:08,120 --> 00:08:12,440 Speaker 1: Morty talks incessantly. Uh. So, now all of a sudden, 130 00:08:12,520 --> 00:08:16,280 Speaker 1: he learns that thunder and bad weather might scare deer away, 131 00:08:16,320 --> 00:08:18,400 Speaker 1: so he goes hungry. So he learns, now I've got 132 00:08:18,400 --> 00:08:21,600 Speaker 1: to learn, like, what bad weather looks like coming in, 133 00:08:21,720 --> 00:08:24,680 Speaker 1: and how to tell Morty, hey, pick up the pace, dude, 134 00:08:24,680 --> 00:08:27,000 Speaker 1: because bad weather is coming and we don't want to 135 00:08:27,000 --> 00:08:29,760 Speaker 1: go hungry again.
So that was just one of the 136 00:08:29,800 --> 00:08:34,360 Speaker 1: stepping stones in evolving speech. Right. And it is kind 137 00:08:34,360 --> 00:08:37,880 Speaker 1: of like, um, the, the idea behind it is that 138 00:08:38,080 --> 00:08:42,080 Speaker 1: the speech evolved out of the combinations of these things, 139 00:08:42,120 --> 00:08:44,440 Speaker 1: like you're saying. Yeah, so you put them together and 140 00:08:44,480 --> 00:08:46,840 Speaker 1: all of a sudden, huh, that makes a lot of sense. 141 00:08:46,880 --> 00:08:50,880 Speaker 1: I'm able to describe some larger portion of the world 142 00:08:50,920 --> 00:08:54,480 Speaker 1: around us. Yeah. And then it got more complex, the language 143 00:08:54,520 --> 00:08:56,800 Speaker 1: had to, like, as they learned more things, right? Like, 144 00:08:56,840 --> 00:08:59,600 Speaker 1: when we settled down, agriculture would have had like 145 00:08:59,640 --> 00:09:02,520 Speaker 1: a huge impact, something like that. Yeah, and keeping 146 00:09:02,679 --> 00:09:05,960 Speaker 1: children alive, apparently. Like, once we settled in villages, a 147 00:09:05,960 --> 00:09:07,839 Speaker 1: lot of people think that language really took a leap 148 00:09:07,880 --> 00:09:11,280 Speaker 1: forward, because we had to, you know, keep the species 149 00:09:11,320 --> 00:09:16,040 Speaker 1: alive by protecting the kids. Um.
I guess also the 150 00:09:16,480 --> 00:09:21,160 Speaker 1: idea that you could warn somebody about something, right, that 151 00:09:21,520 --> 00:09:25,120 Speaker 1: isn't necessarily just something you could point at and be like, 152 00:09:25,320 --> 00:09:28,560 Speaker 1: you know, let's get out of here, through gesture. Something 153 00:09:28,600 --> 00:09:31,120 Speaker 1: maybe further away, something that you couldn't see right then. 154 00:09:31,559 --> 00:09:35,760 Speaker 1: That would, that would lead directly to, um, to a 155 00:09:35,880 --> 00:09:39,400 Speaker 1: trait that led to survival, which is the 156 00:09:39,440 --> 00:09:42,719 Speaker 1: whole basis of natural selection, which means that people who 157 00:09:42,760 --> 00:09:45,000 Speaker 1: could do that would be able to go reproduce, and 158 00:09:45,120 --> 00:09:48,040 Speaker 1: that trait would survive and be passed along. And I 159 00:09:48,040 --> 00:09:52,200 Speaker 1: imagine reproduction and all needed its own language as well, right? 160 00:09:52,520 --> 00:09:56,079 Speaker 1: You know, I'm like, hey, mama. Although in Quest for 161 00:09:56,200 --> 00:09:59,800 Speaker 1: Fire it pretty much just happened, didn't it? I haven't seen 162 00:09:59,840 --> 00:10:02,520 Speaker 1: that in so long. I think that was the first movie 163 00:10:02,520 --> 00:10:04,720 Speaker 1: I ever saw on Showtime. Yeah, there wasn't a 164 00:10:04,720 --> 00:10:06,920 Speaker 1: lot of words going on. It was like, you know, 165 00:10:07,040 --> 00:10:10,520 Speaker 1: the ladies are down by the river bending over, filling 166 00:10:10,600 --> 00:10:14,680 Speaker 1: up water buckets, or, you know, water pods, and 167 00:10:14,760 --> 00:10:18,720 Speaker 1: man comes along and just, you know, takes care of business. Yeah. 168 00:10:19,120 --> 00:10:21,959 Speaker 1: Is that, is that an ancient phrase, takes care of business? 169 00:10:22,360 --> 00:10:29,360 Speaker 1: TCB. Yeah.
So I remember Quest for Fire and another movie, um, 170 00:10:29,600 --> 00:10:31,640 Speaker 1: were out on Showtime at about the same time. A 171 00:10:31,800 --> 00:10:35,120 Speaker 1: movie called Caveman, and it starred Ringo Starr. Is 172 00:10:35,160 --> 00:10:37,760 Speaker 1: that his picture that is in this article? Yeah. Don't 173 00:10:37,760 --> 00:10:40,280 Speaker 1: you like my caption? Yeah. That's, I couldn't, I couldn't 174 00:10:40,280 --> 00:10:43,199 Speaker 1: tell just by looking at it, but, um, yeah, the caption, 175 00:10:43,240 --> 00:10:44,839 Speaker 1: I think, is what gave it away. These were the 176 00:10:44,880 --> 00:10:46,960 Speaker 1: old days where, like, the highlight of my 177 00:10:46,960 --> 00:10:50,079 Speaker 1: week was writing really clever picture captions for articles. I 178 00:10:50,080 --> 00:10:51,720 Speaker 1: would go home and say, look at this one, Emily, 179 00:10:52,040 --> 00:10:54,320 Speaker 1: it's pretty good. Someone might get this joke. So there's 180 00:10:54,320 --> 00:10:58,120 Speaker 1: a production still from Caveman of Ringo Starr standing 181 00:10:58,120 --> 00:11:01,040 Speaker 1: there, and the caption is: this caveman gets 182 00:11:01,040 --> 00:11:04,720 Speaker 1: by with a little help from his friends. Beautiful, Matty. 183 00:11:04,840 --> 00:11:10,280 Speaker 1: Guest producer Matty. Yeah. So that's adaptation theory, that basically 184 00:11:10,320 --> 00:11:13,120 Speaker 1: we figured out that we could survive better and more 185 00:11:13,240 --> 00:11:17,040 Speaker 1: robustly by talking to one another, and language evolved in 186 00:11:17,200 --> 00:11:19,960 Speaker 1: fits and starts through there. Right. Yeah. And who, who's, 187 00:11:19,960 --> 00:11:23,800 Speaker 1: gradually, I should say. Gradually, yes. Um, and that's Steven Pinker, 188 00:11:23,840 --> 00:11:25,960 Speaker 1: who's a great dude. Yeah.
Pinker and Bloom, in their 189 00:11:26,000 --> 00:11:30,320 Speaker 1: paper Natural Language and Natural Selection. Uh, I mean, there 190 00:11:30,320 --> 00:11:31,920 Speaker 1: are a lot of people who agree with them and 191 00:11:31,920 --> 00:11:34,959 Speaker 1: have written, quote unquote, the book on this since then, 192 00:11:35,679 --> 00:11:41,680 Speaker 1: or several books. It's a very dense subject. Makes my mind 193 00:11:41,760 --> 00:11:46,800 Speaker 1: melt a little bit. Um, and Pinker and Bloom basically say, uh, 194 00:11:47,920 --> 00:11:51,560 Speaker 1: this is the case. It makes sense. This is just 195 00:11:52,040 --> 00:11:57,640 Speaker 1: standard Darwinian natural selection. What's the problem? I don't have 196 00:11:57,679 --> 00:12:02,160 Speaker 1: a problem with it. Why doesn't everybody just get on board? Uh, well, 197 00:12:02,200 --> 00:12:04,880 Speaker 1: because there's another competing theory, and there are all sorts 198 00:12:04,920 --> 00:12:08,480 Speaker 1: of sub-theories, but these are the two big, big daddies. Um, 199 00:12:08,520 --> 00:12:12,679 Speaker 1: and this is Noam Chomsky and, um, evolutionary biologist 200 00:12:12,760 --> 00:12:16,719 Speaker 1: Stephen Jay Gould, and they think that it was a 201 00:12:16,760 --> 00:12:21,320 Speaker 1: spandrel, or an exaptation. So you know what a spandrel 202 00:12:21,440 --> 00:12:26,160 Speaker 1: is? Well, in biology, or for real, in architecture? But 203 00:12:26,400 --> 00:12:30,840 Speaker 1: please explain. Okay. Well, Stephen Jay Gould, um, coined the 204 00:12:30,920 --> 00:12:33,920 Speaker 1: term spandrel, as you point out in the article.
Um, 205 00:12:33,960 --> 00:12:37,400 Speaker 1: and it's just perfect, actually, in this application, because the spandrel, 206 00:12:37,520 --> 00:12:42,440 Speaker 1: architecturally speaking, is, um, this triangular area that inevitably is 207 00:12:42,520 --> 00:12:46,000 Speaker 1: created when you put two arched domes next to one another 208 00:12:46,080 --> 00:12:49,000 Speaker 1: at right angles, and it looks like, if you're looking 209 00:12:49,040 --> 00:12:53,559 Speaker 1: at it, it looks like purposeful design, like ornamentation. But it's 210 00:12:53,559 --> 00:12:57,320 Speaker 1: actually a byproduct you can't get around. And that's what 211 00:12:57,400 --> 00:13:00,480 Speaker 1: a spandrel is, as far as Gould is concerned: the 212 00:13:00,559 --> 00:13:07,240 Speaker 1: product of another evolutionary process. Right, and language supposedly, 213 00:13:07,800 --> 00:13:10,360 Speaker 1: as far as Gould and Chomsky are concerned, just kind 214 00:13:10,360 --> 00:13:15,640 Speaker 1: of came about as the result of other stuff, specifically toolmaking. Yeah, 215 00:13:15,840 --> 00:13:19,720 Speaker 1: Darwin called it pre-adaptation, and it later became exaptation. And 216 00:13:19,880 --> 00:13:22,520 Speaker 1: which one do you like more, pre-adaptation or exaptation? 217 00:13:22,800 --> 00:13:25,000 Speaker 1: Exaptation is a little hard to say, so I'm gonna 218 00:13:25,040 --> 00:13:31,160 Speaker 1: go pre-adaptation. It sounds so, like, important for us, you know. Um, 219 00:13:31,240 --> 00:13:33,200 Speaker 1: but a quick example of that, and this is the 220 00:13:33,240 --> 00:13:36,320 Speaker 1: one most often cited, is that, uh, there's a theory 221 00:13:36,320 --> 00:13:38,959 Speaker 1: out there that bird feathers were originally meant to keep birds warm, 222 00:13:39,320 --> 00:13:44,360 Speaker 1: and flying came about after that, as a spandrel. Makes sense. 223 00:13:46,040 --> 00:13:51,320 Speaker 1: What's the problem? Exactly.
So you said that our brains 224 00:13:51,320 --> 00:13:54,480 Speaker 1: adapted to where we could, they got larger, to where 225 00:13:54,480 --> 00:13:57,960 Speaker 1: we could make tools and things, and language came about 226 00:13:58,000 --> 00:14:00,400 Speaker 1: as the result of that. And this isn't just 227 00:14:00,520 --> 00:14:02,560 Speaker 1: kind of, I mean, it's not like they're like, well, 228 00:14:02,600 --> 00:14:06,679 Speaker 1: we can run, so we can talk. Um, there's specific 229 00:14:06,800 --> 00:14:10,320 Speaker 1: areas of the brain that are associated with both toolmaking 230 00:14:10,679 --> 00:14:16,080 Speaker 1: and tool use, and language. Um, and there's actually two. 231 00:14:16,440 --> 00:14:20,320 Speaker 1: There's, um, Broca's area and there's Wernicke's area, and 232 00:14:20,480 --> 00:14:24,640 Speaker 1: Broca's area was named after a French neurosurgeon named Paul Broca, 233 00:14:25,200 --> 00:14:29,080 Speaker 1: and in eighteen sixty-one he, um, described a patient 234 00:14:29,200 --> 00:14:32,400 Speaker 1: named Tan. Tan wasn't the guy's real name. No one 235 00:14:32,440 --> 00:14:36,680 Speaker 1: knew his real name. Um, the only thing, 236 00:14:36,840 --> 00:14:39,920 Speaker 1: the only syllable he could pronounce, that he could form, 237 00:14:40,280 --> 00:14:43,800 Speaker 1: was tan. So they're like, well, that's your name, pal. Um, 238 00:14:43,880 --> 00:14:46,400 Speaker 1: and after he died, Broca opened up his skull and 239 00:14:46,440 --> 00:14:49,120 Speaker 1: looked at his brain and found a huge lesion on 240 00:14:49,200 --> 00:14:53,160 Speaker 1: the area now named Broca's area, and that's come 241 00:14:53,240 --> 00:14:56,200 Speaker 1: to be associated with speech production. The weird thing about 242 00:14:56,200 --> 00:15:00,760 Speaker 1: Tan is he could understand spoken language.
If you're Tan, 243 00:15:00,840 --> 00:15:03,360 Speaker 1: you look, um, you look kind of tan. I think 244 00:15:03,440 --> 00:15:05,120 Speaker 1: maybe you should stay out of the sun. He could 245 00:15:05,880 --> 00:15:08,000 Speaker 1: not even say stay out of the sun. There wasn't, there wasn't 246 00:15:08,040 --> 00:15:12,840 Speaker 1: anything wrong with him other than he could not produce speech. Well, 247 00:15:12,880 --> 00:15:15,080 Speaker 1: I bet he was really ticked off with his name, 248 00:15:15,120 --> 00:15:17,560 Speaker 1: then. All he can say is tan, and they're like, we'll just 249 00:15:17,600 --> 00:15:20,720 Speaker 1: call you Tan. And he was probably like, no! Yeah, 250 00:15:20,920 --> 00:15:24,280 Speaker 1: it's Ignatius! Anything but Tan! Yeah. I mean, I, I 251 00:15:24,280 --> 00:15:27,240 Speaker 1: imagine the guy probably was, like, you know, half mad 252 00:15:27,280 --> 00:15:29,960 Speaker 1: by the time he died, just out of frustration. Well, 253 00:15:30,240 --> 00:15:32,800 Speaker 1: stroke patients, you know. My grandfather had a stroke and 254 00:15:33,600 --> 00:15:36,000 Speaker 1: tried to speak. In his head he was saying words, 255 00:15:36,040 --> 00:15:37,760 Speaker 1: but it would come out as gobbledygook, and he would 256 00:15:37,800 --> 00:15:39,960 Speaker 1: get really frustrated. It was very sad. So that 257 00:15:40,040 --> 00:15:43,280 Speaker 1: was your grandfather. So what it sounds like your grandfather, 258 00:15:43,600 --> 00:15:47,160 Speaker 1: um, had a problem with was his Wernicke's area. Yeah, 259 00:15:47,480 --> 00:15:50,560 Speaker 1: and that was named after a German neurosurgeon who 260 00:15:50,680 --> 00:15:56,240 Speaker 1: found that, um, his patients who could speak but 261 00:15:56,240 --> 00:15:59,680 Speaker 1: weren't making any sense had lesions on the area now 262 00:15:59,720 --> 00:16:01,280 Speaker 1: known as Wernicke's area.
So if you put 263 00:16:01,280 --> 00:16:03,960 Speaker 1: the two together, Broca's area, which is involved in 264 00:16:04,000 --> 00:16:07,120 Speaker 1: speech production, and Wernicke's area, which is involved with, 265 00:16:07,320 --> 00:16:13,800 Speaker 1: um, speech comprehension, language comprehension, you have normally talking people 266 00:16:13,920 --> 00:16:17,640 Speaker 1: like us. Yes. And we first saw Wernicke's area, 267 00:16:18,040 --> 00:16:21,960 Speaker 1: I think it was Ricky Wernicke, the guy. Wernicke's 268 00:16:22,040 --> 00:16:27,360 Speaker 1: area and Broca's area, um, and the temporal, parietal, 269 00:16:27,480 --> 00:16:31,400 Speaker 1: and, uh, occipital lobes of the brain physically connected for 270 00:16:31,440 --> 00:16:36,080 Speaker 1: the first time in Homo habilis, or habilis. So wait, 271 00:16:36,120 --> 00:16:38,720 Speaker 1: what was it, what's that called, where you examine skulls 272 00:16:38,800 --> 00:16:41,480 Speaker 1: to see if there was probably some brain there? Um, 273 00:16:41,520 --> 00:16:46,400 Speaker 1: I believe that it's called cranial endocasting. Nice. I think so. 274 00:16:46,400 --> 00:16:49,160 Speaker 1: So they think that, um, that this, this would make 275 00:16:49,200 --> 00:16:51,640 Speaker 1: a lot of sense if Homo habilis, um, was the 276 00:16:51,640 --> 00:16:56,680 Speaker 1: first one to talk, because they also often associate Homo 277 00:16:56,720 --> 00:16:59,800 Speaker 1: habilis as the first one to use tools.
Right, this 278 00:16:59,880 --> 00:17:03,560 Speaker 1: is, this has, I found recently, come 279 00:17:03,640 --> 00:17:07,280 Speaker 1: under question. Possibly, um, the oldest tools, the Oldowan 280 00:17:07,320 --> 00:17:12,600 Speaker 1: tools, which are like, um, scrapers, hammers, um, I 281 00:17:12,640 --> 00:17:16,719 Speaker 1: think brain crushers, right, basically just stone tools that are 282 00:17:16,800 --> 00:17:19,840 Speaker 1: used to, like, skin meat off a bone, they're like 283 00:17:19,840 --> 00:17:22,199 Speaker 1: two point three million years old. They think that they 284 00:17:22,280 --> 00:17:25,760 Speaker 1: might be slightly older than Homo habilis. Then the other 285 00:17:25,800 --> 00:17:29,280 Speaker 1: problem with linking language and humans and Homo habilis is 286 00:17:29,320 --> 00:17:34,119 Speaker 1: we're not, we're not sure we and habilis are on the same tree. 287 00:17:34,640 --> 00:17:39,120 Speaker 1: Oh yeah. But nonetheless, Homo habilis does have the cute 288 00:17:39,200 --> 00:17:42,520 Speaker 1: nickname "handy man." I've never heard that. Because he was 289 00:17:42,560 --> 00:17:45,560 Speaker 1: supposedly the first tool user, and that makes sense. Yeah, 290 00:17:45,640 --> 00:17:48,840 Speaker 1: it's better than Bob the Builder. But there's still that, 291 00:17:48,840 --> 00:17:53,880 Speaker 1: that link right there between tool use and, um, language, right, 292 00:17:54,119 --> 00:17:57,959 Speaker 1: which they think makes him and her much more 293 00:17:58,000 --> 00:18:05,359 Speaker 1: advanced than the Australopithecus, who came before Homo habilis. So, um, 294 00:18:05,520 --> 00:18:08,360 Speaker 1: the whole reason why this is important is because they're 295 00:18:08,400 --> 00:18:12,000 Speaker 1: trying to nail down where language first came about.
And 296 00:18:13,320 --> 00:18:16,880 Speaker 1: if you subscribe to Gould and Chomsky, it just, all 297 00:18:16,880 --> 00:18:18,400 Speaker 1: of a sudden it was there and people were talking 298 00:18:18,440 --> 00:18:21,120 Speaker 1: to each other. Yeah, it was like one, one mutation 299 00:18:21,280 --> 00:18:23,800 Speaker 1: happened, and then all of a sudden people were able 300 00:18:23,840 --> 00:18:26,560 Speaker 1: to speak, and they were like, oh man, I've been 301 00:18:26,600 --> 00:18:30,760 Speaker 1: wanting to get some stuff off my chest for generations. Um, 302 00:18:30,800 --> 00:18:33,719 Speaker 1: if you listen to Pinker and Bloom, or, you know what, 303 00:18:33,760 --> 00:18:35,800 Speaker 1: I feel bad for Bloom. If you listen to Bloom 304 00:18:35,840 --> 00:18:40,400 Speaker 1: and Pinker, um, I mean, we know about that, don't we? Uh, 305 00:18:40,440 --> 00:18:42,680 Speaker 1: if you listen to Bloom and Pinker, then it took, 306 00:18:42,920 --> 00:18:45,280 Speaker 1: you know, a very long time for language to evolve, 307 00:18:45,400 --> 00:18:50,240 Speaker 1: and gradually, by putting combinations together. The thing is, Gould, 308 00:18:50,359 --> 00:18:53,080 Speaker 1: before he died, said, you know what, there's not nearly 309 00:18:53,240 --> 00:18:59,160 Speaker 1: enough time for language to evolve. And what's more, if 310 00:18:59,200 --> 00:19:02,480 Speaker 1: there was some sort of gradual evolution of language, then 311 00:19:02,560 --> 00:19:08,840 Speaker 1: chimps should show some sort of propensity towards language. They do, 312 00:19:09,160 --> 00:19:13,040 Speaker 1: but apparently not in any way that any linguist who's 313 00:19:13,200 --> 00:19:18,400 Speaker 1: sane would call actual language, the beginnings of language.
Um, 314 00:19:18,440 --> 00:19:21,320 Speaker 1: it's communication, but not actual language, right, like you mentioned 315 00:19:21,320 --> 00:19:24,960 Speaker 1: at the beginning of the article. Sure. But, um, Bloom 316 00:19:24,960 --> 00:19:30,000 Speaker 1: and Pinker point out, so, chimps and humans diverged about 317 00:19:30,040 --> 00:19:33,560 Speaker 1: six million years ago. That's three hundred thousand generations for 318 00:19:33,640 --> 00:19:37,040 Speaker 1: language to evolve. That's plenty of time, they say, and 319 00:19:37,119 --> 00:19:41,199 Speaker 1: Gould from beyond the grave says, no, it's not, as 320 00:19:41,240 --> 00:19:45,000 Speaker 1: he did. Okay. Uh, well, you know what Pinker actually 321 00:19:45,000 --> 00:19:48,760 Speaker 1: said about that, in his defense, was: look at the 322 00:19:48,840 --> 00:19:52,040 Speaker 1: hyrax, h-y-r-a-x, it is. Um, 323 00:19:52,080 --> 00:19:54,879 Speaker 1: because people say, well, we see in the DNA, the 324 00:19:54,960 --> 00:19:58,960 Speaker 1: hyrax shares DNA with the African elephant. And 325 00:19:59,080 --> 00:20:00,639 Speaker 1: if you look at a hyrax, it looks like 326 00:20:00,680 --> 00:20:04,080 Speaker 1: a large rat. Oh, it looks nothing like an elephant. 327 00:20:04,160 --> 00:20:06,240 Speaker 1: So he's like, just because you share all that DNA 328 00:20:06,359 --> 00:20:10,800 Speaker 1: doesn't mean that you're gonna evolve the exact same way. Yeah, 329 00:20:11,240 --> 00:20:15,159 Speaker 1: so that makes sense. Yeah, and some people posit, and 330 00:20:15,200 --> 00:20:18,399 Speaker 1: I sort of agree, that they're not mutually exclusive. You 331 00:20:18,440 --> 00:20:21,240 Speaker 1: don't have to have one without the other. Uh.
It 332 00:20:21,560 --> 00:20:25,439 Speaker 1: may have been exaptation, and then from that point it 333 00:20:25,520 --> 00:20:27,960 Speaker 1: may have very much been a matter of natural selection, 334 00:20:28,040 --> 00:20:31,760 Speaker 1: because the better you were communicating, the better you were surviving. 335 00:20:33,000 --> 00:20:36,800 Speaker 1: I like that idea too. Um, I don't think, though, 336 00:20:36,840 --> 00:20:40,040 Speaker 1: if you put Steven Pinker and, um, Chomsky in the 337 00:20:40,080 --> 00:20:43,639 Speaker 1: same room, that they would be like, you know, 338 00:20:43,640 --> 00:20:47,320 Speaker 1: this all works together. I think they're tracing 339 00:20:47,320 --> 00:20:49,919 Speaker 1: it back to the origin point, the moment where it 340 00:20:49,960 --> 00:20:53,520 Speaker 1: began, either it began to evolve, or it 341 00:20:53,640 --> 00:20:57,439 Speaker 1: just appeared as a result of an incredibly sophisticated machine 342 00:20:57,440 --> 00:21:00,760 Speaker 1: that just started performing another function as a result of 343 00:21:00,800 --> 00:21:04,159 Speaker 1: its sophistication. And it's all kind of conjecture anyway. But 344 00:21:04,200 --> 00:21:06,560 Speaker 1: there's still, I mean, there's still support. There's support for 345 00:21:06,600 --> 00:21:11,480 Speaker 1: different ones, like, um, brain plasticity, neural plasticity. The 346 00:21:11,520 --> 00:21:14,960 Speaker 1: fact that our brains can be restructured and reorganized 347 00:21:15,000 --> 00:21:18,000 Speaker 1: supports the idea that language evolved gradually, right, and it 348 00:21:18,080 --> 00:21:20,479 Speaker 1: just started to build and build and build. Possibly that's how 349 00:21:20,520 --> 00:21:24,120 Speaker 1: our brains became larger, right, chicken and the egg thing.
350 00:21:24,320 --> 00:21:27,000 Speaker 1: But people also say, like, if a large brain equals things 351 00:21:27,040 --> 00:21:30,359 Speaker 1: like speech, then why don't, like, whales and things like 352 00:21:30,400 --> 00:21:33,080 Speaker 1: that, with much larger brains, have things like speech? That's 353 00:21:33,080 --> 00:21:37,040 Speaker 1: another great argument too. Uh, and then mirror neurons, um, 354 00:21:37,160 --> 00:21:39,880 Speaker 1: kind of lend support to the idea that it's just, 355 00:21:40,520 --> 00:21:45,760 Speaker 1: it's just a spandrel of brain function, because toolmaking and, 356 00:21:45,880 --> 00:21:50,600 Speaker 1: um, and speech both use the same areas, right, and 357 00:21:50,640 --> 00:21:54,679 Speaker 1: then, um, toolmaking lights up when you 358 00:21:54,720 --> 00:21:57,280 Speaker 1: watch somebody use tools, and when you're using tools yourself, 359 00:21:57,440 --> 00:22:01,120 Speaker 1: in Broca's area. Interesting. Yeah, our friends the mirror neurons, 360 00:22:01,720 --> 00:22:05,919 Speaker 1: they're back. Have you got anything else? There's a lot 361 00:22:05,920 --> 00:22:10,240 Speaker 1: of scribbling over there. Oh yeah. So one of Chomsky's 362 00:22:10,280 --> 00:22:15,480 Speaker 1: big points is that, uh, grammar, or language, 363 00:22:15,480 --> 00:22:20,840 Speaker 1: is innate, which makes it biological, not cultural. Okay. Um, 364 00:22:21,000 --> 00:22:24,480 Speaker 1: it's universal grammar, which is, like, he always 365 00:22:24,480 --> 00:22:27,480 Speaker 1: says that if a Martian anthropologist came down and 366 00:22:27,560 --> 00:22:31,480 Speaker 1: studied all human languages, he would reasonably 367 00:22:31,520 --> 00:22:36,120 Speaker 1: conclude that all of that information is based on an 368 00:22:36,119 --> 00:22:41,040 Speaker 1: internal structure rather than culture.
Basically, um, and the 369 00:22:41,240 --> 00:22:45,000 Speaker 1: key to universal grammar, supposedly, is recursion, which is like 370 00:22:45,080 --> 00:22:47,280 Speaker 1: me saying, like, I'm gonna go to the store, the 371 00:22:47,320 --> 00:22:49,360 Speaker 1: one down the street, you know, the one that has 372 00:22:49,400 --> 00:22:51,399 Speaker 1: the really good hot dogs, I'll be back in a 373 00:22:51,440 --> 00:22:55,680 Speaker 1: little bit. It's taking, um, it's adding phrases within phrases. 374 00:22:56,160 --> 00:23:00,520 Speaker 1: There's no other, um, there's no other communication in 375 00:23:00,560 --> 00:23:04,200 Speaker 1: any animal species that would include this, which makes it human. 376 00:23:04,440 --> 00:23:08,879 Speaker 1: And supposedly all human languages contain recursion. Except there's a 377 00:23:08,960 --> 00:23:14,600 Speaker 1: challenger now called Pirahã. It's Amazonian. There's like five hundred people 378 00:23:14,640 --> 00:23:17,760 Speaker 1: who speak it. Really? Five hundred and one, including the 379 00:23:17,840 --> 00:23:23,560 Speaker 1: one, um, MIT-trained linguist who studied it 380 00:23:23,600 --> 00:23:26,320 Speaker 1: for thirty years, the only outsider who knows it, 381 00:23:26,560 --> 00:23:30,120 Speaker 1: who's now saying this thing: they don't have recursion, so 382 00:23:30,680 --> 00:23:34,880 Speaker 1: universal grammar is wrong, therefore Chomsky's whole thing is wrong. There's 383 00:23:34,920 --> 00:23:38,560 Speaker 1: a pretty cool article in the Chronicle of Higher Education that's 384 00:23:38,560 --> 00:23:43,720 Speaker 1: worth reading called Angry Words. You know, that's a big 385 00:23:43,720 --> 00:23:46,400 Speaker 1: deal right now, disappearing languages. And I don't think, 386 00:23:46,440 --> 00:23:48,360 Speaker 1: I think these people are, as far as I got 387 00:23:48,400 --> 00:23:52,280 Speaker 1: from the article, they seem like they are fine.
There's 388 00:23:52,400 --> 00:23:54,800 Speaker 1: not that many of them, but they're not being encroached 389 00:23:54,880 --> 00:23:57,440 Speaker 1: upon any further. I think they're protected. Interesting. They're just 390 00:23:57,520 --> 00:24:01,159 Speaker 1: kind of living out their existence and doing their thing. Actually, no, 391 00:24:01,200 --> 00:24:03,800 Speaker 1: Chomsky is in the bushes behind them with a blowgun. 392 00:24:06,080 --> 00:24:09,080 Speaker 1: Well, that's all I got. That's good stuff. This 393 00:24:09,160 --> 00:24:11,480 Speaker 1: could have been like ten hours long. Yeah, easy. The 394 00:24:11,560 --> 00:24:14,160 Speaker 1: linguists out there, they're like, oh, what a broad overview. 395 00:24:14,240 --> 00:24:18,280 Speaker 1: That's exactly what this is. Like, um, if you want 396 00:24:18,320 --> 00:24:20,160 Speaker 1: to learn more, and you want to see this picture 397 00:24:20,160 --> 00:24:22,400 Speaker 1: of Ringo Starr dressed as a caveman, you should read 398 00:24:22,440 --> 00:24:25,680 Speaker 1: the article written by one Charles W. Bryant called How 399 00:24:25,720 --> 00:24:29,520 Speaker 1: Did Language Evolve? Uh, type that into the search bar 400 00:24:29,600 --> 00:24:32,200 Speaker 1: at how stuff works dot com and it will bring 401 00:24:32,240 --> 00:24:34,720 Speaker 1: it up. Uh, and I said search bar. So it's 402 00:24:34,720 --> 00:24:40,240 Speaker 1: time for listener mail. Josh, I'm gonna call this We 403 00:24:40,320 --> 00:24:45,960 Speaker 1: Saved Another Life, Apparently, Again. Hey guys and Jerry, and 404 00:24:46,000 --> 00:24:49,080 Speaker 1: we'll say hey Matt, since Matt's here today. I 405 00:24:49,520 --> 00:24:50,919 Speaker 1: thought I would tell you a little about how 406 00:24:50,960 --> 00:24:53,680 Speaker 1: your podcast has quite literally saved and changed my life.
407 00:24:54,000 --> 00:24:57,760 Speaker 1: I'm seventeen and living in the small town of Galesburg, Illinois, 408 00:24:58,119 --> 00:25:01,840 Speaker 1: but consider myself a citizen of the world. Uh, from 409 00:25:01,920 --> 00:25:06,520 Speaker 1: when I was six months young, ten years after Ghostbusters, 410 00:25:07,200 --> 00:25:10,119 Speaker 1: my family, exactly, my family and I have traveled back 411 00:25:10,160 --> 00:25:13,520 Speaker 1: and forth from Illinois to Barcelona, Spain, every two years. 412 00:25:14,240 --> 00:25:16,080 Speaker 1: This last time back in the US, I fell in 413 00:25:16,119 --> 00:25:19,280 Speaker 1: love with your podcast and have listened almost religiously every 414 00:25:19,280 --> 00:25:22,200 Speaker 1: morning for almost three years. Then one day my faith 415 00:25:22,240 --> 00:25:25,439 Speaker 1: was solidified. While listening and walking my dog Cheapie, 416 00:25:26,320 --> 00:25:30,359 Speaker 1: I crossed the street. I should probably CYA here. 417 00:25:30,520 --> 00:25:34,320 Speaker 1: This guy's really hitting all the points here. Um, walking 418 00:25:34,320 --> 00:25:37,000 Speaker 1: with headphones is a dangerous thing. Uh, if one of 419 00:25:37,040 --> 00:25:39,280 Speaker 1: you died because of a headphone-walking incident, I would 420 00:25:39,280 --> 00:25:43,159 Speaker 1: never forgive myself. Back to the story. A car was, 421 00:25:43,240 --> 00:25:45,800 Speaker 1: unbeknownst to me, hurtling down the street at me. I 422 00:25:45,840 --> 00:25:51,240 Speaker 1: started crossing when all of a sudden, buzz, How Flies 423 00:25:51,280 --> 00:25:55,680 Speaker 1: Work had just begun. Remember the loud buzz? It really 424 00:25:55,680 --> 00:25:57,920 Speaker 1: flipped me out, and I jumped backwards just as the 425 00:25:57,960 --> 00:26:00,359 Speaker 1: car flew by me. Oh no. It was, um, it 426 00:26:00,520 --> 00:26:04,480 Speaker 1: was from The Fly, remember? Help me!
And he said 427 00:26:04,520 --> 00:26:06,840 Speaker 1: that scared him enough to jump back, and he didn't even 428 00:26:06,840 --> 00:26:10,560 Speaker 1: see the car. So we saved this dude and Cheapie, 429 00:26:11,359 --> 00:26:14,479 Speaker 1: which is pretty exciting. That was a while ago, but recently, 430 00:26:14,520 --> 00:26:16,520 Speaker 1: actually two days ago, you have changed the course of 431 00:26:16,520 --> 00:26:19,199 Speaker 1: the rest of my life. After listening to the Sauna 432 00:26:19,760 --> 00:26:23,440 Speaker 1: and Viking podcasts, I have fallen in love with Scandinavia, 433 00:26:23,640 --> 00:26:25,800 Speaker 1: so much so that I'm going to be a foreign 434 00:26:25,800 --> 00:26:29,840 Speaker 1: exchange student in Finland for the entirety of next year. 435 00:26:30,119 --> 00:26:34,240 Speaker 1: That's awesome. So we inspired, uh, Noah to go to 436 00:26:34,280 --> 00:26:38,359 Speaker 1: Finland because of the Sauna and Viking casts. Enjoyed them, 437 00:26:38,440 --> 00:26:42,040 Speaker 1: really. So much love and many thanks. And that's from 438 00:26:42,080 --> 00:26:48,480 Speaker 1: Noah F. F? Nice. Noah Finster Finkelstein, that's his name 439 00:26:48,480 --> 00:26:51,080 Speaker 1: now. It is, um. Thanks for that, Noah. We're glad 440 00:26:51,119 --> 00:26:52,840 Speaker 1: you're alive. We hope you have a very good time 441 00:26:52,840 --> 00:26:57,320 Speaker 1: in Finland. Um, and, uh, we're glad Cheapie's doing 442 00:26:57,320 --> 00:27:01,520 Speaker 1: well too. I bet Noah never comes back. Like, 443 00:27:01,560 --> 00:27:03,640 Speaker 1: in a sinister way? Or no, I bet he loves 444 00:27:03,640 --> 00:27:07,480 Speaker 1: it so much that he's like, I'm here. Nice. Until 445 00:27:07,520 --> 00:27:12,879 Speaker 1: winter hits. Exactly. Um.
We always love hearing 446 00:27:12,920 --> 00:27:15,960 Speaker 1: how we've saved your life or enriched your life or 447 00:27:16,000 --> 00:27:18,320 Speaker 1: something like that. Um, we want to hear about it. 448 00:27:18,359 --> 00:27:20,840 Speaker 1: You can tweet to us at SYSK Podcast. 449 00:27:21,119 --> 00:27:24,080 Speaker 1: You can join us on Facebook, Facebook dot com slash 450 00:27:24,119 --> 00:27:27,840 Speaker 1: Stuff You Should Know, and you can email us directly, 451 00:27:28,160 --> 00:27:31,240 Speaker 1: just between us, like five other people who are included 452 00:27:31,280 --> 00:27:41,520 Speaker 1: on the email, at Stuff Podcast at Discovery dot com. 453 00:27:41,600 --> 00:27:44,159 Speaker 1: Be sure to check out our new video podcast, Stuff 454 00:27:44,200 --> 00:27:46,840 Speaker 1: from the Future. Join the How Stuff Works staff as we 455 00:27:46,880 --> 00:27:53,320 Speaker 1: explore the most promising and perplexing possibilities of tomorrow. Brought 456 00:27:53,359 --> 00:27:56,560 Speaker 1: to you by the reinvented two thousand twelve Camry. It's ready. 457 00:27:56,760 --> 00:27:57,200 Speaker 1: Are you?