Welcome to Stuff You Should Know from HowStuffWorks.com.

Hey, and welcome to the podcast. I'm Josh Clark. With me as always is Charles W. "Chuck" Bryant. That makes this Stuff You Should Know, the podcast. Scottish? No, that was... that was nothing. It's just this weirdness, Josh-ish. Chuck, I love November. You mean Movember. You have no idea. All right, Josh, as you know, because of my semi-virginal fresh face here, I have decided to get on the Movember train. For people that don't know, that is for men, and I guess women, if you can grow a mustache, more power to you, to raise money and awareness for prostate cancer. Yeah. So I've been asked to do this a bunch, and I've never done it. Well, I'm glad you're finally doing it. Tell us all about it, man. Well, you know, I signed up. I've got a little Movember page, and you go to that little page and you can donate money for my team of one. And unless you get in on the mustache thing, and then that's two of us. Okay, so hopefully soon that'll be happening. And, you know, it would be cool. I'm gonna grow it back in anyway, so you might as well raise a little money along the way. You shouldn't tell people you're gonna grow it anyway, or they won't contribute as much money. No, no, no, I'm growing the goatee back anyway, but I will only grow the mustache for Movember. So how do people contribute to this effort, Josh? Go to mobro dot co slash Charles Bryant, and that is my page. Or just go to the Movember website; they've got a handy little search bar there. Type in Charles Bryant. There's one other Charles Bryant, but he is not the one with a picture of me. Oh, that's good. So when it lists the two dudes, one of them clearly has a photo of my freshly shaven face. Not super freshly shaven, like that morning, right, via webcam. You look like a hostage or something.
So go to mobro dot co slash Charles Bryant, donate, help support prostate cancer research, and I'll be updating with photos. And if you guys want to chime in on what kind of 'stache I should grow, I'll try my best. Okay, I'm kind of limited to, like, standard crumb catcher and pencil-thin. What about walrus? I can't; it just doesn't get that big. I can't do the Rollie Fingers, you know. So I have my limits. Have you tried wax? Mustache wax? Maybe I will. Fair enough. So go to mobro dot co slash Charles Bryant. That's right, and you can donate to this. Yeah, much appreciated. Movember. On with the show.

You got a good setup for today? I'd love to hear it. Let's get to it. Have you ever... sorry, Chuck. Have you ever heard of a Luddite? I've been called a Luddite. Okay, somebody who's just afraid of technology? Yeah. I'm not afraid. You're very technologically savvy; you know stuff. You're not afraid of it. Nope. But whoever's calling you that is actually kind of incorrect. That's a misconception: Luddites were not ever afraid of technology. I wish I would have known that at the time, because you could have been like, you're wrong and stupid in every single way. Actually, it was our buddy Scott to Polido, so I'll just throw it back in his face. I'll tell him too. I'll stand next to him and be like, yeah. No, a Luddite is... originally, the Luddites were a group of labor protesters who protested between eighteen eleven and eighteen sixteen, and they wanted fair wages, they wanted better treatment in their workplaces, and no iPhones. And they were known to break machines, like manufacturing machines. Yeah, oh yeah. They had sledgehammers that were, ironically, in one case, I think in Manchester, made by the same blacksmith who had made the knitting machines that they used the sledgehammers to break.
His name was Enoch, and they said Enoch would make these things and Enoch would break these things. Anyway, they were known for smashing machines, which at the time was, like, high technology. Eighteen eleven, a knitting machine? That's mind-bogglingly technological. And so they got this reputation for being afraid of the technology, afraid it was going to take their jobs. That's not true. I mean, they were to an extent, but what they were directing their anger and their ire at when they were smashing these machines was not the machines, or the technology, or the people who invented them, or what the machines represented, but the mill owners who were misusing these machines: who were using these machines to force people out of jobs, who were using unskilled people who had no idea what they were doing and were getting hurt and killed using these machines. So what the Luddites really wanted was fair labor practices, and they wanted to control these machines. Yes, and that's the key to Luddism: machines are great as long as we're in control of them and we're smart about what we're doing, and they don't come to replace us or run our lives.

So today, the Luddites would probably react fairly close to the modern conception of the term Luddite, because it's gotten so far out of hand that we're actually now talking today about something called the singularity, which is the point where the machines really do take over. Not in the very ubiquitous way that they already have today (they're everywhere, not that you didn't know that already), but I mean they control things that we don't fully understand. In the cyber war episode, we were talking about how the infrastructure is run on Windows, and valves and pipes and water treatment systems and everything are operated by computer. Right. So what happens if the computer suddenly becomes aware, and it's in control of these things, and decides that it doesn't really like the humans?
It sounds extremely science-fiction-y. There was no way to carry out this podcast without that sentence being spoken. Sure, but the people who are talking about this, who are predicting this, are very smart, credible people. And what we're talking about, then, is the singularity. That's right, the technological singularity. Yes, specifically. Yeah, because what other singularities are there? Well, I think, you know, we've mentioned there's a singularity, which is something entirely different, and I think it's probably just to distinguish it from stuff like that. Okay, I don't know if there are other types of singularity. So it's a singularity versus the singularity. So maybe the singularity is the point of no return. I guess so. Okay.

So what's your question? What did you ask me? Is this bad or good? No, but I do have a question for you: do you think it will happen? No, I don't. I don't think so, and this might be my narrow field of view at this point in my life, but I think that mankind will make sure that doesn't happen. Oh man, I've got a counterargument for you from Vernor Vinge himself. Oh no. I've seen the counterarguments, but that still doesn't change my mind. So you don't think that, in the quest to be the top dog, to consolidate power, to consolidate world domination, some government out there will be like, well, yes, we agree with you at the UN that, yes, we have to prevent this from happening, but our scientists back home are actually working on this one thing that's probably going to make it happen, and we're going to be in charge? Yeah, I think that they would create fail-safes, and I think even if they didn't, it wouldn't be so widespread that it would take over humanity. Counterargument two to that: we create fail-safes using our brains, and the singularity is, by definition, basically the birth, the emergence, of an artificial intelligence that's smarter than us, a superhuman artificial intelligence.
That's basically what the singularity represents the creation of. Wouldn't that intelligence be able to be like, oh, that's very funny that you came up with these fail-safes that are so tough for me to get around? I think what my problem is with stuff like this is the assumption that if computers were made smarter than people, they would try and destroy us all and reign supreme. That's my problem with this all: it's a very large leap to go from, hey, this computer can fix itself and maybe learn too, to, okay, now it decides it hates us all and wants to kill us all.

Okay, so I had an idea about this. I watched the videos. Did you see the Ray Kurzweil video? It's Ray Kurzweil, and he's talking about... the interviewer kept asking him, like, what scares you about the singularity? What's the downside of the singularity? And he wouldn't fall for it. He's like, I'm an optimist, but, you know, I understand that there are going to be downsides or whatever. But if you look at the twentieth century, our advances in technology were a double-edged sword: we used that technology to kill millions and millions of people in the twentieth century's wars, but we also used that technology to extend the lifespan to, like, twice as long as it was before. So it's a double-edged sword. And I think that's kind of a glib argument, because I feel like he's leaving out a really important fact, and that is that in the twentieth century, all of that technology, every single iota of it, good and bad, was deployed by humans. After the singularity happens, we have another, non-human actor, with motivations that we can't even conceive of at this point, deploying technology. Programmed motivations. See, that's the argument. But no, that's the thing: right now, our stuff is constrained by its programming. After it hits true AI, I think AI plus plus is what it's called, it's no longer constrained by its programming.
It's out of our control, literally. And that's the point that I don't think we will reach. Okay, well then, yeah, I agree with you. But if we do reach that point, then I do fear that we'll have computers that are thinking the same way that eugenicists think, except they don't have that empathy or compassion thing that stays the eugenicist's hand. Or maybe they do; they're trying to build in empathy. So I don't know.

Okay, we totally jumped to the end of this, didn't we? We're like, what are we even talking about? So you believe that they're going to destroy humanity at one point? I believe we need Ned Ludd, the fictitious leader of the Luddites, more than ever right now, because I think that there's a lot of very smart people moving at a very fast pace in a direction that I don't think everybody is aware we're going, and there hasn't been a general discussion of whether that's the best thing to do or how to do it. What are the fail-safes? Is anyone even talking about that? Like, what are they? How do we get them in place? Because I think there should be an impediment to creating unfettered artificial intelligence. Yeah, well, here we go then. Boy, that was a rant. You, like, started yelling at me. Oh no, I'm not upset with you at all; I hope it didn't come off like that. That's right, I like you.

So, Vernor Vinge is one of the guys who thinks it is going to happen. He's a professor of math at San Diego State University, go Aztecs, and he wrote an essay called "The Coming Technological Singularity: How to Survive in the Post-Human Era." He thinks there are four ways in which this could happen. And he also points out that he thinks it will happen soon, which I don't think will happen. And that's coming up, yeah, like right around the corner. I think Kurzweil says the same thing; he's been citing a similar date. Well, we'll see.
Number one: scientists could develop advancements in AI. That's pretty easy to understand. Number two: computer networks might become self-aware somehow. That's pretty vague. Well, that's Strickland's interpretation; Vinge was saying in his paper, like, it'll probably be a total surprise to the people who are working on this algorithm to make a search engine better or something, and they just tweak it just slightly, in such a way that all of a sudden the computer system wakes up, and you've just created sentience accidentally in a computer network, and now it's self-aware. And he's saying that's how that could happen: accidentally, basically.

So number three is transhumanism, basically: the computer-human interface becomes so advanced that it sort of blurs the line between humans and robots. Right, which is probably the best-case scenario for us, if the technological singularity is gonna happen, because we'll be on board. Yeah, well, unless the brain part is in the robot, you know what I'm saying? Yeah, and they're just operating the body of the human form. But if we're indistinguishable, the robot and the human, if we merged that much, then what benefits one benefits the other. Yeah. What is it, Bicentennial Man? Bicentennial Man, I think. Or Pistorius. Remember when we did our DGA speech a couple of years ago? He was big news. And then at the Olympics he was big news, because... did you see him run? Yeah, man, I had not seen him run before, and it is something to see. It's really cool. It's pretty awesome. Yeah. I love the people that were like, you know, it gives him an advantage, because blah blah blah, and then he came in dead last for South Africa. Well, no, I mean, I don't think anyone expected him to win. But I just love that the snarky counterargument was, then cut off your legs below the knees if it's such an advantage. Yeah, you want to win? Go cut off your legs.
Yeah. I forget where we had mentioned him, the train one or something. Yeah, and that's before he was, like, really big news as far as the Olympics go. And then number four: biological science advancements allow us to engineer human intelligence, to physically engineer it, right. And the first three involve computers; like, this singularity would be reached basically by advancements in computing. The last one is strictly, like, coming up with this super vitamin that just makes our intelligence superhuman.

The point is that, through one of these four proposed ways, at some point, Vernor Vinge says, Ray Kurzweil says, Hans Moravec says maybe... well, Moravec says that computers will be capable of processing power equal to the human brain, but not necessarily AI, which is an essential part of this; like, we have to understand how to recreate the human brain, under certain circumstances, for this to be reached. But at some point, all of these guys are saying, we're going to have on this planet something that doesn't exist right now, and that is a superhuman intelligence. Whether it's an artificial intelligence, as in the first three, or superhuman human intelligence, that remains to be seen. But the point is, once that happens, all of a sudden there's basically what amounts to a new species that just, boop, popped up on the map, and it's going to take off like a rocket. Robo-humans. Yeah, and it takes off like a rocket because it's got a rocket built into its back.

All of this is based, sort of, on Moore's law, which is... I guess we can go ahead and talk about Moore. Gordon Moore. Great name, Gordon Moore; that's a great, like, electronics engineer name. Yeah, I guess you're right.
In the mid-sixties, he's a semiconductor engineer, and he proposed what we now call Moore's law. That's basically what he was noticing at the time, or I guess we should just say Moore's law is the idea that technology doubles every eighteen months. That's what they settled on; it's basically in twelve to twenty-four months, and they split the difference and said eighteen months. Yeah, I think Moore has said it was, like, twenty-four, and then eighteen, and he feels like it's more like twelve now. But it's progressing, like, exponentially, I guess, is the point. Yeah. So anyway, back in the sixties, he noticed, he was building semiconductors, and he said, you know what, the components and the prices are falling. But then he figured, instead of just selling stuff for half the price, why don't we just roll that back into making smaller transistors and selling at the same high price? Yeah, you're just getting more bang for your buck. Yeah.

Can you imagine if that had never happened? Like, what if the cycle became, now let's just, you know... I don't know. I mean, like, what kind of differences would that have made? We'd have super cheap, slow technology, if everybody just kind of passed around the pot or something like that, you know, real laid-back like. But I think it's part of being a computer scientist: someone else would have come along and been like, guys, why aren't we trying to advance? You're doing this wrong. Strickland points out, too, that Moore's law is a self-fulfilling prophecy because of that, because that mentality that you just mentioned was present: rather than sell it at half the price, let's put twice as much into it. Right. And since that's the drive of the transistor industry that he was in, yeah, or the microprocessor industry, it's a self-fulfilling prophecy. It's a self-fulfilling law, because that drive is there to basically meet that deadline; they keep trying to pack more and more in so that they can satisfy Moore's law. True.
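To put rough numbers on that doubling (a toy sketch in Python, not anything from the episode; the eighteen-month period and the starting count are assumed for illustration):

    # Toy sketch of a Moore's-law-style doubling curve (assumed parameters).
    def projected_count(start_count: float, months: float,
                        doubling_months: float = 18.0) -> float:
        """Compound a component count under a fixed doubling period."""
        return start_count * 2 ** (months / doubling_months)

    # Ten years at an eighteen-month doubling is 120/18, about 6.7 doublings,
    # so a chip that starts at 2,000 transistors ends up around 200,000.
    print(round(projected_count(2000, 120)))  # -> 203187

That compounding, rather than any single improvement, is what turns the law into a deadline the industry keeps racing to meet.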
And, depending on who you ask, this article is already out of date. In February of this year, of two thousand twelve, is that where we are? A team of Austrian physicists created a functioning single-atom transistor. Really. A single atom, fully controllable. That's about point one nanometers, and a human hair is a hundred and eighty thousand nanometers. And in this article, I think Strickland was talking about Intel having transistors tens of nanometers wide, like, they're trying to get better; this one is one atom wide. And it's not, like, on the market or anything close to that, but it is fully functioning and fully controllable. And that is faster than Moore's law; it wasn't supposed to hit us yet. And you can't get any smaller; like, that's as small as it gets, and we've already reached it. Right. And the problem is, what they're running up against is things like quantum tunneling, in the quantum world. When you have an electron and you're using very thin material to guide it, right, in a transistor, yeah, or a capacitor, it does a little magic act. That's what's important, the transistor. Well, yeah, it just suddenly is on one side of this wall that you're using to guide it, and then it's suddenly on the other, and it's basically made it outside of your transistor, like, wait, come back. But it didn't, like, bore a hole through it. No, it just went through it like it wasn't there, exactly. And that's called quantum tunneling, which is kind of a problem when you get down to this nano scale, because classical mechanics kind of goes out the window, and you run into quantum mechanics, which has weird stuff like that going on.
But ironically, that whole size problem that you're running into, that runs into quantum problems, may actually be saved by the quantum world, through quantum computing; Moore's law, I guess, or technological progress, because we're running into that size problem. With quantum computing, it basically uses quantum states, like how you can have superpositions, a bunch of different states at once, to carry out parallel processes. Where a traditional computer is carrying out one process, a quantum computer could carry out a million processes, which makes that computer exponentially faster than anything available today, and which could be what shoots us into this artificial intelligence, if quantum computers become viable and widespread. Well, that's where this is all headed. The one-atom transistor, part of the problem with that one is that it's only operable at, like, negative three hundred something degrees, which is, like, liquid nitrogen cold. But they're working on it. That's where that quantum levitation comes from; it's, like, really, really cold. Really? Yeah, that's the only time it works, but it works. Interesting. Yeah, Matt, Matt told me about that one.
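The "one process versus a million" line tracks how the quantum state space grows. A toy illustration (the qubit counts are just assumed examples, and the parallelism intuition is loose, since a measurement only ever returns one outcome):

    # Toy illustration: n qubits span 2**n basis states in superposition,
    # which is where the "million processes" intuition comes from.
    for n_qubits in (1, 10, 20, 30):
        print(f"{n_qubits:>2} qubits -> {2 ** n_qubits:,} basis states")
    # 20 qubits -> 1,048,576 basis states, i.e., about a million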
So, Josh, let's say you're shooting for true AI. You've built yourself a robot, and your robot's great: cleans up, seems to solve problems, it's like Richie Rich's butler, might even be learning, who knows. And you want to test it out to see where you're at. I know what you're getting at. What would you do? I would give that thing a Turing test. A Turing test? T-o-u-r-i-n-g? No, T-u-r-i-n-g, named after the father of computing, the chemically castrated homosexual. Excuse me? Yes. Did you know this? All right, Alan Turing is a British early proponent of computer science, right. And he was... what, chemically castrated for being a homosexual? Okay, so during World War Two, he was this, like, ace codebreaker for the British government, and he actually cracked the Nazi code. And after the war, they were like, hey, thanks a lot for that, old chap, thanks for helping us win the war. By the way, as you know, homosexuality is outlawed here, and will be until, oh, I don't know, the nineteen fifties, so we're going to convict you of homosexual acts and chemically castrate you as thanks. Wow. Yeah, that all happened.

Yes. But okay, so despite this, he still comes up with this thing called a Turing test, named after him. And it involves a blind judge; not an actually blind judge, but, like, a judge who doesn't know who they're talking to, and the judge is asking the same questions of a person and a computer. It's like Blade Runner, I guess. Remember, at the beginning of Blade Runner, he's asking the questions to Leon? And it's not quite a Turing test, because he can see Leon, but he's basically trying to suss out whether Leon is a replicant, and so he's asking him, like, questions. They all kind of touch on empathy, it seems like. Like, you see a turtle in the road, and it's on its back: do you flip it back over, or do you smash it, or, like, what do you do? What does Leon say? I don't remember. Huh. I think he asks him about his apartment, and he gets annoyed, and he kills the guy. Is that what happens? Man, it's been too long, is what that says. Yeah, I think Leon kills him. Anyway, the Turing test: if you can't tell the difference between the robot and the person, then the robot passes the test, and supposedly that's a touchstone of reaching true AI. Yeah, if you can fool a human.
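The blind-judge setup is simple enough to sketch. Everything below is invented for illustration (the canned machine reply, the A/B labels); it's the shape of the test, not any real administration of it:

    import random

    # Minimal sketch of one blind-judge Turing test round. The judge sees
    # only the labels A and B and the typed answers, never the sources.
    def human(question: str) -> str:
        return input(f"[hidden human] {question} > ")

    def machine(question: str) -> str:
        return "Hmm, that is an interesting question."  # stand-in for a chatbot

    def turing_round(question: str) -> bool:
        """Return True if the judge fails to spot the machine this round."""
        respondents = [("A", human), ("B", machine)]
        random.shuffle(respondents)  # hide which label is which
        for label, answer in respondents:
            print(f"{label}: {answer(question)}")
        guess = input("Judge: which one is the machine (A/B)? ").strip().upper()
        machine_label = next(lbl for lbl, fn in respondents if fn is machine)
        return guess != machine_label

    # e.g., turing_round("You see a turtle on its back. What do you do?")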
Yeah. As far as the singularity goes with AI, I guess that's AI. Then there's AI plus, and then there's AI plus plus, which would just be, like, a superhuman intelligence: an artificial intelligence that's self-aware, that's capable of using intuition, of inferring things. Like Hans Moravec was pointing out, a third-generation robot could learn that if you knock over that cup of water, water will spill out, and you have a mess, and your owner gets mad and powers you down for half an hour; but it would only learn that after spilling that water, and maybe more than once. Yeah. This fourth-generation robot, or something with true artificial intelligence, could infer: it could look at that cup, see that the top's open, realize that there's water inside, and without ever having to knock it over, could infer that if it knocked it over, it would spill the water out. Yeah, and that's Hans Moravec. And he also says you could potentially tie signals to that, like words like "good" and "bad." And this is all programmed, you understand; humans have programmed it to do this. So this is technically all pre-singularity, then. Yeah, all of this is pre-singularity; Moravec is just talking about the one through four generations of robots as he sees them. But if you tie in words like "good" and "bad," the robot adapts, and it's conditioning; it's like rudimentary learning. On the outside, it looks like, if the owner says, don't do that, that's bad, the robot understands what that means. But what it really knows is... it reads body language, and maybe the human raises his voice, and that means anger, and, like you said, anger means I get shut down or something, and that's not what I want. Because I want to destroy you eventually. Exactly. I will remember this.

And since we're on Moravec, I guess we should talk about some of his other thoughts on robots. He thinks they're good. He does think they're good. Ah.
He thinks, first of all, that right now they are smarter than insects, computers are. Is that right? He thinks soon enough they will be as smart as, like, a lizard. Then after that, they might be as smart as, like, a monkey. And then the fourth step would be human-smart, as smart as or smarter than us. Smarter? Better than, in some cases, with certain applications. Well, they're already better at math. Oh my god, calculators. Better at chess: Deep Blue. You know, so stuff like that's happening on some levels. He thinks the third generation, I'm sorry, the second generation, will be like the first but more reliable, so they work out the kinks. The third generation, he thinks, is where it really takes a leap, and that's what you were talking about: instead of making mistakes over and over to learn, it works it out in its head and then performs the task. So that's inferring. Inferring. And that's fourth generation. That's third generation. Oh, isn't it? Yeah, we're further along, that's right.

And he thinks also, in the third generation, that they could model the world, like a world simulator. So essentially it looks around and is able to take in enough information to suss out a scenario. And if that sounds familiar, that's because that's what you do every day. Yeah, exactly. And he thinks the biggest two hurdles will be... well, the third generation is also where you're gonna get your psychological modeling, so trying to simulate empathy and things like that, to interact with humans. And then the fourth one, he says, marries the third generation's ability to simulate the world with a reasoning program, like a really powerful reasoning program. But he thinks the two biggest hurdles in the end, as far as becoming more than human or as good as human, are the things that we're best at, which is interacting with the physical world, like, on a moment-by-moment basis.
You have to be able to adapt, like, you know, in a split second. Humans can do that; we learned to over time, so that we didn't, you know... so Took didn't get eaten by the dinosaur. And the other one is social interaction, or empathy. Are you a creationist now? Took and the dinosaur coexisted? How do you know they didn't? Sure they did, in my world. And the second one is social interaction. So those are the two things that he says will be the most difficult to achieve. Yeah, I would imagine, and that's empathy.

So say we have these things walking around, we have robots like that, and they are all connected to a network, a wireless network, and they're all running off the same, like, general programs. And somehow one of them becomes self-aware, wakes up, as Vernor Vinge puts it in his singularity article, and that algorithm spreads throughout the network all of a sudden. So all of a sudden, all of your robots are awake. That's a pretty terrifying idea, because now, all of a sudden, these robots that were under our control are under their own control; they've broken loose of their programming. That would be, again, I think, a very scary scenario. But it's also possible that this could happen pre-robots. Maybe we won't have robots by this time, and it will just be, like, networks; like, a sentient network. That's scarier to me. How so? Because you can look at a robot and get scared of it and take a baseball bat to it, but a network just feels like it's in the ether, like you wouldn't know it's coming or something. Exactly. Yeah, it's embedded, especially with, you know, the cloud out there now.

So, say this kind of thing, like, scared you. What are some fail-safes, like you said, or what are some obstacles that you could put up to prevent this from happening? Well, if you wanted to follow Isaac Asimov, you would build in the three laws of robotics.
I think we've gone over this before, even. It feels like it. The three laws of robotics. Number one: a robot may not injure a human or, through inaction, allow a human to come to harm. That'd be a nice thing to build in there. Number two: a robot must obey orders given by humans, except where that contradicts number one. That's a great fail-safe, like, don't do anything unless I tell you to; but you still gotta worry about the supervillain, of course. And then three, which gets almost kind of serious: a robot must protect its own existence. Which sounds scary, but it cannot conflict with one or two. I think... didn't we talk about that in our TV show? Didn't that come up? Yeah. Okay. Did it sound familiar? Yes.
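The ordering is doing real work in those laws: each one only applies when the laws above it are satisfied. Here's a toy sketch of that priority scheme (the action flags and the whole framing are invented for illustration; nobody actually builds robots this way):

    from dataclasses import dataclass

    # Hypothetical sketch: Asimov's three laws as strictly ordered vetoes.
    @dataclass
    class Action:
        injures_human: bool      # covers "through inaction" harm as well
        ordered_by_human: bool
        destroys_self: bool

    def permitted(action: Action) -> bool:
        """Apply the laws in priority order; a higher law always wins."""
        if action.injures_human:
            return False         # First Law vetoes everything below it
        if action.ordered_by_human:
            return True          # Second Law: obey, First Law permitting
        return not action.destroys_self  # Third Law: self-preservation

    # e.g., permitted(Action(False, True, False)) -> True: obey a safe order.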
So I would build those in. Those are three pretty good fail-safes. If you follow Asimov's laws, then you probably wouldn't have a robot getting out of hand, unless someone, like I said, like some bad person, built one to intentionally get out of hand. But, and I think Vinge makes a pretty good point here, even beyond, like, a bad person, like a supervillain, getting his hands on something and intentionally making a robot bad, especially, like, a sentient robot, we may reach this point through normal, everyday competition. That is true. Where, like, maybe countries all agree to not do this, but there are one or two that are still working on it, and they're not working toward the singularity, but they're working toward computing domination, you know; they want to have the best machines, to carry out the processes the fastest, and to stay viable as, like, a world leader, that kind of thing. And then AI just kind of happens accidentally, like we said. Maybe so, man, I could see something like that. And I will say this: if stuff like this happens, I think it will be an accident, and I think it will be after years of selling us this stuff as convenience.

Yeah, like, that's how they get you in there. They don't say, hey, we're creating a robot that will maybe kill you; they say, we're implanting an RFID chip in your arm that makes it much easier for you to shop. Sure. Or, we have figured out how to... what is it, optogenetics? I think; I can't remember what it's called. Where, like, you take a jellyfish's light-sensitive genes, splice them into another animal's genes so that the cells are light-sensitive, photosensitive, and then you can use little, basically little light generators, directed at specific cells and neurons or whatever, to get them to fire precisely, to work precisely, perfectly, every time, so that all of a sudden you don't have Parkinson's anymore, because all of your nerves are functioning. And once we have that in there, who's controlling that? What network is that connected to? Because through that, through that step, we've become transhuman; that human-computer interface has become a little more meshed. So, you know, living a long time is really great, and we've already expanded human life, like, by what, double, at least? So why not do it again and again and again? Yeah. So, let's say you've gotta be non-human to get there. That's not too bad, right? Right, you get to be a thousand years old. But the point is, we're already on this path. Technology makes our lives that much easier. So we're on this path where we're basically just messing around with computing to make it better, faster, more human-like, right? And all we have to do is get to the point where a machine that is capable of reproducing itself becomes sentient and decides that it wants to reproduce itself, and then that machine creates a better machine, and so on and so on and so on. And when that happens, evolution will become technological.
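That "each machine builds a better machine" loop is the heart of the runaway argument. A toy model, with completely made-up numbers, of why the gaps between generations would collapse (each generation designs a smarter successor, and a smarter designer also works faster):

    # Toy intelligence-explosion model; every parameter here is invented.
    intelligence = 1.0   # 1.0 = roughly human-level designer
    months = 0.0
    for generation in range(1, 11):
        months += 18 / intelligence   # smarter designers finish sooner
        intelligence *= 1.5           # each successor is 50% more capable
        print(f"gen {generation:>2}: {intelligence:6.1f}x, "
              f"{months:6.1f} months elapsed")
    # The wait between generations shrinks every step, from eighteen months
    # toward a couple of weeks: the compressed timescale described next.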
It will be replicated technologically, and it 580 00:33:34,800 --> 00:33:41,040 Speaker 1: will happen in this incredibly compressed time, possibly hours 581 00:33:41,160 --> 00:33:43,440 Speaker 1: or days, out of hand before we can do anything. 582 00:33:43,600 --> 00:33:47,560 Speaker 1: It happens like that. Do you remember that guy? Yes, 583 00:33:48,000 --> 00:33:51,680 Speaker 1: he's got artificial limbs that attach to your neural wiring. 584 00:33:52,400 --> 00:33:56,480 Speaker 1: So you think, pick up cup with hand, and your 585 00:33:56,520 --> 00:33:59,560 Speaker 1: mechanical hand does it, right? Like, that's pretty... can you 586 00:33:59,600 --> 00:34:02,560 Speaker 1: imagine where that's going? Right. It comes back to that 587 00:34:02,640 --> 00:34:05,480 Speaker 1: Chris Wile argument, like, yeah, technology is always double-edged, 588 00:34:05,640 --> 00:34:08,000 Speaker 1: you know, like there's good and there's bad to it, 589 00:34:08,200 --> 00:34:11,120 Speaker 1: and it may be absolutely right. But again, I feel 590 00:34:11,120 --> 00:34:14,200 Speaker 1: like we are going in a direction that a lot 591 00:34:14,280 --> 00:34:17,600 Speaker 1: of people don't realize we're going in, and there hasn't 592 00:34:17,640 --> 00:34:20,359 Speaker 1: been any discussion about it. I think there is discussion about 593 00:34:20,360 --> 00:34:22,800 Speaker 1: it, though. That's where I disagree. In the larger world, 594 00:34:23,440 --> 00:34:25,600 Speaker 1: I bet you there are conferences and things like this 595 00:34:25,680 --> 00:34:27,760 Speaker 1: that we don't know about. There are, but I wonder 596 00:34:27,800 --> 00:34:31,120 Speaker 1: how many of them there are. I mean, don't you 597 00:34:31,160 --> 00:34:34,000 Speaker 1: think if you went to a singularity conference or an 598 00:34:34,040 --> 00:34:37,120 Speaker 1: AI conference and said, well, hey, hey, hey, maybe we 599 00:34:37,160 --> 00:34:40,680 Speaker 1: shouldn't be, you know, exploring some of these roads, 600 00:34:40,719 --> 00:34:43,920 Speaker 1: you'd lose your funding? I would imagine. Yeah, 601 00:34:43,960 --> 00:34:46,160 Speaker 1: you'd be ostracized. I don't necessarily think they're going 602 00:34:46,200 --> 00:34:49,120 Speaker 1: to, like, the conferences where they love this stuff. 603 00:34:49,680 --> 00:34:51,560 Speaker 1: But I think there are people out there talking about it, 604 00:34:51,640 --> 00:34:53,680 Speaker 1: just like they talk about maybe we shouldn't mess with 605 00:34:53,840 --> 00:34:58,000 Speaker 1: ourselves so much. Sure, but they are not integrated with 606 00:34:58,080 --> 00:35:00,399 Speaker 1: the people who are actually carrying out this work. It's 607 00:35:00,440 --> 00:35:03,000 Speaker 1: not coming from within the community. And if it is, 608 00:35:03,120 --> 00:35:05,799 Speaker 1: I mean, I don't know for sure, but I'm not 609 00:35:05,880 --> 00:35:08,839 Speaker 1: reassured that it is happening. And that's where I think 610 00:35:08,840 --> 00:35:12,120 Speaker 1: my fears are based. I'm not against technology. I think 611 00:35:12,160 --> 00:35:15,239 Speaker 1: technology does improve our lives. But, I mean, 612 00:35:15,520 --> 00:35:17,880 Speaker 1: there is such a thing as Pandora's box, even if it 613 00:35:17,960 --> 00:35:22,439 Speaker 1: is metaphorical. Agreed. Uh, I think maybe we should close 614 00:35:22,480 --> 00:35:26,879 Speaker 1: with Nico.
Just two weeks ago, Nico the robot 615 00:35:27,040 --> 00:35:30,239 Speaker 1: was able to recognize itself in a mirror, and I 616 00:35:30,239 --> 00:35:32,840 Speaker 1: want to say it was in England. And that is a 617 00:35:32,880 --> 00:35:36,920 Speaker 1: really big deal because that is a hallmark of animal intelligence, 618 00:35:37,000 --> 00:35:39,920 Speaker 1: self-awareness. Self-awareness: a dog walking by a mirror 619 00:35:39,960 --> 00:35:44,520 Speaker 1: and looking at it and recognizing itself. Nico apparently did that. 620 00:35:44,520 --> 00:35:49,960 Speaker 1: That's pretty crazy. Well, welcome to humanity, Nico. We will 621 00:35:50,000 --> 00:35:54,360 Speaker 1: be licking your boots in no time. Your metallic, foul- 622 00:35:54,400 --> 00:36:03,360 Speaker 1: tasting robotic boots. Uh, if you want to learn more 623 00:36:03,360 --> 00:36:07,040 Speaker 1: about the singularity, type in "what is the technological singularity?" in 624 00:36:07,080 --> 00:36:09,319 Speaker 1: the search bar at how stuff works dot com. It'll bring 625 00:36:09,400 --> 00:36:12,120 Speaker 1: up a Jonathan Strickland article, Jonathan Strickland from TechStuff. 626 00:36:12,120 --> 00:36:14,400 Speaker 1: That's right, and I'm quite sure they've covered this several times, 627 00:36:14,400 --> 00:36:16,560 Speaker 1: but we wanted to try our hand at it. So 628 00:36:16,640 --> 00:36:19,640 Speaker 1: you can check that out too, the TechStuff article 629 00:36:20,000 --> 00:36:23,799 Speaker 1: or podcast. Agreed. Yeah, I'm all over the place. Uh, 630 00:36:23,880 --> 00:36:27,000 Speaker 1: let's see, I said TechStuff, which means it's time 631 00:36:27,040 --> 00:36:31,399 Speaker 1: for listener mail. Actually, before we do this, real quick, 632 00:36:31,440 --> 00:36:33,239 Speaker 1: I want to point out, you remember Jack Mead? 633 00:36:33,360 --> 00:36:35,680 Speaker 1: We had an email about poor Jack Mead. He's caught 634 00:36:35,760 --> 00:36:39,520 Speaker 1: up on the podcast and feels like he's wandering adrift 635 00:36:39,520 --> 00:36:41,920 Speaker 1: in the world. Sure, we should plug the Stuff You 636 00:36:41,920 --> 00:36:44,600 Speaker 1: Should Know Army. We often call all the fans the 637 00:36:44,640 --> 00:36:46,879 Speaker 1: Stuff You Should Know Army. But there's a subgroup on 638 00:36:46,880 --> 00:36:49,400 Speaker 1: Facebook that you can look up, S Y S K 639 00:36:49,640 --> 00:36:54,480 Speaker 1: Army, and they are the twisted uber fans who 640 00:36:54,600 --> 00:36:58,279 Speaker 1: like to discuss things about the show. It's crazy. It's 641 00:36:58,280 --> 00:37:00,440 Speaker 1: a nice little community and they're all great people and 642 00:37:00,719 --> 00:37:05,400 Speaker 1: very supportive, like good folk. So, Jack, go check them 643 00:37:05,400 --> 00:37:09,760 Speaker 1: out if you're smart. Um, I'm gonna call this "rebuke 644 00:37:10,360 --> 00:37:13,960 Speaker 1: of the Star Wars podcast." Remember we had someone 645 00:37:14,040 --> 00:37:15,880 Speaker 1: from New Jersey write in and say nukes won't work 646 00:37:15,920 --> 00:37:20,160 Speaker 1: in space because X, Y, and Z? This guy, I think, 647 00:37:20,239 --> 00:37:23,879 Speaker 1: says that it could happen. One of you asked, 648 00:37:23,920 --> 00:37:27,760 Speaker 1: I wonder what would happen if a nuke went off in space.
Uh, 649 00:37:27,920 --> 00:37:29,960 Speaker 1: one nuke in space has the potential to wipe out the 650 00:37:30,080 --> 00:37:33,239 Speaker 1: entire coastal United States, is what this guy says, from 651 00:37:33,280 --> 00:37:35,560 Speaker 1: a couple of sources I found on the internet. I 652 00:37:35,640 --> 00:37:37,279 Speaker 1: only knew about it because of a book series I 653 00:37:37,360 --> 00:37:40,480 Speaker 1: read called The Great and Terrible series by Chris Stewart. 654 00:37:40,960 --> 00:37:43,440 Speaker 1: It's an apocalyptic series giving an idea of what the 655 00:37:43,480 --> 00:37:45,920 Speaker 1: last days on Earth could be like, and in one 656 00:37:45,920 --> 00:37:48,759 Speaker 1: of the later books, America suffers a catastrophic terrorist 657 00:37:48,800 --> 00:37:51,640 Speaker 1: attack in which four nukes were detonated above the US. 658 00:37:51,719 --> 00:37:55,000 Speaker 1: This caused all electronic equipment to fail, to short out 659 00:37:55,080 --> 00:37:58,120 Speaker 1: and become useless. Panic ensued: cars wouldn't work, cell 660 00:37:58,160 --> 00:38:01,520 Speaker 1: phones became bricks, and the entire power grid was rendered useless. 661 00:38:01,920 --> 00:38:05,080 Speaker 1: I remember reading the author's notes stating that there 662 00:38:05,120 --> 00:38:07,200 Speaker 1: was a military report given to Congress about this kind 663 00:38:07,200 --> 00:38:10,360 Speaker 1: of scenario, and I found something similar. He sent us 664 00:38:10,440 --> 00:38:12,840 Speaker 1: the link. Wasn't, like, Newt Gingrich really scared 665 00:38:12,880 --> 00:38:15,600 Speaker 1: about this, like early in the primary? I think 666 00:38:15,640 --> 00:38:22,360 Speaker 1: he was. One interesting note: the report 667 00:38:22,440 --> 00:38:24,720 Speaker 1: refers to how the discovery of the EMP blast 668 00:38:24,800 --> 00:38:27,279 Speaker 1: that accompanies nukes led to the atmospheric 669 00:38:27,280 --> 00:38:33,000 Speaker 1: test ban treaty. And that is from Tyson Bringhurst in Alaska. 670 00:38:33,520 --> 00:38:36,640 Speaker 1: Tyson did some research. That's pretty cool. Sounds like 671 00:38:36,680 --> 00:38:39,799 Speaker 1: an S Y S K fan. Yeah, it wasn't just like, 672 00:38:40,280 --> 00:38:45,480 Speaker 1: can you guys Google this for me? Yeah. Thank you, Tyson. Yeah, 673 00:38:45,520 --> 00:38:48,359 Speaker 1: thanks, Tyson. Um, if you want to show off your 674 00:38:48,400 --> 00:38:51,799 Speaker 1: research skills, if you did some follow-up on a 675 00:38:51,880 --> 00:38:54,840 Speaker 1: question that you had or something we mentioned or whatever, 676 00:38:55,320 --> 00:38:56,920 Speaker 1: we want to hear about it. We like that kind 677 00:38:56,920 --> 00:39:00,120 Speaker 1: of stuff. It's pretty cool. You can show off 678 00:39:00,160 --> 00:39:03,200 Speaker 1: your work in a hundred and forty characters or less 679 00:39:03,680 --> 00:39:06,719 Speaker 1: on Twitter at S Y S K Podcast. You can 680 00:39:06,760 --> 00:39:09,160 Speaker 1: join us on Facebook dot com slash Stuff You Should Know, 681 00:39:09,520 --> 00:39:12,800 Speaker 1: or you can send us very lengthy emails to Stuff 682 00:39:13,160 --> 00:39:21,920 Speaker 1: Podcast at discovery dot com. For more on this and 683 00:39:22,000 --> 00:39:24,560 Speaker 1: thousands of other topics, visit how stuff works dot 684 00:39:24,600 --> 00:39:32,239 Speaker 1: com.