Welcome to TechStuff, a production of iHeartRadio's How Stuff Works.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. It's time for another TechStuff classic episode, where Lauren Vogelbaum and I sat down back in February twenty thirteen. This episode published February eleventh, and we talked about the singularity. The episode was titled "TechStuff Enters the Singularity." So sit back and enjoy this discussion about a somewhat controversial hypothesis of the future of humanity.

Today we wanted to talk about the future. The future. Yeah, really, we're talking about kind of a science fiction future. We're talking about the singularity. And longtime listeners to TechStuff, and I'm talking about folks who listened way back before we ever talked for thirty minutes, let alone an hour, may remember that we did an episode about how Ray Kurzweil works. Ray Kurzweil is a futurist, and one of the things he talks about extensively, particularly if you corner him at a cocktail party, is the singularity. So we wanted to talk about what the singularity is, to really dig down into this idea: why is it a big deal, and how realistic is this vision of the future? Yeah, because some people would argue with your concept of it being science fiction. They take it extremely seriously. Oh yeah, they say it's science fact. Science fact, science inevitability. Yeah. The term was actually coined by a mathematician, John von Neumann, in the nineteen fifties, but it was popularized by a science fiction writer. Yeah. There are also a lot of different concepts that are tied up together, and it all depends upon whom you ask what is meant by the singularity.
For instance, there are some people who, when they hear the term "the singularity," say: okay, that's a time when we get to the point where technological advances are coming so quickly that it's impossible to have a meaningful conversation about what the state of technology is, because it changes in seconds. So that's one version, but most of the versions that we're familiar with, the ones the futurists talk about, incorporate an idea of superhuman intelligence, or the intelligence explosion. Right, a kind of combination of human and technological development that just dovetails into this gorgeous, you know, Space Baby from two thousand one. That's an excellent way of putting it. The documentary two thousand one. I remember specifically when the space baby looked at Earth. Okay, that documentary example doesn't work at all. It usually does, but not this time. Not this time. Space babies are a poor example in this one instance. But metaphorically speaking, yes, you're right on track, because the intelligence explosion was a term introduced by someone known as Irving John Good, or, if you want to go with his birth name, Isadore Jacob Gudak. I can see why he changed it. Yeah. He actually worked for a while at Bletchley Park with another fellow who made sort of a name for himself in computer science, a fellow named Alan Turing. Oh, I guess I've heard of him. Yeah, Turing will come up in the discussion a little bit later, but for right now, about Irving John Good, just a quick anecdote that I thought was amusing. Good was working with Turing to try and help break German codes; I mean, that's what Bletchley Park was all about, right? So Good apparently one day drew the ire of Turing when he decided to take a little cat nap, because he was tired. And he was.
It was Good's philosophy that being tired meant he was not going to work at his best, so he might as well go ahead and nap. Exactly: take a nap, get refreshed, then tackle the problem again, and you're more likely to solve it. Whereas Turing was very much a workhorse, you know, the no-rest type. So Turing, when he discovered that Good had been napping, decided that Good was not so good, and he sort of treated him with disdain. He began to essentially not speak to Good. Good, meanwhile, began to think about the letters that were being used in Enigma codes to encode German messages, and he began to think: what if these letters are not completely random? What if the Germans are relying on some letters more frequently than others? So he began to look at the frequency of these letters being used. He made up a table and mathematically analyzed the frequency with which certain letters were used, and discovered that there was a bias, there was a pattern. Yeah. So he said, well, with this bias, that means we can start to narrow down the possibilities of these codes. And in fact, he was able to demonstrate that this was a way to help break German codes, and Turing, when he saw Good's work, said he could have sworn he had tried that, but clearly this showed that it worked.
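A minimal sketch, in Python, of the sort of letter-frequency tally Good ran; the ciphertext string here is invented purely for illustration, and the real codebreaking work was far more involved than a simple count:

from collections import Counter

# Invented ciphertext purely for illustration; not a real Enigma intercept.
ciphertext = "QWERTZUQIOASDFGHJKQPYXCVBNMLQQWEASDQ"

counts = Counter(ciphertext)
total = len(ciphertext)

# If the cipher letters were truly random, each of the 26 letters would show
# up about total/26 times. A large departure from that flat distribution is
# the kind of bias Good was looking for.
expected = total / 26
for letter, seen in counts.most_common(5):
    print(letter, seen, "observed vs", round(expected, 1), "expected")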
And then Good, at another point, apparently went to sleep one day when they had been working on a code that they just could not break. And while he was sleeping, he dreamed that perhaps, when the Germans were encoding this particular message, they used the letters in reverse of the way they were actually printed. So he tried that when he woke up, and it turned out he was right. And so then his argument was: Turing, I need to go to bed.

So yeah, the moral of the story here is that naps are good and no one should talk to you. Right, yeah, that's how I live my life. But yeah, so, Good's point. Anyway, he came up with this term, the intelligence explosion, and it was this sort of idea that we're going to reach a point where we are increasing either our own intelligence or some sort of artificial intelligence so far beyond what we are currently capable of understanding that life as we know it will change completely. And because it's going to go beyond what we know right now, there's no way to predict what our life will be like, because it is, by definition, out of our comprehension. Yes. As the Scots would say, it's beyond our ken. Are we going to be doing accents this episode? That was a terrible one. I actually regret doing it right now. I already knew I couldn't do Scottish, and yet there I went. Anyway, you're trailblazing again.

So, to kind of backtrack a bit before we really get into the whole singularity discussion (that was just a brief overview), a good foundation to start from is the concept of Moore's law. Originally, Gordon Moore, who, by the way, was a co-founder of a little company called Intel, observed back in nineteen sixty-five, in a paper (I'm going to paraphrase this, but it was called something like "Cramming More Components onto Integrated Circuits"; "cramming" was definitely one of the words used, and "circuits" probably was too), that over the course of, I think originally it was twelve months, but today we consider it two years...
Well, eighteen to twenty-four months, I think, is the official unofficial figure. Right, right. Yeah, that the number of discrete components on a square inch of silicon wafer would double, due to improvements in manufacturing and efficiency. So in effect, what this means to the layman is that our electronics, and particularly our computers, get twice as powerful every two years. So if you bought a computer one year and then bought another computer two years later, in theory the newer computer would be twice as powerful as the older one. This is exponential growth, and that's an important component, this idea of exponential growth. It goes without saying that if you continue on this path, if this continues indefinitely, then you quickly get to computers of almost unimaginable power just a decade out. Certainly. Although, I mean, I still don't really understand what a gigabyte means, because when I first started using computers we were not counting in that; I was still impressed by kilobytes at the time. Yeah. I remember the first time I got a hard drive, I think it was like a two-fifty-megabyte hard drive, and I thought, who needs that much space? And granted, that's space we're talking about, not even processing. Absolutely. So yeah, it's one of those things where the older you are, the more incredible today is, right? Because you start looking at computers and you think, I remember when these things came out and they were essentially the equivalent of a really good desktop calculator. Right. So Moore's law states that this advance will continue indefinitely until we hit some sort of fundamental obstacle that we just cannot engineer our way around.
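To put a rough number on that doubling, here's a quick back-of-the-envelope sketch in Python; the clean two-year doubling period comes from the discussion above, and the starting transistor count is just an arbitrary round figure (roughly an early-1970s chip) used for illustration:

def moores_law_projection(start_count, years, doubling_period=2):
    # One doubling per period, so after `years` we have doubled
    # years / doubling_period times over.
    return start_count * 2 ** (years / doubling_period)

# Arbitrary starting point: about 2,300 transistors.
for years in (2, 10, 20, 40):
    print(years, "years out:", int(moores_law_projection(2300, years)))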
And that's why it's kind of in contention right now, because people are saying, well, there's only so much physical space that you can fit things onto with silicone. There's a physical limitation to the material, and there's only so much you can do about it. So does Moore's law still apply if we're talking about other materials? Right. And how small can you get before you start to run into quantum effects that are impossible to work around? And then, do you change the geometry of a chip? Do you go three-dimensional instead of two-dimensional? Wouldn't that help? You know, there are a lot of engineers working on this, and frankly, pretty much every couple of years someone says, all right, this is the year Moore's law ends; it's over, it's gone, it's done with. Five years later, it's still going strong. And then in the sixth year, someone else says Moore's law is gonna end. It's a little bit of a self-fulfilling prophecy; I think a lot of companies attempt to keep it going. Sure, yeah. No one wants to be the one to say, guys, guess what, we can't keep up with Moore's law anymore. No one wants to do that. So it is a good motivator. Also, if I can footnote myself real quick: I'm pretty sure that I just pronounced silicon as "silicone," and I would like to state for the record that I know that those are two different substances. Okay, that's fair. Anyway, I was going to ask you about it, but by the time you were finished talking, I thought, let's just go. Yeah, that's cool. It's all right; if you knew how many times I have used that particular pronunciation, to hilarious results. So, moving on with this whole idea about Moore's law: the reason it plays into the singularity is that with these technological advances, you start to be able to achieve pretty incredible things, even within one generation of Moore's law, which is kind of a meaningless term.
But let's say you arbitrarily pick a date, and then two years from that date you look and see what's possible with the new technology. Getting to twice as much power, however you want to define it, doesn't necessarily mean that you've only doubled the number of things you can do with that power. You may have limitless things you can do. So with that idea, you're talking about being able to power through problems way faster than you did before. And there are lots of different ways of doing that. For example, grid computing. Grid computing is when you link computers together to work on a problem all at once. Now, this works really well with parallel problems, as we call them. These are problems where there are lots of potential solutions, and each computer is essentially working on one set of potential solutions. That way you have all these different computers working on it at the same time, and it reduces the overall time it takes to solve that parallel problem. So if you've ever heard of anything like Folding@home or the SETI project, where you could dedicate your computer's idle time, the idle processes, the processes that are not being used while you're surfing the web, or writing "How the Singularity Works," or, I don't know, building an architectural design in some sort of CAD application, anything that you're not using can be dedicated to one of these projects. It's the same sort of idea: you don't necessarily have to build a supercomputer to solve complex problems if you use a whole bunch of computers, a whole bunch of small ones. The Large Hadron Collider does this; although they use very nice, advanced computers, they do a lot of grid computing as well. So, just using those kinds of models, we see that we're able to do much more sophisticated things than we could otherwise. Certainly. Yes, networks, as it turns out, are pretty cool.
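Here's a toy sketch of that split-the-search-space idea in Python, using a few worker processes on one machine to stand in for the volunteer computers of a real grid; the "problem" being searched is made up purely for illustration:

from multiprocessing import Pool

def search_chunk(chunk):
    # Each worker grinds through its own slice of candidate solutions,
    # the way each volunteer machine in a grid gets its own work unit.
    return [n for n in chunk if n * n % 1_000_003 == 1]   # arbitrary made-up test

if __name__ == "__main__":
    limit, step = 2_000_000, 500_000
    chunks = [range(start, min(start + step, limit)) for start in range(1, limit, step)]
    with Pool(4) as pool:              # four "computers" standing in for the grid
        results = pool.map(search_chunk, chunks)
    hits = [n for part in results for n in part]
    print(len(hits), "candidate solutions found")

Projects like Folding@home do essentially the same thing, just with the work units handed out over the internet instead of to local processes.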
Yeah, and networks play a part in this idea of the singularity. Actually, I guess now is a good time to kind of transition into Vernor Vinge, and honestly, I don't know how to say his last name. I say "Vinge," and it could end up being "Vin-gee." But I just went with what you said, so that's great. That's fine. We'll say that Vinge says everything is silicone. So Vernor, though. Verne. I call him Vern. He suggested four different potential pathways that humans could take, or really that the world could take, to arrive at the technological singularity. Okay, what are they? The four ways are: we could develop a superhuman artificial intelligence, so computers suddenly are able to think on a level that's analogous to the way humans think, and can do it better than we can, right. Whether or not that means computers are conscious, that's debatable; we'll get into that too. Computer networks could somehow become self-aware; that's number two. So, yes, Skynet. Like the grid computing we were just talking about: somehow, using these grid computers, the network itself having enough cycles and enough pathways and enough loops back around, it starts going, hey, I recognize this, and starts thinking. Think about something like IBM's Watson, but distributed across a network. So you can think of computers as all being super-powerful neurons in a brain, and the network is actually the neural pathways. It's definitely a science-fiction-y way of looking at things. Doesn't mean it won't happen, but stranger things, my friends. It feels like a Matrix kind of thing to me. Then we have the idea that computer-human interfaces are so advanced and so intrinsically tied to who we are that humans themselves evolve beyond being human; we become transhuman. So this is an idea that we almost merge with computers, at least on some level, via kind of nanobot technology.
You know, stuff running through our bloodstream, stuff in our... yep. Or we just have brain interfaces where our sentience, our consciousness, is connected to it. So, for example, we might have it where, instead of connecting to the Internet via some device like a smartphone or a computer, it's right there in our meat brains. So that, you know, you're sitting there having a conversation with someone, and then you're like, oh wait, what movie was that guy in? Let me just look up IMDb in my brain. And then, you know, depending on how good your connection is. Which means, by the way, if you are a journalist and you attend CES, you will automatically be dumber, because all the internet connectivity will be taken up, and so you'll be sitting there trying to ask good questions and drool will come out of your mouth. Which, to me, is a typical CES. I can only assume that wireless technology would advance also at this point, but one can only hope, fingers crossed. There are certain technologies that are not advancing at the exponential rate of Moore's law, which is another problem; we'll talk about that. And then the fourth and final path that Vernor had suggested the world may go down would be that humans advance so far in the biological sciences that it allows us to engineer human intelligence, so that we could make ourselves as smart as we wanted to be. This is sort of that Gattaca future (another great documentary) where we engineer ourselves to be super smart. So those are the four pathways: artificial intelligence, computer networks that become self-aware, computer-human interfaces that become really, really awesome, or biologically engineered human intelligence. And all four of these lead to a similar outcome, which is this intelligence explosion.
And this is the idea that some form of superhuman intelligence is created, either artificially or within ourselves, and that at that point we will no longer be able to predict what our world will be like, because by definition we will have a superhumanly intelligent entity involved. And because it's superhuman, it's beyond our ability to predict, which makes thought experiments about it a little bit, uh, philosophical. That's the kind way of putting it. Pointless would be another way of putting it. Like, we could sit there and spitball a whole bunch of possible futures, but that's the thing, they're only possible. We don't know which one would come out. We don't even know if these four pathways are inevitable. We have futurists who truly believe that this is something that will happen at some point. There are other people who are more skeptical, but we'll talk about them in a bit. So one of the outcomes that Vinge was talking about, and it's a fairly popular one in futurist circles, is the idea of the robo-apocalypse. Essentially, this is where you've got the humans-are-bad, destroy-all-humans idea. The idea is that humans would become extinct, either by definition, because we've evolved into something else, or because whatever the superhuman intelligence is, it decides we are a problem. Yeah, and a lot of futurists are a lot more positive about that. They're more looking forward to it than being scared of it. It's less of an "oh no, big scary robots are coming to take over society" and more of a "the robots are coming to take over society," like a free day. Yeah, exactly. Yeah, I don't have to work anymore, and I don't, because robots are supplying all the things we need. There's no need for anyone to work anymore. There's no need for money anymore, because the only reason you need money is so you can buy stuff.
But if everything is free, then you don't need it. So it becomes Star Trek, and we all run around in jumpsuits and punch people. And if you're Kirk, you make out a lot. I mean a lot. That dude, every week. Picard and Riker, if you add them together, make one Kirk. And yes, in this documentary series Star Trek. I don't know about Archer, because I never watched Enterprise, so you guys have to get back to me on that. Yeah, sorry about that. That is also a gap in my personal understanding. I took one look at that decontamination chamber and said, yep, I'm out. Anyway. So that's Vernor Vinge. He sort of popularized this idea, but there are other people whose names, I think, are synonymous with it.

Hey guys, it's robo-Jonathan, here to interrupt, because even in the future we need to take a break to thank our sponsors. And now back to the show.

So, Vernor Vinge, again, is very much associated with the idea of the singularity. But there's another name that comes up all the time: Ray Kurzweil. Ray Kurzweil. And this is a fellow who has been referred to in various circles as the Thomas Edison of modern technology or, perhaps more colorfully, the Willy Wonka of technology. That was by Jeff Duncan of Digital Trends, and I just wanted to shout that out because it was great. But you get nothing. I shared a remix of Willy Wonka earlier today and it's still playing through my head. We're fans; we might be fans of the Gene Wilder Willy Wonka. Everyone, homework assignment: go watch that. It has nothing to do with the singularity at all. I don't know, there's some chocolate singularity in there. Chocolate Singularity. I want an episode on that band. If I were better at cover band names, I totally would have said something witty right there. Yeah, all right, well, fair enough. We'll say it's the Archies. Sugar, sugar. Oh dear. Oh my goodness.
Okay. So Ray Kurzweil. Ray Kurzweil is the kind of cat who, you know, when he was in high school, invented a computer program. And this is in the mid nineteen sixties; this isn't like last year or something. In the mid nineteen sixties he created a computer program that listened to classical music, found patterns in it, and then created new classical music based on that. So it's a computer that composed classical music following the rules of classical music that other composers had created. Yes. That's kind of cool. That's just something he did, you know. Yeah, that dude's got credentials. Yeah. He also kind of invented flatbed scanners, and has done a whole bunch of stuff in speech recognition, which is interesting, and we'll talk about that in a second. But one of Kurzweil's big points is that he thinks (and this all depends upon which interview you read of Kurzweil, but in various interviews he has said) that essentially by twenty thirty we will reach a point where we will be able to make an artificial brain. Well, we'll have reverse engineered the brain, and we'll be able to create an artificial one. And there's a lot of debate, in smarter circles than the ones I move in (that's not a slap against my friends; they're pretty bright, but none of us are neurologically gifted at that point, and I include myself in that circle), among some very bright people who debate this point: whether or not we'll be able, by the year twenty thirty, to reverse engineer the brain and design an artificial one. And I think the debate is not so much on whether or not we'll have the technological power necessary to simulate a brain. We can simulate brains on a certain superficial level today. I mean, hypothetically, we could connect enough computers that we could make it go.
I think, yeah, we could probably get the computer horsepower, especially by twenty thirty, to simulate a human brain. The question is whether we will understand the human brain. Exactly. So that's sort of where the debate lies. It's not so much on the technological side of things as it is the biological side of things, which is kind of interesting. I've read a lot of critics who have really jumped on Kurzweil for this. Particularly, PZ Myers has written some pretty, um, yeah, strongly worded criticisms of Kurzweil's theories, saying that Kurzweil simply does not understand neurological development and activity: the interplay between the environment and the way our brains develop over time, you know, nurture versus nature, all of this stuff with hormonal changes and electrochemical reactions. Saying that there are so many little bits that make up our brains, so many hormones, so many processes, and we understand such a small fraction of what they do. This is why a lot of psychiatric drugs, for example, are kind of like, oh, well, we invented this thing, and we guess it does this thing; right, take it and see what happens, because we don't really know. It tends to make you happy. It also makes you perceive the color red as having the smell of oranges. Like, you know, we don't understand it fully. And in fact, there are other people, like Steven Novella, who is the author of the NeuroLogica Blog and also a host on a wonderful podcast called The Skeptics' Guide to the Universe. If you guys haven't listened to that, you should try it out, especially if you like skepticism and critical thinking.
But he's a doctor and a proponent of evidence-based medicine, and he talks about how, you know, we don't know how much we don't know about the brain. We have no way of knowing where the endpoint is as far as the brain is concerned, and therefore we cannot guess at how long it will take us to reverse engineer it, simply because we don't know where the finish line is. Right, right. Yeah. Kurzweil has a new book (new as of when we're recording this, in January; it just came out in November) called How to Create a Mind: The Secret of Human Thought Revealed. And in the book he theorizes that, okay, if you'll follow me for a second, atoms are tiny bits of data, DNA is a form of a program, the nervous system is a computer that coordinates bodily functions, and thought is kind of simultaneously a program and the data that that program contains. Gotcha. Now, this is another problem that some scientists have: reducing the human brain to the model of a computer. Right, because it's a very elegant, interesting proposition, and it's kind of sexy like that, because you go, oh, well, that sort of makes sense, man. Like, let's go get a pizza and talk about this more. Let me get a program that will allow me to suddenly know all kung fu. Right. And when you're a programmer, that's a great plan. Yeah, I mean, that sounds terrific. But yeah, there's one specific guy, Jaron Lanier, who wrote a terrific thing called "One-Half of a Manifesto," which is a really entertaining read if you like this kind of thing, where he was saying that what futurists are talking about when they talk about this singularity is basically a religion. He was calling it cybernetic totalism, you know, like a fanatic ideology. He compares it to Marxism at some point.
But yeah, and he was saying that, you know, this theory is a terrific theory if you want to get into the philosophy of who we are and what we do and what technology is. But cybernetic patterns aren't necessarily the best way to understand reality, and they're not necessarily the best model for how people work, for how culture works, for how intelligence works, and saying so is a gross oversimplification. That's a good point. And we should also point out that it all depends on how you define intelligence as well, because Kurzweil himself has worded his own predictions in such a way that some would argue (Novella argues, for example) that he has given himself enough room that he's going to be right no matter what. Like saying that we will be able to reverse engineer basic brain functions, and Novella says, well, technically you could argue we can do that now, so that kind of gives you a lot of room. Yeah. But whether or not it means total brain function, that's a totally different question. And the other point is that we could theoretically create an artificial intelligence that does not necessarily reverse engineer the brain; it doesn't follow the human intelligence model. I mean, IBM's Watson again is a good example of artificial intelligence where, in some ways, it mimics the brain, because it kind of has to. Human beings are the ones creating this technology, and so, as human beings creating this technology, it's going to follow the rules as we understand them, so there's going to be some pedigree there. But IBM's Watson, you know, you think about it, and it doesn't really understand, necessarily, the data that's passing through it. It's looking for the connections, making connections, recognizing patterns, and spitting out useful information. Yeah, it's looking for whatever answer is most likely the right one. It's all probability based, right, and if it doesn't reach a certain threshold, it doesn't provide an answer.
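As a rough sketch of that rank-the-candidates-and-only-answer-when-confident pattern, here are a few lines of Python; the candidate scores and the fifty percent cutoff are invented for illustration and are not Watson's actual numbers:

def answer_if_confident(candidates, threshold=0.50):
    # candidates maps each possible answer to an estimated probability of being right.
    best, confidence = max(candidates.items(), key=lambda item: item[1])
    if confidence >= threshold:
        return best, confidence
    return None, confidence   # below the cutoff: stay quiet rather than guess

# Made-up scores for a made-up clue.
scores = {"Toronto": 0.14, "Chicago": 0.62, "Springfield": 0.11}
print(answer_if_confident(scores))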
So, arbitrarily speaking (I don't know what the threshold is, so I'm just making up a number), let's say the certainty has to be at some percentage or higher for it to give that answer. If the certainty falls below that threshold, no answer is given. That's essentially how it worked when Watson was on Jeopardy. It would analyze the answer, in Jeopardy terms, and then come up with what it thought was probably the most accurate question for that answer, and occasionally it was wrong, to hilarious results. But it did sort of seem to mimic the way humans think, at least on a superficial level. Oh, I mean, the thing about humans is that they're wrong a lot more than, what, that fifteen percent of the time. Yeah, you know, we give answers even if we're not sure of the question. I certainly do, as we all know from going to trivia nights. Yeah. And I've read a lot online about arguments that it's our deficiencies, our memory biases, our irrational behavior, our weird hormonal stuff going on, that make us human, and that you can't teach a computer to be irrational. That's true. Although you can teach one to swear. You can. We read a story last week where IBM allowed Watson to read the Urban Dictionary, and then Watson got a little bit of a potty mouth. It got kind of fresh. It did. It started to say... oh, see, what was it? Oh, I'm going to say something and it's going to be bleeped out, right? Tyler just said so. Anyway, there's one point where a researcher asked a question of Watson, and Watson included within the answer the word... well, since it was bleeped out, you probably don't know what it was, so go look it up. It was funny.
It was really funny. Yes. And then they basically nuked that part of Watson from orbit. They were like, you know what, never mind; it was the only way to be sure. They wiped the Urban Dictionary from Watson's memory. They also said that a very similar thing happened when they let Watson read Wikipedia. No judgments here, just saying what IBM said. Anyway, again, the computer was unable to determine when it was appropriate, what the appropriate context is for dropping a swear word. It didn't know, so it just started to speak kind of like my wife does. Yeah, I'm going to pay for that later. So anyway, that's an interesting point, though. Again, you're showing how machine intelligence and human intelligence are different, because the machine intelligence doesn't have that context. Sure. And of course, you know, we're talking about science fiction, or science future, however you want to term it, so we might very well come up with a fancy little program script that lets you introduce that kind of bias. But again, from that documentary Star Trek: I mean, Data never figured out contractions.

Hi guys, it's Jonathan. I have merged entirely with technology at this point, but I still gotta pay the bills, so let's take a quick break.

Turing actually had a great mental exercise, really, and it's called the Turing test, and this applies to artificial intelligence. We've talked about the Turing test in previous episodes of TechStuff, but just as a refresher: Turing suggested that you could create a test, and that if a machine could pass that test at the same level as a human, in other words, if you were unable to determine whether the one who took that test was human or machine, the machine had passed the Turing test and had essentially simulated human intelligence.
It usually 569 00:33:14,840 --> 00:33:17,800 Speaker 1: works as an interview. So you have someone who's 570 00:33:17,840 --> 00:33:22,000 Speaker 1: conducting an interview, and you have either a machine answering 571 00:33:22,200 --> 00:33:25,360 Speaker 1: or a human answering, and there's a barrier up so 572 00:33:25,440 --> 00:33:27,960 Speaker 1: that, of course, the person asking the questions cannot see 573 00:33:28,000 --> 00:33:30,440 Speaker 1: who is responding. And of course they're responding through, you know, 574 00:33:30,520 --> 00:33:33,520 Speaker 1: text, usually, because if they're responding through a voice, it's like, 575 00:33:34,240 --> 00:33:37,400 Speaker 1: "I. Think. The. Answer. Is..." You know, you'd be like, well, 576 00:33:37,480 --> 00:33:39,560 Speaker 1: either it's a robot or the most boring person in 577 00:33:39,560 --> 00:33:42,240 Speaker 1: the world. Uh. The idea being you would ask these 578 00:33:42,320 --> 00:33:47,120 Speaker 1: questions over a computer monitor, get text responses, and if 579 00:33:47,160 --> 00:33:50,920 Speaker 1: you were unable to determine with a certain degree 580 00:33:51,520 --> 00:33:53,800 Speaker 1: of accuracy whether or not it was a machine or 581 00:33:53,840 --> 00:33:56,360 Speaker 1: a person, then you would say the machine passed the 582 00:33:57,480 --> 00:34:00,520 Speaker 1: test. And you could argue, well, that 583 00:34:00,600 --> 00:34:02,640 Speaker 1: could just mean that the machine is very good at 584 00:34:02,720 --> 00:34:06,280 Speaker 1: mimicking human intelligence; it does not actually possess human intelligence. 585 00:34:06,520 --> 00:34:10,680 Speaker 1: Turing's point is, does that matter? Because I know that 586 00:34:10,800 --> 00:34:14,200 Speaker 1: I am intelligent. I speak with someone like Lauren, who 587 00:34:14,239 --> 00:34:17,880 Speaker 1: I assume is also intelligent based upon the responses she gives. 588 00:34:17,880 --> 00:34:21,719 Speaker 1: But she could just be simulating intelligence. However, I have 589 00:34:21,960 --> 00:34:27,360 Speaker 1: already bestowed, in my mind, the attribute 590 00:34:27,480 --> 00:34:31,200 Speaker 1: of intelligence upon Lauren, because what she does is very 591 00:34:31,280 --> 00:34:33,920 Speaker 1: much akin to what I do. So Turing said, if 592 00:34:33,960 --> 00:34:36,920 Speaker 1: you extend the courtesy to your fellow human being that 593 00:34:37,000 --> 00:34:39,760 Speaker 1: they are intelligent based on the fact that they act 594 00:34:39,800 --> 00:34:42,040 Speaker 1: like you do, why would you not do the same 595 00:34:42,080 --> 00:34:45,480 Speaker 1: thing for a machine? Does it matter if the machine 596 00:34:45,560 --> 00:34:49,080 Speaker 1: can actually think? If the machine simulates thought well enough 597 00:34:49,080 --> 00:34:51,200 Speaker 1: for it to pass as human, then you're giving it 598 00:34:51,239 --> 00:34:53,719 Speaker 1: the same benefit of the doubt you would give anyone else. 599 00:34:54,760 --> 00:34:57,520 Speaker 1: This is what a lot of science fiction movies are about, actually. 600 00:34:57,800 --> 00:34:59,960 Speaker 1: There's a lot of philosophy, 601 00:35:00,040 --> 00:35:02,320 Speaker 1: a lot of Isaac Asimov, a lot of Blade Runner. 602 00:35:02,560 --> 00:35:04,920 Speaker 1: And that's not an author. Sorry. Well, no, 603 00:35:05,120 --> 00:35:07,680 Speaker 1: but, you know, Philip K. Dick. Look him up.
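(Another quick aside: here is a minimal sketch, in Python, of the text-only interview setup just described. The toy responders, the simple judge, and the "near fifty percent accuracy means indistinguishable" reading are illustrative assumptions, not Turing's own procedure or any standard implementation.)

# Minimal sketch of the imitation-game setup: a judge exchanges text with a
# hidden responder, then guesses whether it was a human or a machine.
import random

def machine_responder(question):
    # Stand-in for a chatbot; a real test would use a far more capable system.
    return "Hmm, that is an interesting question. Could you say more?"

def human_responder(question):
    # Stand-in for the hidden human; in a real test a person would type this.
    return "Ha, no idea, but my gut says yes."

def naive_judge(transcript):
    # A deliberately simple judge: guesses "machine" if the replies look too formal.
    return any("interesting question" in reply for _, reply in transcript)

def run_trials(n=1000, questions=("Do you dream?", "What made you laugh today?")):
    correct = 0
    for _ in range(n):
        is_machine = random.choice([True, False])          # secretly pick the responder
        respond = machine_responder if is_machine else human_responder
        transcript = [(q, respond(q)) for q in questions]  # text only: no voice, no face
        if naive_judge(transcript) == is_machine:
            correct += 1
    return correct / n   # accuracy near 0.5 would mean the judge can't tell them apart

print(run_trials())   # this toy bot is easy to spot, so the judge scores near 1.0

The point of the sketch is only the structure of the test: the barrier, the text-only channel, and the idea that "passing" means the interrogator's guesses are no better than chance.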
So anyway, 604 00:35:08,000 --> 00:35:11,640 Speaker 1: go read Do Androids Dream of Electric Sheep? I won't 605 00:35:11,719 --> 00:35:15,359 Speaker 1: spoil it for you. Uh, to kind 606 00:35:15,360 --> 00:35:18,880 Speaker 1: of wrap this all up, getting back into the discussion 607 00:35:18,920 --> 00:35:23,799 Speaker 1: of philosophy, uh, very recently we did a 608 00:35:23,840 --> 00:35:27,560 Speaker 1: podcast about whether we are living in a computer simulation, and 609 00:35:27,640 --> 00:35:30,520 Speaker 1: that kind of plays into this idea of the singularity, 610 00:35:30,600 --> 00:35:34,359 Speaker 1: because that argument stated that if the singularity is in 611 00:35:34,440 --> 00:35:38,359 Speaker 1: fact possible, if it's inevitable, if we are going to 612 00:35:38,360 --> 00:35:41,280 Speaker 1: reach this level of transhumanism where we are no longer 613 00:35:41,800 --> 00:35:44,279 Speaker 1: able to really predict what the future will be like 614 00:35:44,360 --> 00:35:47,839 Speaker 1: because it will be beyond our understanding, then one thing 615 00:35:47,880 --> 00:35:50,719 Speaker 1: we would expect to do is create simulations of our 616 00:35:50,760 --> 00:35:54,040 Speaker 1: past to kind of study ourselves and to see what 617 00:35:54,040 --> 00:35:57,399 Speaker 1: happens when we play around with variables. Yeah, we like good experiments. Yeah, 618 00:35:57,440 --> 00:36:00,279 Speaker 1: and we could, if we're that advanced, 619 00:36:00,320 --> 00:36:03,880 Speaker 1: in theory create such a realistic simulation that the inhabitants 620 00:36:03,920 --> 00:36:07,680 Speaker 1: of that simulation would be incapable of knowing that they 621 00:36:07,719 --> 00:36:12,480 Speaker 1: were artificial and would be completely, you know, self-aware 622 00:36:12,480 --> 00:36:16,480 Speaker 1: of themselves. You know, that was totally redundant. Self-aware, 623 00:36:17,000 --> 00:36:21,439 Speaker 1: uh, but unable to know that they were a simulation. Uh. 624 00:36:21,640 --> 00:36:24,000 Speaker 1: He said that if those things are possible, then, 625 00:36:24,040 --> 00:36:26,879 Speaker 1: you know, the overwhelming 626 00:36:27,200 --> 00:36:31,000 Speaker 1: uh, probability is that we are in a 627 00:36:31,400 --> 00:36:35,239 Speaker 1: computer simulation right now, because, yeah, if that's 628 00:36:35,239 --> 00:36:37,319 Speaker 1: what's going to happen, then there's no way of saying with 629 00:36:37,400 --> 00:36:40,640 Speaker 1: certainty that we are not in fact the product of that. 630 00:36:41,320 --> 00:36:44,279 Speaker 1: And so, uh, the point being not necessarily that we 631 00:36:44,320 --> 00:36:46,440 Speaker 1: are in fact living in a computer simulation, but that 632 00:36:46,560 --> 00:36:52,640 Speaker 1: perhaps this singularity, this transhumanism thing, might not be realistic. 633 00:36:53,239 --> 00:36:55,160 Speaker 1: It might not be the future that we're headed to. 634 00:36:55,440 --> 00:36:58,680 Speaker 1: Maybe it ends up being a pipe dream that's not 635 00:36:58,800 --> 00:37:02,400 Speaker 1: really possible for us to attain. Or maybe we'll wipe 636 00:37:02,480 --> 00:37:09,760 Speaker 1: ourselves out through some terrible war or catastrophic accident. Um, 637 00:37:09,800 --> 00:37:13,560 Speaker 1: maybe we create a biological entity that wipes us out,
638 00:37:13,320 --> 00:37:18,320 Speaker 1: à la The Stand. Or we create a black hole at the LHC, 639 00:37:18,480 --> 00:37:21,160 Speaker 1: which, come on, people, don't write me. I already know 640 00:37:21,239 --> 00:37:25,239 Speaker 1: about that and how tiny and almost 641 00:37:25,239 --> 00:37:28,680 Speaker 1: nonexistent they are because they decay so quickly. Let's say 642 00:37:28,680 --> 00:37:30,520 Speaker 1: that they do that thing where you look at that 643 00:37:30,560 --> 00:37:33,040 Speaker 1: one website where the black hole forms in the parking 644 00:37:33,080 --> 00:37:36,440 Speaker 1: lot outside the LHC and you just see the whole picture 645 00:37:36,520 --> 00:37:44,000 Speaker 1: go, um... which is a funny video. Anyway, that argument plays 646 00:37:44,040 --> 00:37:46,960 Speaker 1: back into this. So I don't know, I don't know 647 00:37:47,000 --> 00:37:49,279 Speaker 1: if we're ever going to see a future where the 648 00:37:49,320 --> 00:37:51,800 Speaker 1: singularity becomes the thing. Oh, and we never really talked 649 00:37:51,800 --> 00:37:54,080 Speaker 1: about it, but one of the big points that Kurzweil 650 00:37:54,080 --> 00:37:58,600 Speaker 1: really pushes in his singularity talks is the idea 651 00:37:58,680 --> 00:38:01,759 Speaker 1: of digital immortality. Right, right. And he's been obsessed with 652 00:38:01,800 --> 00:38:04,600 Speaker 1: this, and obsessed is probably a judgmental word, 653 00:38:04,640 --> 00:38:08,840 Speaker 1: I apologize, but he's been very focused on this concept. 654 00:38:08,920 --> 00:38:12,319 Speaker 1: His father died when he was about twenty-four, and 655 00:38:12,400 --> 00:38:15,600 Speaker 1: he's been exploring theories on life extension ever since then, 656 00:38:15,760 --> 00:38:19,839 Speaker 1: and supposedly takes all kinds of supplements, and sells them 657 00:38:19,840 --> 00:38:22,480 Speaker 1: as well, to extend life, has all kinds 658 00:38:22,480 --> 00:38:26,360 Speaker 1: of health plans, dietary plans, exercise plans, all 659 00:38:26,719 --> 00:38:29,719 Speaker 1: that, the idea being that if he can 660 00:38:29,840 --> 00:38:32,600 Speaker 1: preserve his own life long enough that we hit 661 00:38:32,640 --> 00:38:36,279 Speaker 1: the singularity, then he can become immortal. Right. And, 662 00:38:36,560 --> 00:38:40,640 Speaker 1: you know, we attain immortality through one of a thousand 663 00:38:40,640 --> 00:38:44,080 Speaker 1: different ways. For example, we end up uploading our own 664 00:38:44,080 --> 00:38:46,880 Speaker 1: intelligence into the cloud, and then we become part of 665 00:38:46,880 --> 00:38:50,239 Speaker 1: a group consciousness, so we are no longer really individuals; 666 00:38:50,400 --> 00:38:53,080 Speaker 1: or we merge with computers in some other way so 667 00:38:53,120 --> 00:38:56,279 Speaker 1: that we are technically immortal that way; or we just 668 00:38:56,440 --> 00:39:00,720 Speaker 1: conquer the genes that guide the aging process, 669 00:39:00,800 --> 00:39:02,960 Speaker 1: and we stop it, and we stop disease, you know. 670 00:39:03,000 --> 00:39:04,960 Speaker 1: Like in Transcendent Man, you just take a 671 00:39:04,960 --> 00:39:07,160 Speaker 1: cancer pill and then you don't get cancer, because that's 672 00:39:07,160 --> 00:39:10,080 Speaker 1: what you do. Yeah. So I think, again, the singularity...
673 00:39:10,160 --> 00:39:12,360 Speaker 1: That's kind of why I think a lot of critics 674 00:39:12,400 --> 00:39:14,240 Speaker 1: also point to it as being more of a religion, 675 00:39:14,280 --> 00:39:17,680 Speaker 1: because it's kind of this sort of utopian pipe dream 676 00:39:18,120 --> 00:39:21,520 Speaker 1: in their minds. There's the former CEO of Lotus, Mitch 677 00:39:21,719 --> 00:39:24,080 Speaker 1: Kapor, Kapoor, I'm not sure how you say it, who once 678 00:39:24,200 --> 00:39:27,120 Speaker 1: called it the intelligent design for the IQ 679 00:39:27,320 --> 00:39:31,160 Speaker 1: one-forty people. Yeah. Ouch. Ouch. Well, I mean, 680 00:39:31,239 --> 00:39:33,359 Speaker 1: Kurzweil's kind of laughing all the way to the bank. 681 00:39:33,400 --> 00:39:36,719 Speaker 1: I hear that company that rhymes with Schmoogle hired him. 682 00:39:36,880 --> 00:39:39,040 Speaker 1: Little, little company. I mean, you probably wouldn't have 683 00:39:39,120 --> 00:39:41,480 Speaker 1: heard of them, but yeah, they just hired him on 684 00:39:41,560 --> 00:39:44,160 Speaker 1: to be, I have it in my notes, the 685 00:39:44,200 --> 00:39:47,640 Speaker 1: official title, I think it's Director of Engineering, a 686 00:39:47,800 --> 00:39:51,000 Speaker 1: director of engineering over there. Um, they get 687 00:39:51,000 --> 00:39:53,239 Speaker 1: some big names. I mean, they had Vint Cerf as 688 00:39:53,320 --> 00:39:55,759 Speaker 1: the chief evangelist, and of course he was one of 689 00:39:55,800 --> 00:39:59,839 Speaker 1: the fathers of the Internet. So Google's 690 00:40:00,000 --> 00:40:03,400 Speaker 1: got a knack of their own for getting some really smart people. And, 691 00:40:03,480 --> 00:40:07,200 Speaker 1: to be fair, while the singularity may or may not 692 00:40:07,400 --> 00:40:11,920 Speaker 1: ever happen, I think it's important that we have optimists 693 00:40:12,120 --> 00:40:14,919 Speaker 1: in the field of technology who are really pushing for 694 00:40:15,000 --> 00:40:19,840 Speaker 1: our development to try and make the world a better 695 00:40:19,880 --> 00:40:22,799 Speaker 1: place for people now. You know, absolutely. So even if 696 00:40:22,800 --> 00:40:24,560 Speaker 1: we never reach the point of digital 697 00:40:24,600 --> 00:40:27,680 Speaker 1: immortality in our lifetimes or any other, I mean, 698 00:40:27,760 --> 00:40:30,120 Speaker 1: if someone wants to think so big that 699 00:40:30,160 --> 00:40:33,319 Speaker 1: they want to put in nanobots to make my body awesomer, 700 00:40:33,400 --> 00:40:35,879 Speaker 1: I mean, and, that came out 701 00:40:35,920 --> 00:40:37,960 Speaker 1: possibly crude. I mostly mean that I 702 00:40:37,960 --> 00:40:41,840 Speaker 1: don't get cancer and die, um, kind of stuff. That's 703 00:40:41,840 --> 00:40:44,399 Speaker 1: terrific. I can't argue with any part 704 00:40:44,400 --> 00:40:46,440 Speaker 1: of that. Yeah, I'm going to be on video so much 705 00:40:46,520 --> 00:40:49,160 Speaker 1: this year that I definitely need my body to be awesomer, 706 00:40:50,040 --> 00:40:53,959 Speaker 1: so I'm all for that. Well, either way, yes. And 707 00:40:53,960 --> 00:40:56,600 Speaker 1: Google, you know, Google looks forward so much 708 00:40:56,640 --> 00:41:00,719 Speaker 1: to augmented reality. Augmented reality.
I'm sorry, I 709 00:41:00,719 --> 00:41:03,520 Speaker 1: can't pronounce anything today. I am on a non-roll. 710 00:41:04,920 --> 00:41:07,240 Speaker 1: And the Internet of Things and all of that wonderful 711 00:41:07,280 --> 00:41:10,560 Speaker 1: future tech, it seems like a terrific fit. Yeah. Yeah, 712 00:41:10,680 --> 00:41:12,880 Speaker 1: so we'll see how it goes. I mean, obviously, 713 00:41:12,920 --> 00:41:14,920 Speaker 1: the nice thing about this is that all we have 714 00:41:14,960 --> 00:41:17,840 Speaker 1: to do is live long enough to see it happen 715 00:41:18,120 --> 00:41:21,600 Speaker 1: or not happen. And most predictions have the singularity hitting 716 00:41:21,719 --> 00:41:27,160 Speaker 1: somewhere between twenty... Yeah, it all depends upon which 717 00:41:27,360 --> 00:41:30,760 Speaker 1: futurist you're asking. I hope you guys enjoyed that classic 718 00:41:30,800 --> 00:41:34,400 Speaker 1: episode about the singularity. I have touched on that topic 719 00:41:34,520 --> 00:41:37,400 Speaker 1: multiple times, both on Tech Stuff and on other shows 720 00:41:37,440 --> 00:41:40,920 Speaker 1: like Forward Thinking. So if you're really interested in it 721 00:41:40,960 --> 00:41:43,759 Speaker 1: and you want to hear even more, feel free to 722 00:41:43,840 --> 00:41:47,719 Speaker 1: do searches both on Forward Thinking and on Tech Stuff. 723 00:41:47,760 --> 00:41:49,920 Speaker 1: You can search Tech Stuff by going to the website 724 00:41:50,000 --> 00:41:52,799 Speaker 1: tech stuff podcast dot com. We have an archive there 725 00:41:52,840 --> 00:41:56,120 Speaker 1: of every episode that's ever published. Forward Thinking, you might 726 00:41:56,160 --> 00:41:59,319 Speaker 1: have to do a little more googling, but it is there, 727 00:41:59,800 --> 00:42:04,040 Speaker 1: including podcasts and videos, so go check that out. If 728 00:42:04,040 --> 00:42:05,360 Speaker 1: you would like to get in touch with me and 729 00:42:05,400 --> 00:42:08,239 Speaker 1: suggest future topics for Tech Stuff, you can do so 730 00:42:08,280 --> 00:42:11,160 Speaker 1: with the email address tech stuff at how stuff works 731 00:42:11,200 --> 00:42:13,880 Speaker 1: dot com. You can also drop a line on Facebook 732 00:42:13,960 --> 00:42:16,360 Speaker 1: or Twitter. The handle for both of those is tech Stuff 733 00:42:16,520 --> 00:42:19,839 Speaker 1: HSW. Also, at that tech stuff podcast dot com 734 00:42:19,920 --> 00:42:22,640 Speaker 1: link I mentioned earlier, there's a link to our online store, 735 00:42:22,680 --> 00:42:25,000 Speaker 1: so if you've ever wanted any sort of Tech Stuff 736 00:42:25,000 --> 00:42:27,719 Speaker 1: merchandise, you can find it there. Go check it out. 737 00:42:27,760 --> 00:42:29,920 Speaker 1: Every purchase you make goes to help the show. We 738 00:42:30,000 --> 00:42:33,640 Speaker 1: greatly appreciate it, and I'll talk to you again really soon. 739 00:42:38,000 --> 00:42:40,200 Speaker 1: Tech Stuff is a production of I Heart Radio's How 740 00:42:40,280 --> 00:42:43,640 Speaker 1: Stuff Works. For more podcasts from I Heart Radio, visit 741 00:42:43,680 --> 00:42:46,759 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 742 00:42:46,840 --> 00:42:48,160 Speaker 1: listen to your favorite shows.