1 00:00:04,519 --> 00:00:12,959 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,960 --> 00:00:16,239 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland. 3 00:00:16,280 --> 00:00:19,439 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:19,520 --> 00:00:24,440 Speaker 1: tech are you? So it's January second, twenty twenty four. 5 00:00:24,840 --> 00:00:26,960 Speaker 1: As you listen to this, it's actually back in the 6 00:00:27,000 --> 00:00:30,840 Speaker 1: past, in twenty twenty three, when I record it, and 7 00:00:30,880 --> 00:00:33,000 Speaker 1: we are on vacation. You know, I brought you a 8 00:00:33,040 --> 00:00:36,839 Speaker 1: whole bunch of new episodes over the holidays, but I 9 00:00:36,880 --> 00:00:40,680 Speaker 1: figured it's about time for Jonathan to maybe rest a 10 00:00:40,720 --> 00:00:43,440 Speaker 1: little bit. So we're going to listen to a classic 11 00:00:43,720 --> 00:00:48,800 Speaker 1: episode today. This episode is called TechStuff Enters the Singularity. 12 00:00:48,880 --> 00:00:50,960 Speaker 1: I mean, after all, we're in a new year, so 13 00:00:51,000 --> 00:00:53,960 Speaker 1: why not talk about the future a little bit. This 14 00:00:54,200 --> 00:00:59,160 Speaker 1: was recorded way back on February eleventh, twenty thirteen. Lauren 15 00:00:59,240 --> 00:01:03,320 Speaker 1: Vogelbaum, now host of BrainStuff and Savor, along with 16 00:01:03,400 --> 00:01:06,479 Speaker 1: other things, was my co-host at the time, and 17 00:01:06,520 --> 00:01:10,000 Speaker 1: we sat down to talk about this vision of the future, 18 00:01:10,440 --> 00:01:14,319 Speaker 1: the singularity. What does it mean when we're talking 19 00:01:14,360 --> 00:01:18,360 Speaker 1: about the technological singularity? So sit back, relax, 20 00:01:18,800 --> 00:01:22,120 Speaker 1: and enjoy this classic episode from twenty thirteen called TechStuff 21 00:01:22,160 --> 00:01:29,240 Speaker 1: Enters the Singularity. Get in touch with technology with 22 00:01:29,400 --> 00:01:38,319 Speaker 1: TechStuff from HowStuffWorks dot com. Hey there, everyone, and 23 00:01:38,400 --> 00:01:43,640 Speaker 1: welcome to TechStuff. My name is Jonathan Strickland, host extraordinaire. 24 00:01:43,120 --> 00:01:45,400 Speaker 2: And I'm Lauren Vogelbaum, host extraordinarier. 25 00:01:46,200 --> 00:01:50,200 Speaker 1: That's very true, and today we wanted to talk about 26 00:01:50,600 --> 00:01:54,840 Speaker 1: the future. The future. Yeah, really, we're talking about kind 27 00:01:54,840 --> 00:01:58,200 Speaker 1: of a science fiction future. We're talking about the singularity. 28 00:01:58,280 --> 00:02:02,559 Speaker 1: And longtime listeners to TechStuff, and I'm talking 29 00:02:02,560 --> 00:02:06,240 Speaker 1: about folks who listened way back before episodes ever ran 30 00:02:06,280 --> 00:02:10,040 Speaker 1: thirty minutes, let alone an hour, may remember 31 00:02:10,040 --> 00:02:12,560 Speaker 1: that we did an episode about how Ray Kurzweil works, 32 00:02:12,960 --> 00:02:15,080 Speaker 1: and Ray Kurzweil is a futurist, and one of the 33 00:02:15,120 --> 00:02:18,800 Speaker 1: things he talks about extensively, particularly if you corner him 34 00:02:18,800 --> 00:02:22,160 Speaker 1: at a cocktail party, is the singularity.
And so we 35 00:02:22,200 --> 00:02:25,560 Speaker 1: wanted to talk about what the singularity is, what this idea is, 36 00:02:25,639 --> 00:02:27,839 Speaker 1: you know, we really wanted to kind of dig down 37 00:02:27,880 --> 00:02:30,360 Speaker 1: into it, and why is this a big deal, and 38 00:02:30,440 --> 00:02:33,960 Speaker 1: how realistic is this vision of the future. 39 00:02:34,120 --> 00:02:35,920 Speaker 2: Yeah, because some people would take a little bit of 40 00:02:36,000 --> 00:02:39,600 Speaker 2: an issue with, would argue with your concept of it being 41 00:02:39,680 --> 00:02:41,680 Speaker 2: science fiction. They take it extremely seriously. 42 00:02:41,720 --> 00:02:44,280 Speaker 1: Oh yeah, they say it's science fact, science fact, it's 43 00:02:44,320 --> 00:02:45,639 Speaker 1: science inevitability. 44 00:02:46,280 --> 00:02:49,720 Speaker 2: Yeah. The term was actually coined by a mathematician, John 45 00:02:49,800 --> 00:02:53,280 Speaker 2: von Neumann, in the nineteen fifties, but it was popularized 46 00:02:53,360 --> 00:02:56,680 Speaker 2: by a science fiction writer. 47 00:02:56,800 --> 00:03:00,360 Speaker 1: Yeah. There are a lot of different 48 00:03:00,360 --> 00:03:02,520 Speaker 1: concepts that are tied up together, and it all depends 49 00:03:02,520 --> 00:03:05,640 Speaker 1: upon whom you ask what is meant by the singularity. 50 00:03:05,639 --> 00:03:08,639 Speaker 1: For instance, there are some people who, when they hear the 51 00:03:08,720 --> 00:03:11,560 Speaker 1: term the singularity, what they say is, okay, that's a 52 00:03:11,680 --> 00:03:14,160 Speaker 1: time when we get to the point where technological advances 53 00:03:14,200 --> 00:03:18,200 Speaker 1: are coming so quickly that it's impossible to have a 54 00:03:18,360 --> 00:03:22,239 Speaker 1: meaningful conversation about what the state of technology is, because 55 00:03:22,240 --> 00:03:26,000 Speaker 1: it changes by the millisecond. Right. So that's one version, 56 00:03:26,280 --> 00:03:29,480 Speaker 1: But most of the versions that we're familiar with, that 57 00:03:29,600 --> 00:03:34,840 Speaker 1: the futurists talk about, incorporate an idea of superhuman intelligence 58 00:03:35,160 --> 00:03:37,880 Speaker 1: or the intelligence explosion, right. 59 00:03:37,760 --> 00:03:41,760 Speaker 2: A kind of combination of human and technological development that 60 00:03:41,960 --> 00:03:45,080 Speaker 2: just dovetails into this gorgeous, you know, space baby from 61 00:03:45,080 --> 00:03:46,800 Speaker 2: two thousand and one kind of thing. 62 00:03:46,640 --> 00:03:49,640 Speaker 1: That's an excellent way of putting it. The documentary two thousand 63 00:03:49,640 --> 00:03:54,480 Speaker 1: and one. I remember specifically when the space baby looked 64 00:03:54,680 --> 00:03:58,520 Speaker 1: at Earth. Okay, that documentary example doesn't work at all. 65 00:03:58,680 --> 00:04:00,760 Speaker 1: It usually does, but not this... Yeah, not this time. 66 00:04:00,800 --> 00:04:04,720 Speaker 2: Sorry, space babies are a poor example in this one instance. 67 00:04:04,800 --> 00:04:09,440 Speaker 1: But metaphorically speaking, yes, you're right on track, because the 68 00:04:09,440 --> 00:04:14,160 Speaker 1: intelligence explosion, that was a term introduced by someone known 69 00:04:14,200 --> 00:04:17,039 Speaker 1: as Irving John Good, or, if you want to go with 70 00:04:17,080 --> 00:04:20,880 Speaker 1: his birth name, Isadore Jacob Gudak.
I can see 71 00:04:20,920 --> 00:04:24,200 Speaker 1: why he changed it. Yeah. He actually worked for a 72 00:04:24,240 --> 00:04:29,080 Speaker 1: while at Bletchley Park with another fellow who made 73 00:04:29,120 --> 00:04:31,120 Speaker 1: sort of a name for himself in computer science, a 74 00:04:31,120 --> 00:04:34,240 Speaker 1: fellow named Alan Turing. Oh, I guess I've heard 75 00:04:34,240 --> 00:04:36,560 Speaker 1: of him. Yeah. Turing will come up in the discussion 76 00:04:36,600 --> 00:04:39,960 Speaker 1: a little bit later, but for right now, so, Irving 77 00:04:40,040 --> 00:04:42,760 Speaker 1: John Good, just a little quick anecdote that I 78 00:04:42,760 --> 00:04:47,440 Speaker 1: thought was amusing. So Good was working with Turing to 79 00:04:47,520 --> 00:04:50,280 Speaker 1: try and help break German codes. I mean, that's what 80 00:04:50,279 --> 00:04:54,360 Speaker 1: Bletchley Park was all about, right, right. So Good apparently 81 00:04:54,920 --> 00:04:59,600 Speaker 1: one day drew the ire of Turing when he decided 82 00:04:59,600 --> 00:05:02,039 Speaker 1: to take a little cat nap because he was tired, 83 00:05:02,480 --> 00:05:06,440 Speaker 1: and it was Good's philosophy that if he was tired, 84 00:05:06,480 --> 00:05:08,440 Speaker 1: he was not 85 00:05:08,480 --> 00:05:10,800 Speaker 1: going to work at his best, and he might as 86 00:05:10,839 --> 00:05:13,960 Speaker 1: well go ahead and nap. Exactly, take a nap, get refreshed, 87 00:05:14,080 --> 00:05:16,640 Speaker 1: and then tackle the problem again, and you're more likely 88 00:05:16,800 --> 00:05:19,640 Speaker 1: to solve it. Whereas Turing was very much a workhorse, 89 00:05:20,160 --> 00:05:22,800 Speaker 1: you know, he was no rest, no rest, 90 00:05:22,839 --> 00:05:23,480 Speaker 1: we have to 91 00:05:23,400 --> 00:05:25,640 Speaker 2: do. So Turing, 92 00:05:25,680 --> 00:05:28,520 Speaker 1: when he discovered that Good had been napping, decided that 93 00:05:29,000 --> 00:05:32,400 Speaker 1: this Good was not so good, and 94 00:05:32,480 --> 00:05:37,400 Speaker 1: Turing sort of treated him with disdain. He began 95 00:05:37,480 --> 00:05:42,080 Speaker 1: to essentially not speak to Good. Good, meanwhile, began to 96 00:05:42,200 --> 00:05:46,080 Speaker 1: think about the letters that were being used in Enigma 97 00:05:46,360 --> 00:05:50,080 Speaker 1: codes to encode German messages, and he began to think, 98 00:05:50,120 --> 00:05:53,440 Speaker 1: what if these letters are not completely random? What if 99 00:05:53,480 --> 00:05:57,080 Speaker 1: the Germans are relying on some letters more frequently than others? 100 00:05:57,279 --> 00:06:00,000 Speaker 1: And he began to look at the frequency of these letters 101 00:06:00,160 --> 00:06:04,919 Speaker 1: being used. He made up a table and mathematically analyzed 102 00:06:04,960 --> 00:06:07,600 Speaker 1: the frequency with which certain letters were used and discovered that 103 00:06:07,800 --> 00:06:10,040 Speaker 1: there was a bias. There was a pattern.
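For the curious, here is a minimal sketch, in Python, of the kind of letter-frequency tally just described. It is only an illustration under that simple assumption, not the actual procedure used at Bletchley Park: letters that stray far from the uniform one-in-twenty-six share are exactly the sort of bias a codebreaker can start to exploit.

```python
from collections import Counter

def letter_frequencies(ciphertext: str) -> dict:
    """Return the share of all letters that each letter A-Z accounts for."""
    letters = [c for c in ciphertext.upper() if c.isalpha()]
    total = len(letters)
    counts = Counter(letters)
    return {letter: counts[letter] / total for letter in sorted(counts)}

# Toy message: in truly random traffic every letter would sit near 1/26
# (about 3.8 percent); letters far from that hint at a usable bias.
sample = "EEEEE QWERT YUIOP ASDFG EEERT"
for letter, share in letter_frequencies(sample).items():
    print(f"{letter}: {share:.1%}")
```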
Yeah, so 104 00:06:10,120 --> 00:06:12,360 Speaker 1: he said, well, with this bias, that means that we 105 00:06:12,400 --> 00:06:15,040 Speaker 1: can start to narrow down the possibilities of these codes, 106 00:06:15,279 --> 00:06:18,240 Speaker 1: and in fact he was able to demonstrate that this 107 00:06:18,400 --> 00:06:20,680 Speaker 1: was a way to help break German codes. And Turing, 108 00:06:20,720 --> 00:06:23,640 Speaker 1: when he saw Good's work, said, I could have sworn 109 00:06:23,680 --> 00:06:28,400 Speaker 1: I tried that, but clearly this showed that it worked. Well, 110 00:06:28,440 --> 00:06:31,799 Speaker 1: and then Good at another point apparently went to sleep 111 00:06:31,839 --> 00:06:34,680 Speaker 1: one day. They'd been working on a code that 112 00:06:34,720 --> 00:06:37,479 Speaker 1: they just could not break, and while he was sleeping, 113 00:06:37,560 --> 00:06:41,440 Speaker 1: he dreamed that perhaps when the Germans were encoding this 114 00:06:41,480 --> 00:06:44,680 Speaker 1: particular message, they used the letters in reverse of the 115 00:06:44,680 --> 00:06:47,039 Speaker 1: way they were actually printed, and so he tried that 116 00:06:47,120 --> 00:06:49,240 Speaker 1: when he woke up, and it turned out he was right. 117 00:06:49,839 --> 00:06:52,479 Speaker 1: And so then his argument was, Turing, I need to 118 00:06:52,480 --> 00:06:52,920 Speaker 1: go to bed. 119 00:06:53,080 --> 00:06:55,880 Speaker 2: So yeah, yeah, the moral of the story here 120 00:06:56,040 --> 00:06:57,479 Speaker 2: is that naps are good? 121 00:06:57,640 --> 00:07:01,239 Speaker 1: Yes, and no one should talk to you. Right, yeah, yeah, 122 00:07:01,279 --> 00:07:04,919 Speaker 1: that's how I live my life. But yeah, so 123 00:07:05,040 --> 00:07:07,839 Speaker 1: Good's point. Anyway, he came up with this term of 124 00:07:07,880 --> 00:07:11,400 Speaker 1: the intelligence explosion, and it was this sort of 125 00:07:11,440 --> 00:07:14,360 Speaker 1: idea that we're going to reach a point where we 126 00:07:14,560 --> 00:07:18,320 Speaker 1: are increasing either our own intelligence or some sort of 127 00:07:18,400 --> 00:07:23,239 Speaker 1: artificial intelligence so far beyond what we are currently capable 128 00:07:23,240 --> 00:07:28,520 Speaker 1: of understanding, that life as we know it will change completely. 129 00:07:28,720 --> 00:07:32,080 Speaker 1: And because it's going to go beyond what we know 130 00:07:32,240 --> 00:07:34,880 Speaker 1: right now, there's no way to predict what our life 131 00:07:34,920 --> 00:07:38,160 Speaker 1: will be like, right, because it's beyond our, because it 132 00:07:38,240 --> 00:07:38,560 Speaker 1: is... 133 00:07:38,520 --> 00:07:41,200 Speaker 2: Yeah, by definition out of our comprehension. 134 00:07:41,280 --> 00:07:44,760 Speaker 1: Yes, as the Scots would say, it's beyond our ken. 135 00:07:46,720 --> 00:07:48,720 Speaker 2: Are we going to be doing accents for this episode? 136 00:07:48,720 --> 00:07:51,840 Speaker 1: So that was a terrible one. I actually regret doing 137 00:07:51,880 --> 00:07:55,520 Speaker 1: it right now. I already knew I couldn't do Scottish, 138 00:07:55,560 --> 00:07:58,840 Speaker 1: and yet there I went. Anyway, we're getting off track again.
Yeah, 139 00:07:59,000 --> 00:08:01,960 Speaker 1: So to kind of backtrack a bit before we 140 00:08:02,080 --> 00:08:06,840 Speaker 1: really get into the whole singularity discussion, that was just 141 00:08:07,200 --> 00:08:12,040 Speaker 1: a brief overview. A good foundation to start from is 142 00:08:12,160 --> 00:08:16,000 Speaker 1: the concept of Moore's law. You know, Gordon Moore, 143 00:08:16,320 --> 00:08:18,280 Speaker 1: who, by the way, was a co-founder of a 144 00:08:18,280 --> 00:08:23,400 Speaker 1: little company called Intel, originally observed back in nineteen 145 00:08:23,520 --> 00:08:26,520 Speaker 1: sixty five, in a paper whose title I'm 146 00:08:26,520 --> 00:08:28,760 Speaker 1: going to approximate, but it was called something like 147 00:08:28,840 --> 00:08:32,559 Speaker 1: Cramming More Components onto Integrated Circuits, something like that. 148 00:08:33,000 --> 00:08:35,680 Speaker 1: Cramming was definitely one of the words used, 149 00:08:36,600 --> 00:08:40,120 Speaker 1: and circuits probably was too. Anyway, he noticed that over 150 00:08:40,400 --> 00:08:43,800 Speaker 1: the course of, I think originally it was twelve months, 151 00:08:43,840 --> 00:08:48,360 Speaker 1: but today we consider it two years. 152 00:08:47,240 --> 00:08:50,400 Speaker 2: Eighteen to twenty-four months, I think, is the official unofficial figure. 153 00:08:50,000 --> 00:08:53,760 Speaker 1: Right, right, right. Yeah, that the number of discrete components 154 00:08:53,800 --> 00:08:58,000 Speaker 1: on a square inch of silicon wafer would double due to 155 00:08:58,200 --> 00:09:04,120 Speaker 1: improvements in manufacturing and efficiency. So that, in effect, what 156 00:09:04,200 --> 00:09:08,040 Speaker 1: this means to the layman is that our electronics, and 157 00:09:08,080 --> 00:09:12,160 Speaker 1: particularly our computers, get twice as powerful every two years. 158 00:09:12,240 --> 00:09:14,720 Speaker 1: So if you bought a computer in nineteen ninety eight 159 00:09:15,040 --> 00:09:17,280 Speaker 1: and then bought another computer in two thousand, in theory, 160 00:09:17,360 --> 00:09:19,800 Speaker 1: the computer in two thousand would be twice as powerful 161 00:09:19,800 --> 00:09:22,800 Speaker 1: as the one from nineteen ninety eight. This is exponential growth. 162 00:09:23,920 --> 00:09:28,600 Speaker 1: That's an important component, this idea of exponential growth, right, 163 00:09:29,240 --> 00:09:33,520 Speaker 1: And it goes without saying that if you continue on 164 00:09:33,559 --> 00:09:37,840 Speaker 1: this path, if this continues indefinitely, then, you know, 165 00:09:37,920 --> 00:09:41,480 Speaker 1: you quickly get to computers of almost unimaginable power just 166 00:09:41,559 --> 00:09:42,440 Speaker 1: a decade 167 00:09:42,120 --> 00:09:45,439 Speaker 2: out. Certainly, although I mean, I still don't really understand 168 00:09:45,440 --> 00:09:49,120 Speaker 2: what a gigabyte means, because when I first started using computers, 169 00:09:49,120 --> 00:09:50,960 Speaker 2: we were not counting in that. I mean, 170 00:09:51,000 --> 00:09:52,840 Speaker 2: I was still impressed by kilobytes at the time. 171 00:09:52,920 --> 00:09:55,520 Speaker 1: Oh yeah. Now, I remember the first time I got 172 00:09:56,080 --> 00:09:58,000 Speaker 1: a hard drive, I think it had like a two 173 00:09:58,120 --> 00:10:00,720 Speaker 1: hundred and fifty megabyte hard drive.
I thought, you're like, 174 00:10:00,960 --> 00:10:03,840 Speaker 1: who needs that much space? And granted, that's storage space 175 00:10:03,880 --> 00:10:07,760 Speaker 1: we're talking about, not even processing power, right? Absolutely. So, yeah, 176 00:10:07,800 --> 00:10:11,240 Speaker 1: it's one of those things where the older 177 00:10:11,280 --> 00:10:13,880 Speaker 1: you are, the more incredible today is, right, because you 178 00:10:13,920 --> 00:10:16,360 Speaker 1: start looking at computers and you think, I remember when 179 00:10:16,360 --> 00:10:19,479 Speaker 1: these things came out, and they were essentially the equivalent 180 00:10:19,520 --> 00:10:22,559 Speaker 1: of a really good desktop calculator, right. So, 181 00:10:23,320 --> 00:10:27,320 Speaker 1: Moore's law states that this advance will continue indefinitely 182 00:10:27,520 --> 00:10:30,920 Speaker 1: until we hit some sort of fundamental obstacle that we 183 00:10:31,080 --> 00:10:32,880 Speaker 1: just cannot engineer our way around. 184 00:10:32,960 --> 00:10:35,080 Speaker 2: Oh right, you know, and that's why it's 185 00:10:35,120 --> 00:10:37,600 Speaker 2: kind of in contention right now, because people are saying that, well, 186 00:10:37,640 --> 00:10:39,880 Speaker 2: there's only so much physical space that you can 187 00:10:39,880 --> 00:10:43,520 Speaker 2: fit onto with silicone. There's a physical limitation 188 00:10:43,600 --> 00:10:45,720 Speaker 2: to the material, there's only so much you 189 00:10:45,720 --> 00:10:47,480 Speaker 2: can do about it. And so does Moore's law still 190 00:10:47,520 --> 00:10:50,280 Speaker 2: apply if we're talking about other materials and whatnot, you 191 00:10:50,240 --> 00:10:52,880 Speaker 1: know. Right, and how small can you get before 192 00:10:53,000 --> 00:10:56,319 Speaker 1: you start to run into quantum effects that are impossible 193 00:10:56,320 --> 00:10:59,240 Speaker 1: to work around? Uh, and then do you change the 194 00:10:59,320 --> 00:11:02,440 Speaker 1: geometry of a chip? Do you go three dimensional instead 195 00:11:02,440 --> 00:11:04,800 Speaker 1: of two dimensional? Would that help? And yeah, there are 196 00:11:04,800 --> 00:11:07,440 Speaker 1: a lot of engineers working on this, and frankly, 197 00:11:08,040 --> 00:11:10,160 Speaker 1: pretty much every couple of years, someone says, all right, 198 00:11:10,520 --> 00:11:14,240 Speaker 1: this is the year Moore's law is going to end. It's over, 199 00:11:14,320 --> 00:11:17,319 Speaker 1: it's gone, it's done with. Five years later, it's still 200 00:11:17,320 --> 00:11:19,880 Speaker 1: going strong. Yeah. And then in the sixth year someone 201 00:11:19,880 --> 00:11:21,680 Speaker 1: else says Moore's law is gonna end. 202 00:11:21,679 --> 00:11:23,679 Speaker 2: It's a little bit of a self-fulfilling prophecy. I 203 00:11:23,679 --> 00:11:26,680 Speaker 2: think that a lot of companies attempt to 204 00:11:26,040 --> 00:11:28,520 Speaker 1: keep it going. To keep it going, oh sure, yeah, yeah, yeah. 205 00:11:28,600 --> 00:11:31,880 Speaker 1: I mean, no one wants to be the ones to say, uh, guys, 206 00:11:31,920 --> 00:11:35,439 Speaker 1: guess what, we can't keep up with Moore's law anymore. 207 00:11:35,480 --> 00:11:37,040 Speaker 1: No one wants to do that, so it is a 208 00:11:37,040 --> 00:11:37,760 Speaker 1: good motivator.
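A rough back-of-the-envelope sketch of the exponential growth being described, assuming an idealized clean doubling every two years (real chips have never tracked the curve this neatly):

```python
def relative_power(years: float, doubling_period: float = 2.0) -> float:
    """How many times more 'powerful' (transistor count, roughly) hardware is
    after the given number of years, assuming one doubling per period."""
    return 2 ** (years / doubling_period)

# Two years gives 2x, a decade gives 32x, two decades roughly 1,000x,
# which is why the curve runs away from intuition so quickly.
for years in (2, 10, 20):
    print(f"{years:>2} years -> {relative_power(years):,.0f}x")
```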
209 00:11:37,840 --> 00:11:39,880 Speaker 2: Also, if I can footnote myself real quick, I'm pretty 210 00:11:39,920 --> 00:11:42,240 Speaker 2: sure that I just pronounced silicon as silicone, 211 00:11:42,240 --> 00:11:43,560 Speaker 2: and I would like to state for 212 00:11:43,559 --> 00:11:45,480 Speaker 2: the record that I know that those are two different substances. 213 00:11:45,559 --> 00:11:47,680 Speaker 1: Okay, that's fair. Anyway, I was going to 214 00:11:47,679 --> 00:11:49,720 Speaker 1: ask you about it, but by the time you were 215 00:11:49,720 --> 00:11:52,920 Speaker 1: finished talking, I thought, let's just go. Yeah, that's cool. 216 00:11:53,679 --> 00:11:56,640 Speaker 1: It's all right. If you knew how many times I 217 00:11:56,720 --> 00:12:01,760 Speaker 1: have used that particular pronunciation to hilarious results. Excellent. So, 218 00:12:01,920 --> 00:12:04,880 Speaker 1: moving on with this whole idea about Moore's law, I mean, 219 00:12:04,920 --> 00:12:08,200 Speaker 1: the reason this plays into the singularity is, with the 220 00:12:08,280 --> 00:12:12,880 Speaker 1: technological advances, you start to be able to achieve pretty 221 00:12:12,880 --> 00:12:19,080 Speaker 1: incredible things, and even within one generation of Moore's law, 222 00:12:19,120 --> 00:12:22,240 Speaker 1: which is kind of a meaningless term. But let's say you 223 00:12:22,360 --> 00:12:25,040 Speaker 1: arbitrarily pick a date and then two years from that 224 00:12:25,120 --> 00:12:28,000 Speaker 1: date you look and see what's possible with the new technology, 225 00:12:28,440 --> 00:12:32,000 Speaker 1: and getting to twice as much power, however you want 226 00:12:32,000 --> 00:12:35,440 Speaker 1: to define it, doesn't necessarily mean that you've only doubled 227 00:12:35,640 --> 00:12:37,920 Speaker 1: the amount of things you can do with that power. 228 00:12:37,960 --> 00:12:42,440 Speaker 1: You may have limitless things you can do. So with 229 00:12:42,520 --> 00:12:46,360 Speaker 1: that idea, you're talking about being able to power through 230 00:12:46,440 --> 00:12:49,760 Speaker 1: problems way faster than you did before. And there's lots 231 00:12:49,760 --> 00:12:53,280 Speaker 1: of different ways of doing that. For example, grid computing. 232 00:12:53,559 --> 00:12:57,400 Speaker 1: Grid computing is when you are linking computers together to 233 00:12:57,480 --> 00:13:00,560 Speaker 1: work on a problem all at once. Now, this works really 234 00:13:00,559 --> 00:13:04,360 Speaker 1: well with certain problems, parallel problems we call them. These 235 00:13:04,400 --> 00:13:07,679 Speaker 1: are problems where there are lots of potential solutions and 236 00:13:07,760 --> 00:13:12,239 Speaker 1: each computer essentially is working on one set of potential solutions. 237 00:13:12,559 --> 00:13:14,560 Speaker 1: And that way you have all these different computers working 238 00:13:14,600 --> 00:13:17,120 Speaker 1: on it at the same time. It reduces the overall 239 00:13:17,200 --> 00:13:20,560 Speaker 1: time it takes to solve that parallel problem.
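Here is a toy sketch of that "parallel problem" idea: the space of candidate solutions gets chopped into independent slices, and each worker process stands in for one computer on the grid checking its own slice. The divisibility "problem" and all the numbers are made up purely for illustration.

```python
from concurrent.futures import ProcessPoolExecutor

def check_slice(candidates: range) -> list:
    """Stand-in for one grid node: test every candidate in its own slice.
    The 'hard problem' here is just finding multiples of 99,991."""
    return [n for n in candidates if n % 99_991 == 0]

if __name__ == "__main__":
    # Chop one big search space into four independent slices.
    slices = [range(start, start + 250_000) for start in range(1, 1_000_001, 250_000)]

    # Because no slice depends on any other, the workers never need to talk
    # to each other; that independence is what makes the problem "parallel."
    with ProcessPoolExecutor() as pool:
        results = pool.map(check_slice, slices)

    hits = sorted(n for chunk in results for n in chunk)
    print(hits)
```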
And so 240 00:13:21,400 --> 00:13:23,600 Speaker 1: like, if you've ever heard of anything like Folding at 241 00:13:23,600 --> 00:13:27,079 Speaker 1: Home or the SETI project, where you could dedicate your 242 00:13:27,120 --> 00:13:31,760 Speaker 1: computer's idle time. So the idle processes, the processes that 243 00:13:31,880 --> 00:13:34,440 Speaker 1: are not being used while you're surfing the web or 244 00:13:34,720 --> 00:13:40,240 Speaker 1: writing How the Singularity Works, or, I don't know, building 245 00:13:40,320 --> 00:13:46,000 Speaker 1: an architectural program in some sort of CAD application. Anything 246 00:13:46,000 --> 00:13:48,840 Speaker 1: that you're not using can be dedicated to one of 247 00:13:48,840 --> 00:13:53,400 Speaker 1: these projects. Same sort of idea, that you don't necessarily 248 00:13:53,400 --> 00:13:57,199 Speaker 1: have to build a supercomputer to solve complex problems if 249 00:13:57,200 --> 00:13:59,559 Speaker 1: you use a whole bunch of computers, a whole bunch of 250 00:13:59,559 --> 00:14:02,679 Speaker 1: small ones. The Large Hadron Collider does this, although they use 251 00:14:02,840 --> 00:14:05,439 Speaker 1: very nice advanced computers, but they do a lot of 252 00:14:05,480 --> 00:14:10,280 Speaker 1: grid computing as well. So just using those kinds of models, 253 00:14:10,320 --> 00:14:13,960 Speaker 1: we see that we're able to do much more sophisticated 254 00:14:13,960 --> 00:14:15,160 Speaker 1: things than we could otherwise. 255 00:14:15,240 --> 00:14:18,400 Speaker 2: Certainly. Yes, networks, as it turns out, 256 00:14:18,440 --> 00:14:19,160 Speaker 2: are pretty cool. 257 00:14:19,360 --> 00:14:21,640 Speaker 1: Yeah, and networks play a part in this idea of 258 00:14:21,640 --> 00:14:24,560 Speaker 1: the singularity. Actually, I guess now is a good time, 259 00:14:24,600 --> 00:14:28,960 Speaker 1: we'll kind of transition into Vernor Vinge's, and honestly, 260 00:14:29,040 --> 00:14:30,400 Speaker 1: I don't know how to say his last name. I 261 00:14:30,400 --> 00:14:33,080 Speaker 1: say Vinge, and it could end up being Vin-jee. But 262 00:14:33,560 --> 00:14:35,680 Speaker 1: I just went with what you said. So that's great, 263 00:14:35,800 --> 00:14:38,160 Speaker 1: that's fine. Let's do it. We'll say that Vinge 264 00:14:38,160 --> 00:14:43,160 Speaker 1: says everything is silicone. So Vernor, though, Vern, I 265 00:14:43,200 --> 00:14:50,040 Speaker 1: call him Vern. He suggested four different potential pathways that 266 00:14:50,160 --> 00:14:53,840 Speaker 1: humans could take, or really that the world could take, yes, 267 00:14:54,080 --> 00:14:57,800 Speaker 1: to arrive at the technological singularity. Okay, what are they? 268 00:14:58,080 --> 00:15:03,120 Speaker 1: The four ways are: we could develop a superhuman artificial intelligence, 269 00:15:03,920 --> 00:15:08,800 Speaker 1: so computers suddenly are able to think on a level 270 00:15:08,840 --> 00:15:11,760 Speaker 1: that's analogous to the way humans think and can do 271 00:15:11,800 --> 00:15:15,560 Speaker 1: it better than we can. Right, whether or not that means 272 00:15:15,680 --> 00:15:19,600 Speaker 1: computers are conscious, that's debatable. We'll get into that too. 273 00:15:20,480 --> 00:15:24,000 Speaker 1: Computer networks could somehow become self-aware. That's number two.
Okay, 274 00:15:24,480 --> 00:15:27,960 Speaker 1: So yes, Skynet. So like the grid computing we were 275 00:15:28,000 --> 00:15:32,520 Speaker 1: just talking about. Somehow, using these grid computers, the 276 00:15:32,560 --> 00:15:34,360 Speaker 1: network itself, 277 00:15:34,000 --> 00:15:37,200 Speaker 2: having enough cycles and enough pathways and enough loops back around, 278 00:15:37,320 --> 00:15:39,560 Speaker 2: it starts going like, hey, I recognize this, yeah, and 279 00:15:39,840 --> 00:15:40,840 Speaker 2: starts thinking about... 280 00:15:41,560 --> 00:15:45,720 Speaker 1: Like, thinking about IBM's Watson, but it's distributed across a network. 281 00:15:45,880 --> 00:15:48,560 Speaker 1: So you can think of the computers as all being 282 00:15:48,840 --> 00:15:53,880 Speaker 1: superpowerful neurons in a brain, and the network 283 00:15:53,920 --> 00:15:58,440 Speaker 1: is actually the neural pathways. And it's definitely a science 284 00:15:58,560 --> 00:16:01,360 Speaker 1: fiction-y way of looking at things. Doesn't mean it won't happen, 285 00:16:02,200 --> 00:16:04,840 Speaker 1: but stranger things have happened, my friends. It feels like a Matrix kind 286 00:16:04,840 --> 00:16:07,280 Speaker 1: of thing to me. Then we have the idea that 287 00:16:07,480 --> 00:16:13,560 Speaker 1: computer-human interfaces are so advanced and so intrinsically tied 288 00:16:13,640 --> 00:16:18,800 Speaker 1: to who we are, that humans themselves evolve beyond being human, 289 00:16:18,840 --> 00:16:22,200 Speaker 1: we become transhuman. Okay. So this is an idea 290 00:16:22,240 --> 00:16:25,440 Speaker 1: that we almost merge with computers, at least on some 291 00:16:25,560 --> 00:16:29,040 Speaker 2: level, via kind of nanobot technology, you know, stuff 292 00:16:29,080 --> 00:16:32,280 Speaker 2: running through our bloodstreams, stuff in our cells, yep. 293 00:16:32,440 --> 00:16:36,480 Speaker 1: Or we have just brain interfaces where our consciousness 294 00:16:37,000 --> 00:16:40,200 Speaker 1: is connected to it. So, for example, we might 295 00:16:40,240 --> 00:16:43,880 Speaker 1: have it where, instead of connecting to the internet via 296 00:16:44,120 --> 00:16:48,320 Speaker 1: some device like a smartphone or a computer device, yeah, 297 00:16:48,320 --> 00:16:52,160 Speaker 1: it's right there in our meat brains, so that, you know, 298 00:16:52,160 --> 00:16:54,760 Speaker 1: you're sitting there having a conversation with someone. Then you're like, oh, wait, 299 00:16:55,120 --> 00:16:57,240 Speaker 1: what movie was that guy in? Let me just look 300 00:16:57,320 --> 00:17:01,440 Speaker 1: up IMDb in my brain. And then you, you know, 301 00:17:01,520 --> 00:17:04,840 Speaker 1: depending on how good your connection is, which means, by 302 00:17:04,840 --> 00:17:07,520 Speaker 1: the way, if you are a journalist and you attend CES, 303 00:17:07,680 --> 00:17:10,560 Speaker 1: you will automatically be dumber, because all the 304 00:17:10,560 --> 00:17:12,920 Speaker 1: internet connectivity will be taken up, and so you'll be 305 00:17:12,960 --> 00:17:15,760 Speaker 1: sitting there trying to ask good questions and drool will 306 00:17:15,760 --> 00:17:17,840 Speaker 1: come out of your mouth, which to me is a 307 00:17:17,840 --> 00:17:18,680 Speaker 1: typical CES.
308 00:17:19,200 --> 00:17:21,680 Speaker 2: I can only assume that wireless technology 309 00:17:21,680 --> 00:17:24,320 Speaker 2: would advance also at this point, but one can only 310 00:17:24,320 --> 00:17:25,320 Speaker 2: hope, fingers crossed. 311 00:17:25,480 --> 00:17:27,879 Speaker 1: There are certain technologies that are not advancing at the 312 00:17:27,880 --> 00:17:30,440 Speaker 1: exponential rate of Moore's law, which is another problem. We'll 313 00:17:30,440 --> 00:17:33,320 Speaker 1: talk about that. Yeah. And then the fourth and final 314 00:17:33,800 --> 00:17:38,120 Speaker 1: path that Vernor had suggested the world may go down would 315 00:17:38,119 --> 00:17:42,080 Speaker 1: be that humans would advance so far in the biological sciences 316 00:17:42,400 --> 00:17:46,199 Speaker 1: that they would allow us to engineer human intelligence so 317 00:17:46,240 --> 00:17:49,200 Speaker 1: that we could make ourselves as smart as we wanted 318 00:17:49,240 --> 00:17:51,840 Speaker 1: to be. This is sort of that Gattaca future, where 319 00:17:52,040 --> 00:17:55,320 Speaker 1: we've got, well, another great documentary, where we 320 00:17:55,920 --> 00:18:01,120 Speaker 1: engineer ourselves to be super smart. Right. So those are 321 00:18:01,119 --> 00:18:04,560 Speaker 1: the four pathways: artificial intelligence, computer networks becoming self-aware, 322 00:18:04,960 --> 00:18:08,919 Speaker 1: computer-human interfaces becoming really, really awesome, or we have 323 00:18:09,000 --> 00:18:14,080 Speaker 1: biologically engineered human intelligence. And all four of these lead 324 00:18:14,160 --> 00:18:18,239 Speaker 1: to a similar outcome, which is this intelligence explosion. And 325 00:18:18,280 --> 00:18:22,760 Speaker 1: this is the idea that some form of superhuman intelligence 326 00:18:22,840 --> 00:18:26,480 Speaker 1: is created, either artificially or within ourselves, and that at 327 00:18:26,520 --> 00:18:29,680 Speaker 1: that point we will no longer be able to predict 328 00:18:29,720 --> 00:18:33,320 Speaker 1: what our world will be like, because, by definition, 329 00:18:33,359 --> 00:18:38,080 Speaker 1: we will have a superhuman intelligent entity involved. And because 330 00:18:38,080 --> 00:18:42,720 Speaker 1: that's superhuman, it's beyond our ability to predict. Right, which 331 00:18:42,760 --> 00:18:44,680 Speaker 1: is, you know, which... 332 00:18:44,440 --> 00:18:47,359 Speaker 2: Which makes thought experiments about it a little bit, uh... 333 00:18:49,280 --> 00:18:53,960 Speaker 1: Philosophical. Yeah, that's the kind way of putting it. Pointless 334 00:18:54,000 --> 00:18:56,399 Speaker 1: would be another way of putting it. Like, we could, 335 00:18:56,440 --> 00:18:59,359 Speaker 1: you know, sit there and spitball a 336 00:18:59,400 --> 00:19:02,920 Speaker 1: whole bunch of possible futures. But that's the thing, they're possible. 337 00:19:02,960 --> 00:19:05,600 Speaker 1: We don't know which one will come out. We don't 338 00:19:05,600 --> 00:19:08,800 Speaker 1: even know if these four pathways are inevitable. We have 339 00:19:08,880 --> 00:19:11,760 Speaker 1: futurists who truly believe that this is something that will 340 00:19:11,840 --> 00:19:14,920 Speaker 1: happen at some point. There are other people who are 341 00:19:15,680 --> 00:19:19,160 Speaker 1: more skeptical, but we'll talk about them in a bit.
342 00:19:19,280 --> 00:19:23,399 Speaker 1: So one of the outcomes that Vernor was talking about, 343 00:19:24,160 --> 00:19:27,919 Speaker 1: and it's a fairly popular one in futurist circles, is 344 00:19:27,960 --> 00:19:32,679 Speaker 1: the idea of the robo apocalypse. Essentially, right, this is 345 00:19:32,680 --> 00:19:36,439 Speaker 1: where you've got the humans are bad, destroy all humans idea. 346 00:19:38,119 --> 00:19:42,359 Speaker 1: Essentially the idea is that humans would become extinct, either by 347 00:19:42,400 --> 00:19:46,199 Speaker 1: definition, because we've evolved into something else, or because whatever 348 00:19:46,240 --> 00:19:49,520 Speaker 1: the superhuman intelligence is, it decides we are a problem. 349 00:19:49,600 --> 00:19:51,320 Speaker 2: Yeah, and a lot of futurists are a lot more 350 00:19:51,359 --> 00:19:53,679 Speaker 2: positive about that. They're more looking forward to it than 351 00:19:53,720 --> 00:19:55,520 Speaker 2: being scared of it. It's less of an oh no, 352 00:19:55,640 --> 00:19:58,120 Speaker 2: big scary robots are coming to take over our society, 353 00:19:58,240 --> 00:20:00,679 Speaker 2: and more of a robots are coming to take 354 00:20:00,720 --> 00:20:02,439 Speaker 2: over our society, like, free day! 355 00:20:02,600 --> 00:20:05,480 Speaker 1: Yeah, yeah, exactly. Yeah, I don't have to work anymore, 356 00:20:05,680 --> 00:20:09,320 Speaker 1: and I don't, because robots are supplying all the 357 00:20:09,320 --> 00:20:13,120 Speaker 1: things we need. There's no need for anyone to work anymore. 358 00:20:13,160 --> 00:20:16,120 Speaker 1: There's no need for money anymore, because the only reason 359 00:20:16,119 --> 00:20:18,000 Speaker 1: you need money is so you can buy stuff. But 360 00:20:18,040 --> 00:20:20,520 Speaker 1: if everything's free, then you don't need it. So it 361 00:20:20,520 --> 00:20:23,359 Speaker 1: becomes Star Trek and we all, you know, run around 362 00:20:23,400 --> 00:20:27,240 Speaker 1: in jumpsuits and, right, punch people. And if you're Kirk, 363 00:20:27,280 --> 00:20:30,080 Speaker 1: you make out a lot. I mean a lot. That 364 00:20:30,240 --> 00:20:35,960 Speaker 1: dude, every week. Picard and Riker, if you add them together, 365 00:20:36,520 --> 00:20:37,160 Speaker 1: make one 366 00:20:37,040 --> 00:20:39,399 Speaker 2: Kirk. And yes, in this documentary series. 367 00:20:39,440 --> 00:20:41,960 Speaker 1: Yeah, Star Trek. Yeah, I don't know about Archer 368 00:20:41,960 --> 00:20:44,720 Speaker 1: because I never watched Enterprise, so you guys have to 369 00:20:44,720 --> 00:20:46,840 Speaker 1: get back to me on that. Yeah, sorry, sorry about that. 370 00:20:46,840 --> 00:20:48,720 Speaker 2: It's also a gap in my personal understanding. 371 00:20:48,960 --> 00:20:51,119 Speaker 1: I just took one look at that decontamination chamber and 372 00:20:51,119 --> 00:20:55,480 Speaker 1: I said, yep, I'm out. Anyway. So that's Vernor Vinge. 373 00:20:55,880 --> 00:20:59,679 Speaker 1: He's sort of popularized this idea, but there 374 00:20:59,720 --> 00:21:02,080 Speaker 1: are other people whose names, I think, 375 00:21:02,119 --> 00:21:04,920 Speaker 1: are synonymous with it, and we will talk about them 376 00:21:05,480 --> 00:21:08,240 Speaker 1: in just a minute. And now back to the show.
377 00:21:09,040 --> 00:21:13,240 Speaker 1: So Vernor Vinge, again, very much associated with the idea 378 00:21:13,320 --> 00:21:16,399 Speaker 1: of the singularity. But there's another name that comes up 379 00:21:16,440 --> 00:21:18,280 Speaker 1: all the time, Ray Kurzweil. 380 00:21:18,720 --> 00:21:21,640 Speaker 2: Ray Kurzweil, and this is a fellow who has been 381 00:21:21,720 --> 00:21:25,520 Speaker 2: referred to in various circles as the Thomas Edison of 382 00:21:25,640 --> 00:21:30,639 Speaker 2: modern technology, or, perhaps more colorfully, the Willy Wonka 383 00:21:30,680 --> 00:21:33,000 Speaker 2: of technology. That was by Jeff Duncan of Digital Trends, 384 00:21:33,000 --> 00:21:35,800 Speaker 2: and I just wanted to give a shout-out because that was great. Nice. 385 00:21:36,119 --> 00:21:40,280 Speaker 1: But you get nothing. I shared a remix of Willy 386 00:21:40,280 --> 00:21:43,240 Speaker 1: Wonka earlier today and it's still playing through my head. 387 00:21:43,320 --> 00:21:45,679 Speaker 2: We're fans, we might be fans of the Gene Wilder 388 00:21:45,720 --> 00:21:48,639 Speaker 2: Willy Wonka. Everyone, homework assignment, go watch that. It has 389 00:21:48,720 --> 00:21:51,720 Speaker 2: nothing to do with the singularity, the singularity at all. 390 00:21:51,800 --> 00:21:54,919 Speaker 1: I don't know, there's some chocolate singularity in there. Chocolate Singularity, 391 00:21:54,960 --> 00:21:56,000 Speaker 1: I want to do an episode on that. 392 00:21:58,480 --> 00:22:00,320 Speaker 2: If I were better at cover band names, I totally 393 00:22:00,320 --> 00:22:01,680 Speaker 2: would have said something witty right there. 394 00:22:01,760 --> 00:22:03,639 Speaker 1: Yeah, all right, well, fair enough, we'll say it's the 395 00:22:03,720 --> 00:22:07,480 Speaker 1: Archies for Sugar, Sugar. Oh dear. Oh my goodness. 396 00:22:07,720 --> 00:22:12,120 Speaker 2: Okay. So Ray Kurzweil. Yeah, Ray Kurzweil is the kind 397 00:22:12,119 --> 00:22:14,080 Speaker 2: of cat who, you know, when he was in high 398 00:22:14,080 --> 00:22:17,760 Speaker 2: school, invented a computer program. And this is in the 399 00:22:17,840 --> 00:22:20,479 Speaker 2: mid nineteen sixties, this isn't like last year or something. 400 00:22:20,680 --> 00:22:23,480 Speaker 2: In the mid nineteen sixties, he created a computer program that 401 00:22:24,000 --> 00:22:26,800 Speaker 2: listened to classical music, found patterns in it, and then 402 00:22:26,880 --> 00:22:28,879 Speaker 2: created new classical music based on that. 403 00:22:29,640 --> 00:22:32,640 Speaker 1: So it was a computer that composed classical music, yes, following 404 00:22:32,680 --> 00:22:36,120 Speaker 1: the rules of classical music that other composers had created. Yes, 405 00:22:36,160 --> 00:22:38,480 Speaker 1: that's kind of cool. That's just something he did, 406 00:22:38,600 --> 00:22:43,480 Speaker 1: you know. And yeah, the dude's got credentials. Yeah. 407 00:22:43,960 --> 00:22:49,159 Speaker 2: He also kind of invented flatbed scanners, has done a 408 00:22:49,160 --> 00:22:52,760 Speaker 2: whole bunch of stuff in speech recognition, and... 409 00:22:54,160 --> 00:22:57,000 Speaker 1: Which, that's interesting, because we'll talk about that 410 00:22:57,040 --> 00:23:00,679 Speaker 1: in a second.
But one of Kurzweil's big points is 411 00:23:00,680 --> 00:23:03,800 Speaker 1: that he thinks that by, and this all depends upon 412 00:23:03,840 --> 00:23:08,240 Speaker 1: which interview you read of Kurzweil's, but in various interviews 413 00:23:08,280 --> 00:23:11,600 Speaker 1: he has said that, essentially, by twenty thirty, we will reach 414 00:23:11,600 --> 00:23:13,680 Speaker 1: a point where we will be able to make an 415 00:23:13,760 --> 00:23:19,080 Speaker 1: artificial brain. We'll have reverse engineered the brain, and we'll 416 00:23:19,080 --> 00:23:23,080 Speaker 1: be able to create an artificial one. And there's a 417 00:23:23,119 --> 00:23:27,879 Speaker 1: lot of debate in smarter circles than the ones I 418 00:23:27,960 --> 00:23:32,200 Speaker 1: move in. That's not a slap against my friends. They're 419 00:23:32,240 --> 00:23:36,159 Speaker 1: pretty bright, but none of us are neurologically gifted, 420 00:23:36,200 --> 00:23:39,879 Speaker 1: at that point. I include myself in that circle. So, 421 00:23:40,880 --> 00:23:44,280 Speaker 1: but there are some very bright people who debate about 422 00:23:44,280 --> 00:23:46,440 Speaker 1: this point, whether or not we'll be able by the 423 00:23:46,520 --> 00:23:49,399 Speaker 1: year twenty thirty to reverse engineer the brain and design 424 00:23:49,440 --> 00:23:52,680 Speaker 1: an artificial one. And I think the debate is not 425 00:23:52,960 --> 00:23:56,160 Speaker 1: so much on whether or not we'll have the technological 426 00:23:56,200 --> 00:24:02,119 Speaker 1: power necessary to simulate a brain. Sure, we can simulate brains 427 00:24:02,160 --> 00:24:04,679 Speaker 1: on a certain superficial level today. Well, I 428 00:24:04,680 --> 00:24:07,320 Speaker 2: mean, hypothetically, we could connect enough computers that we could 429 00:24:07,359 --> 00:24:07,840 Speaker 2: make it go. 430 00:24:08,840 --> 00:24:11,680 Speaker 1: I think, yeah, we could probably get the computer horsepower, 431 00:24:11,760 --> 00:24:14,879 Speaker 1: especially by twenty thirty, to simulate a human brain. The 432 00:24:15,320 --> 00:24:19,200 Speaker 1: question is whether we will understand the human brain enough 433 00:24:19,240 --> 00:24:23,119 Speaker 1: to do so. Exactly. So that's sort of where the 434 00:24:23,119 --> 00:24:25,960 Speaker 1: debate lies. It's not so much on the technological side 435 00:24:25,960 --> 00:24:28,679 Speaker 1: of things as it is the biological side of things, 436 00:24:29,240 --> 00:24:32,720 Speaker 1: which is kind of interesting. I've read a lot of 437 00:24:32,760 --> 00:24:37,280 Speaker 1: critics who have really jumped on Kurzweil for this. Particularly PZ 438 00:24:37,440 --> 00:24:45,199 Speaker 1: Myers has written some pretty, yeah, strongly worded 439 00:24:45,240 --> 00:24:49,919 Speaker 1: criticisms of Kurzweil's theories, saying that Kurzweil simply does not 440 00:24:50,040 --> 00:24:54,840 Speaker 1: understand neurological development and activities, and that, you know, the 441 00:24:55,359 --> 00:24:59,679 Speaker 1: interplay between the environment and the way our brains develop 442 00:24:59,720 --> 00:25:03,320 Speaker 1: over time, the, you know, nurture versus nature, all 443 00:25:03,359 --> 00:25:07,680 Speaker 1: of this stuff with the hormonal changes, electrochemical reactions.
444 00:25:07,080 --> 00:25:08,879 Speaker 2: Saying that there are so many little bits that make 445 00:25:08,960 --> 00:25:11,160 Speaker 2: up our brains, so many hormones, so many processes, and 446 00:25:11,400 --> 00:25:13,600 Speaker 2: we understand such a small fraction of what they do. 447 00:25:13,640 --> 00:25:15,560 Speaker 2: This is why a lot of psychiatric drugs, for example, 448 00:25:15,600 --> 00:25:17,320 Speaker 2: are kind of like, oh, well, we invented this thing, 449 00:25:17,440 --> 00:25:20,399 Speaker 2: and we guess it does this thing, right? Take it 450 00:25:20,440 --> 00:25:21,119 Speaker 2: and see what happens. 451 00:25:21,520 --> 00:25:24,320 Speaker 1: Yeah, we don't know. It tends to make 452 00:25:24,400 --> 00:25:28,720 Speaker 1: you happy. It also makes you perceive the color red 453 00:25:28,920 --> 00:25:32,400 Speaker 1: as having the smell of oranges. Like, you know, 454 00:25:32,400 --> 00:25:35,440 Speaker 1: we don't understand it fully. And in fact, 455 00:25:35,480 --> 00:25:38,560 Speaker 1: there are other people, like Steven Novella, who is, uh, 456 00:25:39,040 --> 00:25:41,800 Speaker 1: he's the author of the NeuroLogica blog, and he also 457 00:25:42,000 --> 00:25:45,920 Speaker 1: is a host on a wonderful podcast called The Skeptics' Guide 458 00:25:45,920 --> 00:25:47,919 Speaker 1: to the Universe. If you guys haven't listened to that, 459 00:25:47,960 --> 00:25:49,680 Speaker 1: you should try that out, especially if you 460 00:25:49,760 --> 00:25:54,520 Speaker 1: like skepticism and critical thinking. But he's a doctor 461 00:25:55,119 --> 00:26:00,399 Speaker 1: and a proponent of evidence-based medicine, and he talks 462 00:26:00,480 --> 00:26:04,760 Speaker 1: about how we don't know how much we don't know 463 00:26:05,040 --> 00:26:09,040 Speaker 1: about the brain. We have no way of knowing where 464 00:26:09,200 --> 00:26:12,320 Speaker 1: the endpoint is as far as the brain is concerned, 465 00:26:12,720 --> 00:26:16,880 Speaker 1: and therefore we cannot guess at how long it will 466 00:26:16,880 --> 00:26:19,240 Speaker 1: take us to reverse engineer it, simply because we don't 467 00:26:19,240 --> 00:26:20,360 Speaker 1: know where the finish line is. 468 00:26:20,680 --> 00:26:24,520 Speaker 2: Right, right. Yeah, Kurzweil has a new book, 469 00:26:24,520 --> 00:26:26,879 Speaker 2: new as of, we're recording this in January twenty thirteen. 470 00:26:26,920 --> 00:26:29,520 Speaker 2: It just came out in November of twenty twelve, called 471 00:26:29,800 --> 00:26:32,480 Speaker 2: How to Create a Mind: The Secret of Human Thought Revealed. 472 00:26:33,080 --> 00:26:36,679 Speaker 2: And in the book he theorizes that, okay, if you'll 473 00:26:36,680 --> 00:26:39,720 Speaker 2: follow me for a second, atoms are tiny bits of data, okay? 474 00:26:39,840 --> 00:26:44,000 Speaker 2: DNA is a form of a program. The nervous system 475 00:26:44,080 --> 00:26:48,520 Speaker 2: is a computer that coordinates bodily functions, and thought is 476 00:26:48,560 --> 00:26:51,280 Speaker 2: kind of simultaneously a program and the data that that 477 00:26:51,320 --> 00:26:52,399 Speaker 2: program contains. 478 00:26:54,160 --> 00:26:58,080 Speaker 1: Gotcha. See, now, this is another problem that some scientists have, yeah, 479 00:26:58,320 --> 00:27:03,280 Speaker 1: is reducing the human brain to the model of a computer.
480 00:27:03,640 --> 00:27:06,000 Speaker 2: Right, because it's, you know, 481 00:27:06,040 --> 00:27:10,359 Speaker 2: a very elegant, interesting proposition. Sure, and it's kind 482 00:27:10,359 --> 00:27:11,800 Speaker 2: of sexy like that, because you go like, oh, well, 483 00:27:11,840 --> 00:27:14,760 Speaker 2: that sort of makes sense, man. Like, let's 484 00:27:14,760 --> 00:27:16,400 Speaker 2: go get a pizza and talk about this more. 485 00:27:16,440 --> 00:27:19,400 Speaker 1: Yeah, let me get a program that will 486 00:27:19,400 --> 00:27:21,680 Speaker 1: allow me to suddenly know all kung fu. 487 00:27:22,400 --> 00:27:25,080 Speaker 2: Right, and when you're a programmer, that's a great plan. Yeah, 488 00:27:25,119 --> 00:27:28,120 Speaker 2: I mean, yeah, that sounds terrific. But yeah, 489 00:27:28,160 --> 00:27:34,040 Speaker 2: there's one specific guy I found. Jaron Lanier wrote a 490 00:27:34,280 --> 00:27:36,480 Speaker 2: terrific thing called One Half of a Manifesto, which is 491 00:27:36,480 --> 00:27:38,920 Speaker 2: a really entertaining read if you guys like this kind 492 00:27:38,960 --> 00:27:41,719 Speaker 2: of thing, where he was saying that what futurists are 493 00:27:41,720 --> 00:27:44,240 Speaker 2: talking about when they talk about this, the singularity, is 494 00:27:44,720 --> 00:27:49,439 Speaker 2: basically a religion. He was calling it cybernetic totalism, you know, 495 00:27:49,520 --> 00:27:53,000 Speaker 2: like a fanatic ideology. He compares it to Marxism 496 00:27:53,440 --> 00:27:57,720 Speaker 2: at some point. Interesting. Yeah, and he was saying that, 497 00:27:57,880 --> 00:28:01,800 Speaker 2: you know, this theory is a terrific theory 498 00:28:02,000 --> 00:28:05,680 Speaker 2: if you want to get into the philosophy of who 499 00:28:05,720 --> 00:28:08,439 Speaker 2: we are and what we do and what technology is. 500 00:28:09,040 --> 00:28:12,360 Speaker 2: But that, you know, cybernetic patterns aren't necessarily the best 501 00:28:12,359 --> 00:28:15,480 Speaker 2: way to understand reality, and that they're not necessarily the 502 00:28:15,520 --> 00:28:18,920 Speaker 2: best model for how people work, for how culture works, 503 00:28:18,920 --> 00:28:22,000 Speaker 2: for how intelligence works, and that saying so is a 504 00:28:22,600 --> 00:28:23,840 Speaker 2: gross oversimplification. 505 00:28:24,080 --> 00:28:26,479 Speaker 1: That's a good point, and we should also point out 506 00:28:26,520 --> 00:28:30,240 Speaker 1: that it all depends on how you define intelligence as well, 507 00:28:30,320 --> 00:28:34,200 Speaker 1: because Kurzweil himself has worded his own predictions in such 508 00:28:34,200 --> 00:28:38,560 Speaker 1: a way that, some would argue, Novella argues, for example, 509 00:28:39,040 --> 00:28:42,520 Speaker 1: that he has given himself enough room where he's going 510 00:28:42,560 --> 00:28:44,800 Speaker 1: to be right no matter what. Like saying that by 511 00:28:44,880 --> 00:28:48,040 Speaker 1: twenty thirty we will be able to reverse engineer basic 512 00:28:48,200 --> 00:28:51,160 Speaker 1: brain functions, and Novella says, well, technically you could argue 513 00:28:51,160 --> 00:28:54,520 Speaker 1: that now. So that kind of gives you a lot 514 00:28:54,520 --> 00:28:57,400 Speaker 1: of room, a little bit of a gimme there.
Yeah. 515 00:28:56,640 --> 00:29:00,120 Speaker 1: But whether or not it means total brain function, 516 00:29:00,200 --> 00:29:03,480 Speaker 1: that's a totally different question. And so the other 517 00:29:03,520 --> 00:29:07,480 Speaker 1: point is that we could theoretically create an artificial intelligence 518 00:29:07,480 --> 00:29:10,320 Speaker 1: that does not necessarily reverse engineer the brain. It doesn't 519 00:29:10,360 --> 00:29:14,200 Speaker 1: follow the human intelligence model. I mean, that's IBM's Watson again, 520 00:29:14,840 --> 00:29:19,040 Speaker 1: a good example of artificial intelligence that, you know, in 521 00:29:19,120 --> 00:29:21,760 Speaker 1: some ways it mimics the brain, because it kind of 522 00:29:21,800 --> 00:29:24,440 Speaker 1: has to. You know, we're coming at this, human beings 523 00:29:24,480 --> 00:29:27,840 Speaker 1: are the ones creating this technology, and so as human 524 00:29:27,880 --> 00:29:31,520 Speaker 1: beings creating this technology, it's going to follow the rules 525 00:29:31,520 --> 00:29:33,720 Speaker 1: as we understand them. So there's going to be 526 00:29:33,760 --> 00:29:37,280 Speaker 1: some mimicry there. Right. But IBM's Watson, you know, 527 00:29:37,360 --> 00:29:42,200 Speaker 1: you think about that. It doesn't really understand necessarily the 528 00:29:42,280 --> 00:29:45,120 Speaker 1: data that's passing through it. It's looking for the connections, 529 00:29:45,120 --> 00:29:45,840 Speaker 1: and making 530 00:29:45,840 --> 00:29:50,080 Speaker 2: it really savvy at making connections and recognizing patterns and 531 00:29:50,120 --> 00:29:52,040 Speaker 2: spitting out useful information. 532 00:29:52,240 --> 00:29:56,560 Speaker 1: Yeah, it's looking for whatever answer is most likely the 533 00:29:56,640 --> 00:29:59,720 Speaker 1: right one. It's all probability based, right. So, and if 534 00:29:59,720 --> 00:30:03,320 Speaker 1: it doesn't reach a certain threshold, it doesn't provide 535 00:30:03,320 --> 00:30:07,240 Speaker 1: the answer. So, arbitrarily speaking, I don't know what 536 00:30:07,280 --> 00:30:09,640 Speaker 1: the threshold is, so I'm just making up a number, eighty- 537 00:30:09,640 --> 00:30:12,040 Speaker 1: five percent. Let's say it has to be eighty-five 538 00:30:12,040 --> 00:30:14,680 Speaker 1: percent certain or higher for it to give that answer. 539 00:30:14,800 --> 00:30:17,800 Speaker 1: If the certainty falls below that threshold, 540 00:30:18,000 --> 00:30:21,320 Speaker 1: no answer is given. That's essentially how it worked when 541 00:30:21,440 --> 00:30:25,760 Speaker 1: Watson was on Jeopardy, right. It would analyze the 542 00:30:26,040 --> 00:30:30,840 Speaker 1: answer, in Jeopardy terms, and then come up with 543 00:30:31,080 --> 00:30:34,760 Speaker 1: what it thought was probably the most accurate question for 544 00:30:34,920 --> 00:30:40,400 Speaker 1: that answer, and occasionally it was wrong, to hilarious results. 545 00:30:41,120 --> 00:30:45,960 Speaker 1: But it did sort of seem to kind of mimic 546 00:30:46,000 --> 00:30:48,520 Speaker 1: the way humans think, at least on a superficial level. 547 00:30:48,920 --> 00:30:51,600 Speaker 2: And I mean, the thing about humans is that they're 548 00:30:51,680 --> 00:30:53,800 Speaker 2: wrong a lot more than, 549 00:30:53,800 --> 00:30:55,040 Speaker 2: what, that fifteen percent of the time.
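Watson's real pipeline is enormously more involved, but the thresholding idea just described fits in a few lines. The eighty-five percent figure is the host's made-up example, not IBM's actual number, and the candidate answers below are invented for illustration:

```python
def answer_or_abstain(candidates, threshold=0.85):
    """Return the highest-confidence candidate, or None (stay silent) if even
    the best guess falls below the confidence threshold."""
    best_answer, best_confidence = max(candidates.items(), key=lambda kv: kv[1])
    return best_answer if best_confidence >= threshold else None

# Confident enough to buzz in:
print(answer_or_abstain({"Who is Alan Turing?": 0.91, "Who is I. J. Good?": 0.42}))
# Best guess is only 60 percent, so it keeps quiet and returns None:
print(answer_or_abstain({"Who is Alan Turing?": 0.60, "Who is I. J. Good?": 0.42}))
```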
550 00:30:55,680 --> 00:30:57,880 Speaker 1: Yeah, it's, you know, we give 551 00:30:57,920 --> 00:31:00,480 Speaker 1: answers even if we're not eighty-five percent sure of a question. 552 00:31:01,760 --> 00:31:02,560 Speaker 1: I certainly do. 553 00:31:02,640 --> 00:31:05,360 Speaker 2: Because we all know from going to trivia nights. Yeah, 554 00:31:05,400 --> 00:31:07,560 Speaker 2: and there's, I've read a lot 555 00:31:07,880 --> 00:31:12,680 Speaker 2: online about arguments of how it's our deficiencies, our memory biases, 556 00:31:12,760 --> 00:31:16,960 Speaker 2: our irrational behavior, our weird hormonal stuff going on, 557 00:31:17,000 --> 00:31:19,840 Speaker 2: that make us human, and that you can't teach a 558 00:31:19,840 --> 00:31:21,120 Speaker 2: computer to be irrational. 559 00:31:21,480 --> 00:31:25,280 Speaker 1: That's true, although you can teach it to swear. You can. 560 00:31:25,480 --> 00:31:29,600 Speaker 1: We just read a story last week, yeah, where 561 00:31:29,920 --> 00:31:33,320 Speaker 1: IBM allowed Watson to read the Urban Dictionary, and then 562 00:31:33,600 --> 00:31:35,760 Speaker 1: Watson got a little bit of a potty mouth. 563 00:31:35,840 --> 00:31:37,160 Speaker 2: It got kind of fresh. 564 00:31:37,280 --> 00:31:40,160 Speaker 1: It did, it did. It started to say... uh, 565 00:31:40,200 --> 00:31:42,240 Speaker 1: oh, see, what was it? Oh, I'm going to say 566 00:31:42,240 --> 00:31:47,160 Speaker 1: something and it's going to be bleeped out, right, Tyler? 567 00:31:47,320 --> 00:31:52,360 Speaker 1: Tyler just said so. Uh. Anyway, so there's one point 568 00:31:52,400 --> 00:31:55,640 Speaker 1: where a researcher asked a question of Watson, and Watson 569 00:31:56,000 --> 00:32:01,200 Speaker 1: included within the answer the word [bleep]. So since that 570 00:32:01,280 --> 00:32:02,800 Speaker 1: was bleeped out, you probably don't know what it was, 571 00:32:02,840 --> 00:32:04,640 Speaker 1: so go look it up. It was funny. It was 572 00:32:04,640 --> 00:32:05,200 Speaker 1: really funny. 573 00:32:05,360 --> 00:32:07,640 Speaker 2: Yes. And then they basically nuked that part 574 00:32:07,680 --> 00:32:09,760 Speaker 2: of Watson from orbit. They were like, you know what, 575 00:32:09,880 --> 00:32:10,600 Speaker 2: never mind. 576 00:32:10,440 --> 00:32:13,080 Speaker 1: It was the only way to be sure. They wiped 577 00:32:13,080 --> 00:32:15,840 Speaker 1: out the Urban Dictionary from Watson's memory. They also said 578 00:32:15,880 --> 00:32:17,960 Speaker 1: that a very similar thing happened when they let Watson 579 00:32:18,000 --> 00:32:21,840 Speaker 1: read Wikipedia. Oh, no judgments here, just saying what 580 00:32:21,840 --> 00:32:25,800 Speaker 1: IBM said. Anyway. Again, the computer was unable to 581 00:32:25,880 --> 00:32:29,600 Speaker 1: determine when it was appropriate and what 582 00:32:29,600 --> 00:32:32,640 Speaker 1: the appropriate context was for dropping a swear word. Yeah, it 583 00:32:32,680 --> 00:32:35,080 Speaker 1: didn't know, so it just started to speak kind of 584 00:32:35,120 --> 00:32:38,680 Speaker 1: like my wife does. So yeah, I'm going 585 00:32:38,760 --> 00:32:42,400 Speaker 1: to pay for that later. So anyway, that's 586 00:32:42,400 --> 00:32:45,479 Speaker 1: an interesting point, though.
Again, you're showing how machine 587 00:32:45,480 --> 00:32:48,760 Speaker 1: intelligence and human intelligence are different, because the machine intelligence 588 00:32:48,760 --> 00:32:50,480 Speaker 1: doesn't have that context for sure. 589 00:32:50,520 --> 00:32:53,320 Speaker 2: And of course, you know, we're talking about science 590 00:32:53,480 --> 00:32:56,360 Speaker 2: fiction or science future, however you want to term it. 591 00:32:56,520 --> 00:32:59,560 Speaker 2: So, you know, we might very well come up 592 00:32:59,600 --> 00:33:03,160 Speaker 2: with a fancy little program script that lets 593 00:33:03,160 --> 00:33:06,000 Speaker 2: you introduce that kind of bias. But you know, yeah, 594 00:33:06,280 --> 00:33:08,400 Speaker 2: but again, from that documentary Star Trek, I mean, Data 595 00:33:08,440 --> 00:33:10,320 Speaker 2: never figured out those contractions. 596 00:33:09,800 --> 00:33:18,520 Speaker 1: That's true, that's true. Turing actually had a great mental exercise, 597 00:33:18,560 --> 00:33:21,400 Speaker 1: really, and it's called the Turing test, and this applies 598 00:33:21,440 --> 00:33:24,680 Speaker 1: to artificial intelligence. Turing's point, and we've talked about the 599 00:33:24,680 --> 00:33:27,240 Speaker 1: Turing test in previous episodes of Tech Stuff, but just 600 00:33:27,280 --> 00:33:30,520 Speaker 1: as a refresher, Turing had suggested that you could create 601 00:33:30,560 --> 00:33:33,920 Speaker 1: a test, and that if a machine could pass that 602 00:33:34,080 --> 00:33:37,800 Speaker 1: test at the same level as a human, in other words, 603 00:33:38,440 --> 00:33:40,920 Speaker 1: if you were unable to determine whether the person who 604 00:33:40,920 --> 00:33:43,640 Speaker 1: took that test was human or machine, the machine had 605 00:33:43,640 --> 00:33:48,640 Speaker 1: passed the Turing test and had essentially simulated human intelligence. 606 00:33:48,960 --> 00:33:51,760 Speaker 1: And it usually works as an interview, so you have 607 00:33:51,920 --> 00:33:55,800 Speaker 1: someone who's conducting an interview, and you have either 608 00:33:55,960 --> 00:33:59,520 Speaker 1: a machine answering or a human answering, and there's a 609 00:33:59,560 --> 00:34:01,960 Speaker 1: barrier up so that, of course, the person asking the 610 00:34:02,040 --> 00:34:04,400 Speaker 1: questions cannot see who is responding. And of course they're 611 00:34:04,440 --> 00:34:07,720 Speaker 1: responding through, you know, text usually, because if they're responding 612 00:34:07,760 --> 00:34:10,640 Speaker 1: through a voice and it sounds like "I think the answer 613 00:34:10,840 --> 00:34:13,080 Speaker 1: is..." you know, you'd be like, well, either it's 614 00:34:13,080 --> 00:34:15,400 Speaker 1: a robot or the most boring person in the world. 615 00:34:15,960 --> 00:34:18,440 Speaker 1: The idea being you would ask these questions over a 616 00:34:18,480 --> 00:34:23,160 Speaker 1: computer monitor, get text responses, and if you were unable 617 00:34:23,320 --> 00:34:28,080 Speaker 1: to say with a certain degree of accuracy whether or 618 00:34:28,120 --> 00:34:30,160 Speaker 1: not it was a machine or a person, then you 619 00:34:30,200 --> 00:34:32,839 Speaker 1: would say the machine passed the Turing test.
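[Editor's aside: here is a toy sketch in Python of the blind, text-only interview format just described. It is not a rigorous Turing test implementation, only an illustration of the setup: a judge types questions, a hidden responder that is randomly either a scripted bot or a human at the keyboard replies in text, and the judge then guesses which it was. The canned replies and the informal pass condition are invented for this example.]

```python
# A toy sketch of the imitation-game setup described above, not a real Turing test.
# The judge asks questions over text; the hidden responder is randomly a scripted
# "machine" or a human typing at the console; afterward the judge guesses which.

import random


def machine_reply(question: str) -> str:
    """Stand-in for a chatbot: returns a canned, vaguely relevant response."""
    canned = [
        "That's an interesting question. Could you rephrase it?",
        "I would say it depends on the circumstances.",
        "I'm not certain, but I lean toward yes.",
    ]
    return random.choice(canned)


def human_reply(question: str) -> str:
    """The hidden human types an answer at the console."""
    return input("(hidden human) your answer: ")


def run_interview(rounds: int = 3) -> None:
    # The judge never learns which responder was selected until after guessing.
    responder_is_machine = random.random() < 0.5
    respond = machine_reply if responder_is_machine else human_reply
    for _ in range(rounds):
        question = input("Judge, ask a question: ")
        print("Response:", respond(question))
    guess = input("Judge, was that a machine or a human? ").strip().lower()
    actual = "machine" if responder_is_machine else "human"
    print(f"It was actually a {actual}.")
    if actual == "machine" and guess != "machine":
        print("The machine was mistaken for a human, so it passed this very informal test.")


if __name__ == "__main__":
    run_interview()
```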
620 00:34:33,440 --> 00:34:36,480 Speaker 1: And you could argue, well, that could just mean that 621 00:34:36,520 --> 00:34:39,319 Speaker 1: the machine's very good at mimicking human intelligence; it does 622 00:34:39,360 --> 00:34:44,120 Speaker 1: not actually possess human intelligence. Turing's point is, does that matter? 623 00:34:44,239 --> 00:34:47,920 Speaker 1: Because I know that I am intelligent. I speak with 624 00:34:48,000 --> 00:34:51,000 Speaker 1: someone like Lauren, who I assume is also intelligent based 625 00:34:51,040 --> 00:34:53,960 Speaker 1: upon the responses she gives. But she could just be 626 00:34:54,160 --> 00:34:59,800 Speaker 1: simulating intelligence. However, I have already bestowed, in my mind, 627 00:35:00,280 --> 00:35:05,640 Speaker 1: the feature of intelligence upon Lauren, because what she does 628 00:35:05,920 --> 00:35:08,839 Speaker 1: is very much akin to what I do. So Turing said, 629 00:35:08,880 --> 00:35:11,879 Speaker 1: if you extend the courtesy to your fellow human being 630 00:35:11,920 --> 00:35:14,439 Speaker 1: that they are intelligent based on the fact that they 631 00:35:14,600 --> 00:35:16,840 Speaker 1: act like you do, why would you not do the 632 00:35:16,880 --> 00:35:20,160 Speaker 1: same thing for a machine? Does it matter if the 633 00:35:20,200 --> 00:35:23,879 Speaker 1: machine can actually think? If the machine simulates thought well 634 00:35:24,000 --> 00:35:26,200 Speaker 1: enough for it to pass as human, then you're giving 635 00:35:26,200 --> 00:35:28,799 Speaker 1: it the same benefit of the doubt that you give anyone else. 636 00:35:28,840 --> 00:35:31,080 Speaker 2: Right, and this is what a lot of science 637 00:35:31,080 --> 00:35:32,120 Speaker 2: fiction movies are about. 638 00:35:32,160 --> 00:35:33,920 Speaker 1: Actually, yeah, there's a lot of philosophy. 639 00:35:33,920 --> 00:35:36,240 Speaker 2: And yeah, a lot of philosophy, a lot of Isaac Asimov, 640 00:35:36,360 --> 00:35:39,160 Speaker 2: a lot of Blade Runner, and that's not an author. 641 00:35:39,239 --> 00:35:41,759 Speaker 1: Sorry, well no, but you know, Philip K. Dick, look 642 00:35:41,800 --> 00:35:44,680 Speaker 1: him up. So anyway, Do Androids Dream of 643 00:35:44,719 --> 00:35:47,840 Speaker 1: Electric Sheep? I won't spoil it for you. 644 00:35:49,160 --> 00:35:52,520 Speaker 1: So, to kind of wrap this all up, getting back 645 00:35:52,520 --> 00:35:58,440 Speaker 1: into the discussion of philosophy: very recently we 646 00:35:58,480 --> 00:36:02,239 Speaker 1: did a podcast about whether we are living in a computer simulation, right, right, 647 00:36:02,560 --> 00:36:05,600 Speaker 1: and that kind of plays into this idea of the singularity, 648 00:36:05,680 --> 00:36:09,520 Speaker 1: because that argument stated that if the singularity is in 649 00:36:09,560 --> 00:36:13,440 Speaker 1: fact possible, if it's inevitable, if we are going to 650 00:36:13,480 --> 00:36:16,480 Speaker 1: reach this level of transhumanism where we are no longer 651 00:36:16,920 --> 00:36:19,400 Speaker 1: able to really predict what the future will be like, 652 00:36:19,440 --> 00:36:23,080 Speaker 1: because it'll be beyond our understanding, then one thing we 653 00:36:23,120 --> 00:36:26,240 Speaker 1: would expect to do is create simulations of our past 654 00:36:26,360 --> 00:36:28,440 Speaker 1: to kind of study ourselves, sure, right.
655 00:36:28,640 --> 00:36:32,080 Speaker 2: And to see what happens, play around with variables, yeah. 656 00:36:32,239 --> 00:36:35,319 Speaker 1: Yeah, and if we're that advanced, we could, 657 00:36:35,400 --> 00:36:38,960 Speaker 1: in theory, create such a realistic simulation that the inhabitants 658 00:36:39,000 --> 00:36:42,799 Speaker 1: of that simulation would be incapable of knowing that they 659 00:36:42,800 --> 00:36:47,560 Speaker 1: were artificial and would be completely, you know, self aware 660 00:36:47,600 --> 00:36:51,600 Speaker 1: of themselves. You know, that was totally redundant, self aware, 661 00:36:53,360 --> 00:36:56,800 Speaker 1: but unable to know that they were a simulation. He 662 00:36:56,920 --> 00:36:59,200 Speaker 1: said that if those things are possible, then, 663 00:36:59,280 --> 00:37:03,960 Speaker 1: you know, the overwhelming probability 664 00:37:04,000 --> 00:37:05,920 Speaker 1: is that we are in a computer simulation. 665 00:37:06,000 --> 00:37:08,200 Speaker 2: In a computer simulation, yeah, right now. 666 00:37:08,400 --> 00:37:11,200 Speaker 1: Because, yeah, if that's what's gonna happen, then 667 00:37:11,239 --> 00:37:13,560 Speaker 1: there's no way of saying with certainty that we are 668 00:37:13,600 --> 00:37:17,520 Speaker 1: not in fact the product of that. And so, uh, 669 00:37:17,600 --> 00:37:20,040 Speaker 1: the point being not necessarily that we are in fact 670 00:37:20,080 --> 00:37:23,040 Speaker 1: living in a computer simulation, but that perhaps this singularity, 671 00:37:23,120 --> 00:37:28,840 Speaker 1: this transhumanism thing, might not be realistic. It might not 672 00:37:28,960 --> 00:37:31,839 Speaker 1: be the future that we're headed to. Maybe it ends 673 00:37:31,920 --> 00:37:34,840 Speaker 1: up being a pipe dream that's not really possible for 674 00:37:34,920 --> 00:37:38,759 Speaker 1: us to attain. Or maybe we'll wipe ourselves out through 675 00:37:38,840 --> 00:37:46,000 Speaker 1: some terrible war or catastrophic accident. Maybe we create a 676 00:37:46,040 --> 00:37:51,239 Speaker 1: biological entity that wipes us out, a la The Stand, or 677 00:37:51,320 --> 00:37:54,480 Speaker 1: we create a black hole at the LHC. Which, come on, 678 00:37:54,560 --> 00:37:57,719 Speaker 1: people, don't write me. I already know about that and 679 00:37:58,160 --> 00:38:01,360 Speaker 1: how tiny and almost non-existent they are because 680 00:38:01,360 --> 00:38:03,840 Speaker 1: they last for such a short time. But let's say that 681 00:38:03,880 --> 00:38:05,799 Speaker 1: it totally happened. There's that thing where you look at that one 682 00:38:05,840 --> 00:38:08,600 Speaker 1: website where the black hole forms in the parking lot 683 00:38:08,800 --> 00:38:11,560 Speaker 1: outside the LHC and you just see the whole picture 684 00:38:11,640 --> 00:38:19,799 Speaker 1: go... which is a funny video. Anyway, that argument plays back into this. 685 00:38:19,880 --> 00:38:22,640 Speaker 1: So I don't know. I don't know if we're going 686 00:38:22,640 --> 00:38:25,520 Speaker 1: to ever see a future where the singularity becomes a thing. 687 00:38:25,560 --> 00:38:27,640 Speaker 1: Oh, and we never really talked about it, but one 688 00:38:27,680 --> 00:38:31,600 Speaker 1: of the big points that Kurzweil really pushes in his 689 00:38:32,160 --> 00:38:36,000 Speaker 1: singularity talks is the idea of digital immortality, right, right.
690 00:38:35,880 --> 00:38:38,760 Speaker 2: And he's been obsessed with this, and obsessed is probably 691 00:38:38,760 --> 00:38:42,560 Speaker 2: a judgmental word, I apologize, but he's been very 692 00:38:42,560 --> 00:38:45,680 Speaker 2: focused on this concept. His father died when he was 693 00:38:45,840 --> 00:38:49,080 Speaker 2: about twenty four, and he's been exploring theories on life 694 00:38:49,080 --> 00:38:53,480 Speaker 2: extension ever since then, and supposedly takes all kinds of 695 00:38:53,480 --> 00:38:56,759 Speaker 2: supplements, and sells them as well, to extend life. Has 696 00:38:56,760 --> 00:38:58,680 Speaker 2: all kinds of health plans. 697 00:38:58,360 --> 00:39:02,480 Speaker 1: Yeah, dietary, exercise, all of that. The idea being that 698 00:39:03,080 --> 00:39:06,320 Speaker 1: if he can preserve his own life. 699 00:39:06,320 --> 00:39:08,600 Speaker 2: Last long enough that we hit the singularity, then he 700 00:39:08,640 --> 00:39:09,680 Speaker 2: can become immortal. 701 00:39:09,840 --> 00:39:14,160 Speaker 1: Right. And either that, or, you know, we attain immortality through 702 00:39:14,160 --> 00:39:17,920 Speaker 1: one of a thousand different ways. For example, we end 703 00:39:18,000 --> 00:39:21,160 Speaker 1: up uploading our own intelligence into the cloud, right, and 704 00:39:21,200 --> 00:39:23,719 Speaker 1: then we become part of a group consciousness, so we 705 00:39:23,800 --> 00:39:27,040 Speaker 1: are no longer really individuals. Or we merge with computers 706 00:39:27,080 --> 00:39:29,840 Speaker 1: in some other way so that we are technically immortal 707 00:39:29,880 --> 00:39:33,520 Speaker 1: that way. Or we just conquer the genes that 708 00:39:34,120 --> 00:39:37,040 Speaker 1: guide the aging process and we stop it, and we 709 00:39:37,040 --> 00:39:37,800 Speaker 1: stop disease. 710 00:39:37,880 --> 00:39:39,920 Speaker 2: You know, like in transmit, you just take 711 00:39:39,960 --> 00:39:42,040 Speaker 2: a cancer pill and then you don't get cancer, because 712 00:39:42,040 --> 00:39:42,640 Speaker 2: that's what you do. 713 00:39:42,880 --> 00:39:46,560 Speaker 1: Yeah. So again, the singularity. That's kind of why I 714 00:39:46,600 --> 00:39:48,359 Speaker 1: think a lot of critics also point to it as 715 00:39:48,400 --> 00:39:50,080 Speaker 1: being more of a religion, because it's kind of this 716 00:39:50,200 --> 00:39:54,040 Speaker 1: sort of utopian pipe dream in their minds. 717 00:39:54,239 --> 00:39:57,879 Speaker 2: There's the former CEO of Lotus, Mitch Kapor, Kapper, I'm 718 00:39:57,880 --> 00:40:00,440 Speaker 2: not sure how you say it, who once called it 719 00:40:00,480 --> 00:40:03,320 Speaker 2: the intelligent design for the IQ one forty people. 720 00:40:03,600 --> 00:40:07,880 Speaker 1: Yeah, ouch, ouch. Well, meanwhile, Kurzweil's kind of laughing all 721 00:40:07,880 --> 00:40:09,880 Speaker 1: the way to the bank. I hear that a company 722 00:40:09,920 --> 00:40:12,759 Speaker 1: that rhymes with Shmoogle hired him. Little, little people. 723 00:40:12,800 --> 00:40:15,120 Speaker 2: I mean, you probably wouldn't have heard of them, yeah, 724 00:40:15,160 --> 00:40:18,000 Speaker 2: but yeah, they just hired him on to be, uh, 725 00:40:18,040 --> 00:40:19,960 Speaker 2: I have it in my notes, the official title, I 726 00:40:20,000 --> 00:40:23,319 Speaker 2: think it's the director of engineering.
Yeah, a director of 727 00:40:23,360 --> 00:40:24,239 Speaker 2: engineering over there. 728 00:40:24,400 --> 00:40:27,000 Speaker 1: Yeah, they get some big names. I mean, 729 00:40:27,000 --> 00:40:30,040 Speaker 1: they had Vint Cerf as the chief evangelist, and of 730 00:40:30,040 --> 00:40:32,200 Speaker 1: course he was one of the fathers of the Internet. 731 00:40:32,320 --> 00:40:36,359 Speaker 1: So Google's got a... they're known for getting some 732 00:40:36,480 --> 00:40:41,120 Speaker 1: really smart people. And to be fair, while the singularity 733 00:40:41,160 --> 00:40:44,920 Speaker 1: may or may not ever happen, I think it's important 734 00:40:44,920 --> 00:40:48,839 Speaker 1: that we have optimists in the field of technology who 735 00:40:48,840 --> 00:40:53,200 Speaker 1: are really pushing for our development to try and make 736 00:40:53,239 --> 00:40:55,840 Speaker 1: the world a better place for people. 737 00:40:55,920 --> 00:40:58,279 Speaker 2: Oh, absolutely. So even 738 00:40:58,320 --> 00:41:00,440 Speaker 2: if we never reach the point of digital immortality in 739 00:41:00,480 --> 00:41:03,480 Speaker 2: our lifetimes or any others, I mean, if someone 740 00:41:03,520 --> 00:41:05,799 Speaker 2: wants to think so big that they want to put 741 00:41:05,840 --> 00:41:08,840 Speaker 2: in nanobots to make my body awesomer, I mean, 742 00:41:09,160 --> 00:41:12,040 Speaker 2: not that... that came out possibly crude. 743 00:41:12,160 --> 00:41:14,120 Speaker 2: It mostly means that I don't get cancer and die, 744 00:41:15,360 --> 00:41:18,960 Speaker 2: that kind of stuff. That's terrific. I can't argue 745 00:41:18,960 --> 00:41:19,680 Speaker 2: with any part of that. 746 00:41:19,840 --> 00:41:21,680 Speaker 1: Yeah, I'm going to be on video so much this 747 00:41:21,800 --> 00:41:24,280 Speaker 1: year that I definitely need my body to be awesomer, 748 00:41:25,160 --> 00:41:28,200 Speaker 1: so I'm all for that. Well, either way. 749 00:41:28,440 --> 00:41:31,400 Speaker 2: Yes. And Google, you know, Google looks forward so 750 00:41:31,520 --> 00:41:36,040 Speaker 2: much to augmented reality... augmented reality, I'm sorry, I can't 751 00:41:36,040 --> 00:41:39,160 Speaker 2: pronounce anything today, I am on a non-roll. It's okay. 752 00:41:40,000 --> 00:41:42,360 Speaker 2: And the Internet of Things and all of that wonderful 753 00:41:42,360 --> 00:41:44,640 Speaker 2: future tech, that it seems like a terrific fit. 754 00:41:45,000 --> 00:41:47,960 Speaker 1: Yeah. Yeah, so we'll see how it goes. I mean, obviously, 755 00:41:48,000 --> 00:41:50,000 Speaker 1: the nice thing about this is that all we have 756 00:41:50,040 --> 00:41:52,960 Speaker 1: to do is live long enough to see it happen 757 00:41:53,200 --> 00:41:56,680 Speaker 1: or not happen. And most predictions have the singularity hitting 758 00:41:56,800 --> 00:42:01,000 Speaker 1: somewhere between twenty thirty and twenty fifty. Yeah, it all 759 00:42:01,040 --> 00:42:04,319 Speaker 1: depends upon which futurist you're asking. And also, I think 760 00:42:04,320 --> 00:42:06,120 Speaker 1: it's one of those 761 00:42:06,520 --> 00:42:10,640 Speaker 1: rolling goalposts as well. You know how certain technologies are 762 00:42:10,640 --> 00:42:13,200 Speaker 1: always twenty years away, or five years away, or ten 763 00:42:13,280 --> 00:42:16,839 Speaker 1: years away. So we'll see.
Maybe by twenty twenty we'll 764 00:42:16,840 --> 00:42:19,520 Speaker 1: be saying, all right, we've revised our figures. 765 00:42:19,800 --> 00:42:22,799 Speaker 2: By the twenty seventies, definitely. But who knows. 766 00:42:22,880 --> 00:42:26,280 Speaker 1: We'll see. Guys, if you have any suggestions for future 767 00:42:26,280 --> 00:42:28,680 Speaker 1: episodes of tech Stuff, well, here's what you can do. 768 00:42:29,000 --> 00:42:30,719 Speaker 1: You can write us an email. And a lot of 769 00:42:30,760 --> 00:42:33,200 Speaker 1: people have been asking about our email address. I do 770 00:42:33,320 --> 00:42:36,040 Speaker 1: say that every episode, but in case you've missed it, 771 00:42:36,320 --> 00:42:41,160 Speaker 1: listen carefully. Our email address is tech Stuff at Discovery 772 00:42:41,320 --> 00:42:45,000 Speaker 1: dot com. Send an email. I'll prove it by writing back, 773 00:42:45,800 --> 00:42:48,120 Speaker 1: or drop us a line on Facebook or Twitter. Our 774 00:42:48,200 --> 00:42:51,840 Speaker 1: handle at both of those is tech Stuff hsw, 775 00:42:51,920 --> 00:42:54,960 Speaker 1: and Lauren and I will talk to you again in 776 00:42:55,000 --> 00:43:02,240 Speaker 1: the future. And that was the classic episode tech Stuff 777 00:43:02,400 --> 00:43:05,960 Speaker 1: Enters the Singularity, way back from twenty thirteen, more than 778 00:43:06,000 --> 00:43:10,560 Speaker 1: a decade ago. Holy cats, I have been doing this 779 00:43:10,719 --> 00:43:14,560 Speaker 1: show for so long, because Lauren, of course, was my 780 00:43:14,719 --> 00:43:18,320 Speaker 1: second co host. I had already done a couple hundred 781 00:43:18,360 --> 00:43:22,600 Speaker 1: shows with a different co host. Wow, that really does hit 782 00:43:22,640 --> 00:43:26,279 Speaker 1: me right in the brain. So I hope you all 783 00:43:26,400 --> 00:43:29,880 Speaker 1: enjoyed that. I'm looking forward to seeing what comes next. 784 00:43:30,000 --> 00:43:33,359 Speaker 1: I don't think we're gonna hit the Singularity this year. 785 00:43:34,000 --> 00:43:36,000 Speaker 1: Maybe I should have put that as one of my predictions, 786 00:43:36,200 --> 00:43:40,400 Speaker 1: but who knows. Maybe OpenAI is gonna create the 787 00:43:40,440 --> 00:43:45,800 Speaker 1: next generative chat bot and it'll program the Skynet- 788 00:43:46,040 --> 00:43:48,799 Speaker 1: like system that will bring us into the Singularity kicking 789 00:43:48,880 --> 00:43:52,399 Speaker 1: and screaming, or if you're me, I'm probably already kicking 790 00:43:52,440 --> 00:43:57,120 Speaker 1: and screaming, just because I'm grouchy. Anyway, I hope 791 00:43:57,160 --> 00:43:58,920 Speaker 1: you're all well. I hope you had a happy and 792 00:43:59,000 --> 00:44:02,560 Speaker 1: safe New Year, and I'll talk to you again really soon. 793 00:44:08,440 --> 00:44:13,240 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 794 00:44:13,560 --> 00:44:17,239 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 795 00:44:17,320 --> 00:44:18,400 Speaker 1: to your favorite shows.