Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and love all things tech, and I also love speculative fiction, and I really love science fiction. No big shock there, I'm sure. I love stories that involve cool and futuristic technology, and, you know, stories that play out in such a way that, if done well, say something interesting about what it is to be human. When you get down to it, a lot of science fiction isn't so much about the pew-pew lasers and the zoom-zoom spaceships. It's really more about stuff like resilience and hope and hubris and enmity and other very, you know, human qualities. So when you strip it all away, a good science fiction story should tell us more about, you know, humanity.

Speaker 1: Now, when you create a science fiction story, you have to set it somewhere and somewhen. And that's what brings us to today's topic. I wanted to talk about some science fiction stories that were set in the years two thousand to two thousand twenty-one, so over the last twenty-one years. Not necessarily written during that time, but set in that time, like this would be the futuristic setting, because sometimes it's fun to go back and talk about some of the wild things that writers imagined happening by now. And sometimes they predict stuff that comes true in some form or another. A lot of times they predict stuff that we just don't have yet. And I don't mean for this to be a "Hey, isn't it fun that they thought we would be using holograms by now?" thing. Although there are a couple of instances of that in this episode, it's not meant to poke fun at the futurists who were imagining the world. In some cases, the writer clearly was thinking: what is a year that's far enough out from today that this could seem plausible?
Speaker 1: In other cases, the writer might just have arbitrarily picked a year, because the year itself isn't really what's important. Like, you might say "in the distant year of two thousand nine," but what you really just mean is "at a time that's in the future," right? So in some cases, I think with a lot of science fiction stories, when they have a year, you might as well just replace that year with, you know, "in the future." But we're gonna ignore that, and we're gonna poke fun at some movies and some of their predictions, and not just movies, other stories too. So I'm gonna look at a lot of stories that were produced before the year the story is set in. So, in other words, these stories have to, at least on some level, make predictions about the future. A lot of science fiction films are set the same year they came out. A lot of science fiction stories are set the same year they come out, so those become more like alternate presents rather than futuristic. So I'm ignoring all those, and I'm really focusing on ones that are more about projecting into the future. This also is not going to be an exhaustive list, either, because there have been a lot of stories that have been set between two thousand and two thousand twenty-one. Not as many as I thought there would be, or at least, based upon my research, not as many as I thought there would be, but there's still quite a few. And I'm gonna try and stick with ones that are either really famous or just super fun examples. So I may very well skip over some of your favorites. For that, I apologize; I cannot cover them all. But let's get to it.

Speaker 1: Um, and also, before I get into the twenty-first century, let's have a couple of honorable mentions in here. For example, George Orwell's Nineteen Eighty-Four is a phenomenal novel. It describes the world as being governed by giant totalitarian authorities who essentially have divvied up the world.
Speaker 1: And it's a world that's constantly under surveillance, like the government is looking at everything that's going on, and the government tries to regulate not just what people are allowed to do, but even so much as how people are allowed to think. It's a pretty terrifying story, and it's one I think we can still recognize as being relevant today, whether it's in the form of authoritarian governments. I mean, you can look at some of China's initiatives and say, like, oh, I see some similarities there. You can also arguably look at the UK and the United States and say there are some elements there too. We also have seen the rise of mega-powerful corporations, which in some ways have taken on some of the aspects that Orwell had attributed to governments. Like, we're seeing companies take on that kind of role and take on that kind of power. We're almost four decades out from the setting of that novel, and, you know, we haven't quite reached the level of dystopia described in the book, but you can make a decent argument that a lot of the elements that Orwell described have kind of crept into our actual world today. And granted, you know, some of those were things that he was observing at the time he wrote it, which was well before nineteen eighty-four.

Speaker 1: Another honorable mention I have to add in here is the Star Trek original series episode "Space Seed." The episode first aired in nineteen sixty-seven, and the year in which the episode happens is supposed to be twenty-two sixty-seven. Uh, but it references something that should have happened in our past. It would have happened between those two years. This is the episode that introduced the iconic character Khan Noonien Singh.
Speaker 1: So in Star Trek lore, the world plunged into a global conflict in the nineteen nineties called the Eugenics Wars, and Khan was one of several people who were part of a long-term experiment that was focused on selective breeding of humans, the idea being that this is a process that would eventually produce exceptional human beings. Eugenics is a real, horrifying thing. It's also an incredibly racist thing. It's... it's bad, y'all. But anyway, this was sort of the thing that Star Trek was pitching it as. So Khan is supposed to be stronger and more intelligent than your average person. In later Star Trek properties, his backstory gets tweaked a little bit, so he's actually the result not just of selective breeding but of genetic engineering, because people recognized that any sort of selective breeding process would not progress to a point where you would have, you know, people with obviously superior human qualities within a couple of generations. That would take a very long time to really do. And again, eugenics is horrifyingly awful.

Speaker 1: Anyway, within Star Trek lore, Khan and his genetically superior colleagues managed to conquer about a third of the Earth before they were defeated, and then Khan's crew escaped Earth. Most of the genetically engineered humans were captured and sentenced to death in the nineties, but Khan and his crew escape on a ship called the Botany Bay, and they go into suspended animation and just go on a trip out into space to escape their capture. And then a couple hundred years later, the Enterprise happens across them. Now, obviously a lot of the stuff that was predicted back in nineteen sixty-seven never happened. There have been no Eugenics Wars, thankfully. We don't have the ability to genetically modify humans to the point of guaranteeing that they are going to be stronger and more intelligent. But there are ongoing discussions in scientific circles about the ethics of genetic modification in general.
Speaker 1: Like, obviously we've been doing a lot of work in genetic modification. The development of CRISPR is a great example, but we're still very much at the early stages of the science when it comes to genetically modifying things. And more than that, uh, we still have these big ethical debates on at what point do you say, this is too far. Is it okay to do genetic modification if you are trying to make certain that someone is not born with a condition that would otherwise inhibit them or negatively impact their quality of life? Is it okay if you go beyond that, if you say, well, let's make sure that they have certain qualities, like blue eyes? Or you go even further than that and say, let's make them stronger and more intelligent? This is an area that is thoroughly investigated by science fiction, and it's one where we have real discussions going on today.

Speaker 1: Uh, as for other science stuff that happens in that story, obviously we don't have a way to put people in suspended animation, certainly not with any way of halting aging completely and yet still being able to revive the person with no adverse effects. We don't have that capability. Uh, this is also an area of intense interest. I mean, cryonics is a real thing in the sense that there are people who are working on it. Often the push for cryonics comes from rich people who are terrified of dying. But yeah, we haven't cracked that one yet either. Beyond all that, we don't actually have a spacecraft that humans could crew that is meant to escape our solar system, let alone just wander the galaxy. So yes, Space Seed's plot depends upon stuff that just didn't happen, and much of it couldn't have happened.
Speaker 1: Clearly the writers didn't... you know, they needed to have Khan's backstory happen sometime between the present day of nineteen sixty-seven and the Star Trek date of twenty-two sixty-seven. Plus it needed to be far enough back in the history of Star Trek's world so that the average person in Star Trek wouldn't immediately recognize the name of Khan and know who that is just on the face of it. So I get it. But it's a great example of predictions made in science fiction that just didn't happen.

Speaker 1: All right, let's talk about some of the stuff that was predicted to happen within the last twenty-one years. And for the year two thousand, I would like to submit the film Death Race 2000. This one was made back in nineteen seventy-five. This was a Roger Corman production, and Corman is one of those folks in Hollywood who really was able to stretch a dollar as far as it could go. Typically his films are really low-budget affairs, but he's been behind some pretty fun movies. Like, a Roger Corman film is probably gonna be super low budget, but that doesn't mean it's not going to be good, or at least entertaining. I consider Death Race 2000 to be a pretty entertaining movie. There's a nasty satirical edge to the movie, a really nasty one, but I find it fascinating. So in the lore of the movie, the world went through a massive economic crash in nineteen seventy-nine. Remember, this film came out in seventy-five. That economic crash then made the United States unstable, and at some point the military overthrew the government and replaced the democratically elected representatives with a totalitarian military regime. The government also co-opted religion, so the state and the church are united. This was an effort to consolidate power and to keep the huddled masses distracted. Uh, the government has this incredibly violent coast-to-coast race called the Annual Transcontinental Road Race.
Speaker 1: It's meant to entertain people and redirect their attention so they, you know, they don't realize how bad things are and they don't get smart ideas like demanding rights and stuff. So the race has this brutal scoring system. Not only do you get points for, you know, being fast, you also get points for killing innocent people along the way. Like, if you run down a pedestrian, those are extra points. So this is a way for an authoritarian government to retain control of a population, and also, you know, it's a message saying no one is safe. This is a really common theme in stories that feature dystopian futures. I'm sure you can think of some examples yourself, but you can find it in movies like Rollerball, uh, The Hunger Games, Battle Royale, The Running Man, and many more.

Speaker 1: Um, one of the drivers in the Death Race 2000 movie is the government-backed Frankenstein, who is said to at least be part machine, due to having sustained numerous injuries in past competitions, and, while he's at least partly machine, he's also practically unkillable. So we've got some cyborg stuff going on here. Except, spoiler alert, if you've not watched Death Race 2000 and you really want to, you probably want to skip this next bit. But it turns out that Frankenstein isn't just one man. Instead, the government finds someone to pose as Frankenstein for each race, and should one of them die, they just get another person to pose as Frankenstein for the next one. So Frankenstein's outfit is a disguise, in other words, and it creates the illusion that the government has access to indestructible agents. It's another way of sending the message of "we're more powerful than you are, so don't bother resisting." Now, I won't spoil how the film ends, except to say there are a lot of kind of double and triple crosses going on and stuff. It's kind of bonkers. But the tech in Death Race 2000 isn't particularly outlandish.
Speaker 1: It's more about how society has devolved into this bloodthirsty and yet suppressed mob. And I'm sure, depending upon your own, you know, ideology, you might see some parallels in the real world.

Speaker 1: Moving on to two thousand one, we have... hey, look, it's 2001: A Space Odyssey. The novel and the movie are both considered classics, but they also project a level of technological sophistication that we have not yet reached, not even in twenty twenty-one. Uh, we could probably forgive some of that, because the stories that influenced 2001 came out of the nineteen fifties and nineteen sixties. The movie version premiered in nineteen sixty-eight, so at that time we were still just on the verge of going to the Moon for the first time. It probably seemed inevitable that we would continue to make incredible progress, right? Like, we were already, within the span of a decade, making incredible strides toward the stars, so I guess it seemed kind of natural that we would continue that momentum. So in the story of 2001, there's a lunar base, and that's something we obviously don't have yet. NASA's Artemis program, in which the agency plans to return and send crews to the Moon for the first time since the early nineteen seventies, has hit some snags. Currently, the agency says that a lunar landing will now take place in twenty twenty-five at the earliest. The original plan was to get there by twenty twenty-four, but delays in various aspects of the mission have made that impossible. So no lunar base for us just yet. And there are those who question whether a lunar base is even a logical stepping stone, but it exists in the film anyway. There's also a space station in the movie. This is obviously something that we have achieved, but the story also has astronauts traveling to Jupiter via spaceship, with some passengers in suspended animation.
Speaker 1: And we've already covered the suspended animation thing, but we obviously also haven't made any missions to go further out than our own Moon, and we haven't done that since the nineteen seventies, at least not with humans on board. We have sent, you know, unmanned spacecraft much further out, but not with a human crew. In some sections of 2001, the Jupiter mission section, uh, spacecraft generate artificial gravity by spinning. So the spacecraft acts like a centrifuge, and when you've got a rotating mass, there's this pseudo-force that we call centrifugal force, even though it's not a, quote unquote, real force. Uh, this is directed radially outward from the axis of rotation. So if you imagine a bicycle wheel that has a pole that goes right through the middle of the wheel, and it spins around the pole, the centrifugal force pushes outward along the circumference of the wheel, as this is ninety degrees out from the axis of rotation. So if you had a spacecraft shaped like a wheel and it's rotating, people could walk along the, uh, well, the inner part of the wheel, but out at the outer edge of it, if that makes sense. Like, if you were inside a bicycle wheel, your feet would be on the rubber that would be on the outside, and you could walk around that way. Like, you would have artificial gravity that way. The amount of artificial gravity would be dependent upon the speed of rotation as well as how large the spacecraft was. Uh, there are some problems doing this, however, because the magnitude of centrifugal force depends partly upon the distance from that axis of rotation, and that would mean that our heads, which are clearly a little closer to the axis, would be experiencing a different amount of force than our feet would. So it would work from a physics standpoint, but you might not feel so great if you were to actually try it out in practice.
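To put some rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. The habitat radii and the standing height are hypothetical values chosen purely for illustration; the underlying relation is just the standard spin-gravity formula, where the apparent acceleration equals the square of the angular velocity times the distance from the axis.

```python
import math

G = 9.81  # target apparent gravity in m/s^2 (one Earth g)

def spin_for_gravity(radius_m: float, height_m: float = 1.8):
    """Spin rate needed for 1 g at the rim of a rotating habitat,
    plus how much weaker the 'gravity' feels at a standing person's head.

    Spin gravity: a = omega^2 * r, so omega = sqrt(a / r).
    The head sits closer to the axis (at r - height), so it feels less force.
    """
    omega = math.sqrt(G / radius_m)        # rad/s for 1 g at the rim
    rpm = omega * 60 / (2 * math.pi)       # revolutions per minute
    a_head = omega ** 2 * (radius_m - height_m)
    head_deficit_pct = (1 - a_head / G) * 100
    return rpm, head_deficit_pct

# Hypothetical radii: a small craft, a big wheel, a huge station.
for r in (5.0, 50.0, 250.0):
    rpm, deficit = spin_for_gravity(r)
    print(f"radius {r:5.0f} m -> {rpm:5.2f} rpm, head feels {deficit:4.1f}% lighter")
```

The takeaway matches the episode's point: a small, fast-spinning five-meter wheel leaves your head feeling about thirty-six percent lighter than your feet, while a very large, slowly spinning station keeps that gradient under one percent.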
Speaker 1: The primary antagonist of 2001 is an artificially intelligent computer system called HAL, which, as everyone points out, means that each letter is just off by one from IBM. Arthur C. Clarke, who wrote this, says that's a coincidence. The system develops its own motivation and experiences something akin to fear as it pleads with a character not to deactivate it. So HAL goes rogue and kills nearly all the crew before it is deactivated. But as it's being deactivated, it's essentially pleading for its own life. While the discipline of artificial intelligence has advanced dramatically since the nineteen sixties, you know, we are still as of yet not able to do anything close to what HAL could do. And there's an ongoing debate on whether things like consciousness and emotional motivation can even emerge out of technological systems, or, if they can, what level of complexity we would first need to achieve in order for that to happen.

Speaker 1: There are other technologies in 2001 that have come true. There's a sort of tablet computer device that you can see in a couple of scenes, and obviously that's a tech that we have today, though it would be a little bit after two thousand one before we got one that was really practical and, you know, something that would work in the mainstream consumer market. Voice activation is in the movie. That's come a long way, and we've got numerous systems that work with that. Also, a lot of the stuff shown in 2001 is pretty darn accurate from a technical perspective. For example, there's no sound in space, right? Because you don't have enough particles out there to allow vibrations to carry in space. So Kubrick made sure that shots that were in outer space, like in the exteriors, were silent. That was a good touch. When we come back, we'll continue our journey through futuristic stories that were off by just a hair. But first, let's listen to these messages.
Speaker 1: All right. Before the break, we talked about 2001, which is widely considered to be a cinematic masterpiece, and it's also a pretty divisive film. Some folks don't like it so much; they think it's a little boring, like watching paint dry. I might be in that camp. I appreciate the movie for what it is, but I do not find it very entertaining. I find it hard to stay awake watching it. That's on me; that's not on the movie.

Speaker 1: Anyway, our next film is definitely not a classic. It's a movie that did not do well critically or commercially, and I am talking about Bicentennial Man. The film is about a household robot, one that is bipedal. It has a humanlike body, and over time this robot develops emotions and motivations of its own. So it's kind of like HAL in that regard, only while HAL became a homicidal maniac determined to complete a mission, the robot character of Andrew in Bicentennial Man becomes so sweet he'll end up giving you cavities. It's Robin Williams at his schmaltziest. But if we distill the movie down to the basics, we could say that it's about a robot with artificial intelligence that gains sentience, self-awareness, and consciousness. And then, how would humanity perceive that kind of a machine? How would humans react? Would they extend the idea of personhood to encompass an artificial being, or would humans dismiss that as unthinkable and refuse to acknowledge that a robot's humanlike qualities make it a person? These are questions that folks are actually asking right now. I mean, in the EU, there are committees that are dedicated to looking into the idea of granting personhood to robots and artificial intelligence, should they reach a level of sophistication that would necessitate such a thing. Bicentennial Man starts off in two thousand five, and I'm sure I do not need to point out to you, we did not get intelligent bipedal helper robots in two thousand five.
Speaker 1: We don't have them now, and there are lots of reasons for that. We'll set the AI side of things apart, because we already talked about that with HAL, so there's no need to tread over that again. But let's talk about bipedal robots. So we don't have a ton of these because, as it turns out, it's very hard to engineer a bipedal robot. Getting the robot to move so that it's not, you know, just falling all over the place is a non-trivial problem. Balance is tough; maneuverability is tough. We do have some robots that are bipedal, and there are even some famous examples, like ASIMO, but, um, you know, they're still very limited. I mean, ASIMO could actually run. It was a little hoppy run, and it made it look like ASIMO really needed to get to the Little Robots' Room pretty quickly, but it had its own limitations and restrictions. It was largely under, you know, manual control, or very, very limited autonomous control. That's part of the reason we typically see robots that depend upon wheels or treads to move around, because those components are far less complicated than legs from an engineering perspective. They work on a simpler principle, and they're easier to repair if things go wrong. If you're looking at ways to simplify your robot design, getting rid of legs is a no-brainer. However, a lot of human environments work best if you have legs. By the way, this is a big problem not just for robots. I'm talking about accessibility in general. It's why we have laws that are meant to guarantee accessibility, because otherwise people who depend upon tech like wheelchairs to get around would find themselves locked out of a lot of experiences. I mean, they already do, but it would be even worse. And while we've had some progress on making the world more accessible, the fact is that the default design choice tends to favor people who can walk around. Stairs are just a simple example of that.
Speaker 1: Well, that's a pretty tough challenge for robots too. One of DARPA's robotics challenges was for groups to design a bipedal robot that could complete several tasks autonomously, including doing such things as operating a vehicle, opening a door, walking through a doorway, picking up a power tool, using the power tool appropriately, and so on. And even things like opening a door, which would be a pretty trivial task for many people, became a big engineering challenge, and walking through the door was another big one. There are actually lots of videos of several robots just plain tipping right over at that point. This challenge happened a decade after two thousand five, after the setting of the beginning of Bicentennial Man, so from a basic robotics standpoint, we're far off from having that become a reality. I'm not going to comment on the quality of the movie, but the argument on whether or not robots should be granted personhood is a really interesting one, and it's also an idea that pops up in the film A.I. And, as I mentioned, it's kind of an ongoing discussion here in the real world today.

Speaker 1: Now we're going to jump ahead to twenty ten, mostly because very few well-known futuristic stories were set between two thousand five and twenty ten. But that means now we have to talk about 2010: The Year We Make Contact. This is actually the sequel to 2001: A Space Odyssey. The film 2010 came out in nineteen eighty-four; the book came out two years earlier. And yes, I know this sounds confusing because I'm using lots of years here, but it was a nineteen eighties version of what twenty ten would look like. Anyway, nine years have gone by since the events of the first film, and there are a lot of questions that are left back on Earth. And the big ones are: what the heck happened out there? All those astronauts died; what happened to them? And then one of them might not have died, but definitely disappeared. What happened to that guy?
Speaker 1: So 2010 obviously has some of the same issues when it comes to the predicted tech that we saw in 2001, so we're not going to go over all that again. Interestingly, it also assumes that in twenty ten the Soviet Union is still a thing. In the film, the US and the USSR are entering into essentially another kind of Cold War space race, in this case a mission to find out what the heck happened in the events of 2001. Of course, the real Soviet Union dissolved in the early nineteen nineties. It did not exist by twenty ten. The movie answers the question about why HAL went bonkers in the first film, and it turns out that the crew were unknowingly on a secret mission, and that mission had not been laid out to them, so they didn't have any awareness of it. HAL, however, was aware of the secret mission, but was supposed to keep that under wraps while appearing to facilitate the cover story mission, the one that the astronauts thought they were on. But that meant that there was a conflict with HAL's programming, because it was supposed to be a transparent and honest system, and that brought HAL into an irreconcilable quandary. The computer system was obligated to follow its mission, but that mission included parameters that would violate the system's programming, so HAL snapped.

Speaker 1: Now, setting aside the AI issues that we've already discussed, this touches on another common element in speculative fiction: that of artificial intelligence encountering some sort of problem or scenario that subsequently causes it to harm people. Uh, one of the basic ideas in Western science fiction goes to the three major laws of robotics as defined by Isaac Asimov. They state that, first, a robot cannot harm a human, or allow a human to come to harm by failing to prevent it. Second, a robot must obey any order given to it by a human, provided that it doesn't violate the first law.
Speaker 1: And third, the robot must protect itself, as long as that doesn't conflict with the first two laws. That's a strict priority ordering, by the way, and there's a small sketch of it at the end of this segment. But science fiction is full of scenarios in which AI causes harm or even leads to extinction-level events. The classic, perhaps most cliche, example of this is the idea that you create a supercomputer, and your hope is that the supercomputer, which is far more intelligent than any human being, will be able to solve massive real-world problems. So it will be able to do things that humans can't do, that humans aren't smart enough to do. And then you tell it to bring about world peace, and then the supercomputer comes to the conclusion that the only way to guarantee world peace is to wipe out all of humanity. That way, there's no one left to declare war on anyone else. That's the classic sci-fi cliche.
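As promised, here's a toy illustration in Python of just how strict that priority ordering is. Everything in it (the Action type, its flags, the permitted function) is hypothetical scaffolding invented for this example, not anything from Asimov's stories or from real robotics; the point is simply that each law only gets a say when every higher law is already satisfied.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A toy candidate action (all fields are hypothetical, for illustration)."""
    description: str
    harms_human: bool = False        # would this action injure a human?
    allows_human_harm: bool = False  # would it let a human come to harm through inaction?
    ordered_by_human: bool = False   # is a human commanding this action?
    endangers_robot: bool = False    # would it damage the robot itself?

def permitted(action: Action) -> bool:
    """Evaluate Asimov's three laws as a strict priority ordering."""
    # First Law: never harm a human, or allow harm through inaction.
    if action.harms_human or action.allows_human_harm:
        return False
    # Second Law: obey human orders (only reachable once the First Law passes).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, lowest priority.
    return not action.endangers_robot

print(permitted(Action("fetch coffee", ordered_by_human=True)))    # True
print(permitted(Action("push bystander", harms_human=True,
                       ordered_by_human=True)))                    # False: Law 1 outranks Law 2
print(permitted(Action("walk into furnace", ordered_by_human=True,
                       endangers_robot=True)))                     # True: Law 2 outranks Law 3
```

The ordering is the whole trick: an order that endangers the robot must still be obeyed, because the second law outranks the third, while an order to harm someone is refused outright. HAL's bind in 2010 is roughly what happens when a rule-follower like this is handed mission parameters that no consistent ordering can satisfy.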
Speaker 1: The other stuff that happens in 2010 goes more into the realm of fantasy than science fiction, so I'm not going to get into the rest here. Uh, it's definitely a very different movie than 2001. It's less poetic; it's a little more narrative. I don't think it's a better film than 2001. I do think it's one I can watch more easily than 2001.

Speaker 1: All right, but let's move up to twenty twelve. I thought about including I Am Legend in this lineup. That actually came out in two thousand seven but was set in twenty twelve. However, there's not really any tech to speak of in I Am Legend, and we're currently living through a global pandemic, so I don't think we really need to talk about a fictional version. So instead we're gonna talk about, and I can't believe I'm following that up with this, but, John Carpenter's film Escape from L.A. It hurts me to talk about this movie. I would have much preferred to talk about Escape from New York. I know that Carpenter thinks Escape from L.A. is a superior film to Escape from New York. I respectfully disagree.

Speaker 1: The film Escape from New York was set in the late nineties, so it's outside the window for this episode. But Escape from L.A. is a lot like Escape from New York, just not as entertaining, in my opinion. It was made in nineteen ninety-six, which actually was the year before the setting of Escape from New York. The basic premise is that, over several years, Los Angeles descends into crime and chaos. It's a lost cause. Then there's an earthquake, and that causes Los Angeles to effectively become an island separate from the California mainland. And then there is a dictator who declares himself President of the US for life, who then seizes control, and he, uh, declares that Los Angeles is effectively a prison. He has walls built around the city, and if you break any laws in the United States, well, you're gonna get sent to L.A., where you have to just kind of try and survive.

Speaker 1: Anyway, the main character is Snake Plissken, who's a former military man. He's become a nihilist. He doesn't really believe in anything. He detests the world as it has turned out to be. And he's also a criminal, and he's going to be sentenced to Los Angeles, but he's offered a pardon if he can retrieve a remote control that controls a satellite-based weapons system. And this weapon can blast targeted regions with an electromagnetic pulse that's strong enough to disable, you know, electrical systems. The dictator of the United States plans on using the weapon on his various enemies around the world, essentially kind of sending him on a path for world domination. And to make sure that Plissken plays ball, the government injects what they say is a virus that will kill him within ten hours unless he gets an antidote. This plot, by the way, is almost identical to Escape from New York. The particulars are different, but the idea is virtually the same. Some of the technologies shown in the film include that satellite-based EMP, or electromagnetic pulse, weapon. EMPs are real.
Speaker 1: In fact, they really happen in nature, and a sufficiently powerful electromagnetic pulse can cause electronics to overload and fail. Uh, if it's powerful enough, it can damage electronics to the point where they won't work anymore. You'll have to repair them or replace them. So lightning is one type of electromagnetic pulse. A coronal mass ejection, or CME, from a star like the Sun can create a magnetic field strong enough to act as an EMP. That's why you'll hear about times when there's a lot of solar activity potentially interfering with electronics here on Earth. Nuclear explosions generate EMPs. Uh, a really powerful pulse could cause power lines to snap. You would have an excess of electrical current and voltage in the power lines; they wouldn't be able to handle it, and they can snap right then and there. Computer systems are particularly vulnerable to electromagnetic pulses. So an EMP weapon is not just possible; it's something that exists. And a lot of militaries around the world have worked on refining them over the years, because if you have a weapon that can destroy communication infrastructure without actually causing physical destruction to the region itself, that's a super high-value weapon. That being said, I am not aware of any system that could produce a powerful EMP blast that you could base on a satellite platform. Um, it essentially just becomes a laser gun weapon in the movie, and I can't think of a way you would be able to do this unless you had, like, nuclear explosives in orbit. Even then, you would have to really find a way to direct that pulse to have maximum effectiveness. The movie also features a personal holographic projector that's capable of producing a three-dimensional hologram that's convincing enough to fool people who are standing in the same area. Right? Like, if you were in a room and someone were using this, you would think that was actually a person there, not a hologram.
The movie also features a personal holographic 563 00:33:50,560 --> 00:33:55,360 Speaker 1: projector that's capable of producing a three dimensional hologram that's 564 00:33:55,480 --> 00:33:57,959 Speaker 1: convincing enough to fool people who are standing in the 565 00:33:58,040 --> 00:34:01,160 Speaker 1: same area. Right? Like, if you were in a room 566 00:34:01,200 --> 00:34:03,239 Speaker 1: and someone were using this, you would think that that 567 00:34:03,320 --> 00:34:05,600 Speaker 1: was actually a person there, not a hologram. That's how 568 00:34:05,640 --> 00:34:08,680 Speaker 1: effective it is within the movie. It even casts a 569 00:34:08,760 --> 00:34:11,400 Speaker 1: shadow behind it in the scene where it plays a 570 00:34:11,440 --> 00:34:15,160 Speaker 1: big part in the film. Obviously, we have not developed 571 00:34:15,160 --> 00:34:18,960 Speaker 1: that kind of technology. There are some pretty nifty effects 572 00:34:19,000 --> 00:34:23,320 Speaker 1: that we can create to simulate holograms under very specific conditions, 573 00:34:23,600 --> 00:34:28,160 Speaker 1: but generally speaking, this is way beyond our capabilities. Oh, also, 574 00:34:28,400 --> 00:34:30,279 Speaker 1: one of the actors in the movie would later go 575 00:34:30,320 --> 00:34:34,640 Speaker 1: on to become the Countess of Devon. True story, it 576 00:34:34,680 --> 00:34:37,680 Speaker 1: really did happen. All right, we've got a few more 577 00:34:37,840 --> 00:34:40,600 Speaker 1: stories we want to talk about, but first let's take 578 00:34:40,719 --> 00:34:51,600 Speaker 1: another quick break. Okay, we're up to two thousand and fifteen, 579 00:34:51,719 --> 00:34:55,279 Speaker 1: and it's time to talk about flying cars. In fact, 580 00:34:55,360 --> 00:34:57,360 Speaker 1: those pop up a couple of times, but this is 581 00:34:57,360 --> 00:34:58,759 Speaker 1: the first one, and you knew we were going to 582 00:34:58,840 --> 00:35:02,960 Speaker 1: have to get around to it, because two thousand fifteen is when about 583 00:35:03,080 --> 00:35:05,880 Speaker 1: half of Back to the Future Part Two takes place. 584 00:35:06,640 --> 00:35:09,239 Speaker 1: If you've never seen the movie, well, first you've got 585 00:35:09,239 --> 00:35:11,200 Speaker 1: the first Back to the Future, which follows a character 586 00:35:11,239 --> 00:35:15,160 Speaker 1: named Marty McFly as he accidentally travels back from nineteen 587 00:35:15,239 --> 00:35:19,800 Speaker 1: eighty five to nineteen fifty five and then, through some misadventures, 588 00:35:19,840 --> 00:35:21,720 Speaker 1: has to figure out a way to make his parents 589 00:35:21,800 --> 00:35:24,399 Speaker 1: fall in love with each other or else he'll never 590 00:35:24,520 --> 00:35:29,600 Speaker 1: have existed. So it's your classic temporal paradox scenario. But 591 00:35:29,640 --> 00:35:32,480 Speaker 1: in the sequel, which came out in nineteen eighty nine, 592 00:35:33,040 --> 00:35:38,399 Speaker 1: Marty's friend and mentor, Doc Brown, convinces nineteen eighty five 593 00:35:38,440 --> 00:35:40,719 Speaker 1: Marty that he has to travel to the far off 594 00:35:40,800 --> 00:35:46,080 Speaker 1: future of two thousand fifteen in order to help Marty's kids. At least 595 00:35:46,080 --> 00:35:48,719 Speaker 1: that's the first section of Back to the Future Part Two, 596 00:35:49,040 --> 00:35:52,720 Speaker 1: so we get to visit two thousand fifteen, and things are a little 597 00:35:52,760 --> 00:35:56,560 Speaker 1: different from our real world version of two thousand fifteen. They are a 598 00:35:56,719 --> 00:36:00,080 Speaker 1: lot more day-glo, for one thing, a lot of 599 00:36:00,120 --> 00:36:03,760 Speaker 1: like fluorescent colors. And in the Back to the Future 600 00:36:03,800 --> 00:36:08,960 Speaker 1: Part Two version, homes also come standard with fax machines
in 601 00:36:08,960 --> 00:36:11,560 Speaker 1: that version of two thousand fifteen. Now, maybe in the late eighties, that 602 00:36:11,640 --> 00:36:14,120 Speaker 1: seemed like it was a realistic outcome, but obviously it's 603 00:36:14,160 --> 00:36:17,080 Speaker 1: not what you would find in most homes today. 604 00:36:17,080 --> 00:36:19,239 Speaker 1: I mean, there's no need for them, because we have 605 00:36:19,280 --> 00:36:22,680 Speaker 1: plenty of electronic systems that don't require paper or toner. 606 00:36:23,280 --> 00:36:26,600 Speaker 1: But email was not something that people were really thinking 607 00:36:26,640 --> 00:36:33,160 Speaker 1: about in Hollywood back in nineteen eighty nine. Restaurants in the movie have 608 00:36:33,360 --> 00:36:35,800 Speaker 1: a robo wait staff, and there are a 609 00:36:35,880 --> 00:36:38,759 Speaker 1: few novelty places around the world, a lot of them 610 00:36:38,760 --> 00:36:42,040 Speaker 1: in Japan, that use computers and robots in order to 611 00:36:42,120 --> 00:36:45,719 Speaker 1: serve food, but they are really a novelty and an exception, 612 00:36:45,800 --> 00:36:49,200 Speaker 1: not the rule. I will say, however, that wait staff 613 00:36:49,280 --> 00:36:52,120 Speaker 1: is one of those roles that robots and automated systems 614 00:36:52,160 --> 00:36:56,439 Speaker 1: could potentially thrive in. And I say that because we're 615 00:36:56,440 --> 00:36:59,319 Speaker 1: talking about an environment that has a limited set of 616 00:36:59,400 --> 00:37:03,160 Speaker 1: variables. Like, if you can only order from a menu, 617 00:37:03,400 --> 00:37:05,320 Speaker 1: that means you can't just walk into the restaurant and 618 00:37:05,440 --> 00:37:09,160 Speaker 1: order anything. Right? You couldn't walk into, like, a Mexican 619 00:37:09,239 --> 00:37:11,399 Speaker 1: restaurant that didn't have pizza on the menu and say 620 00:37:11,480 --> 00:37:14,359 Speaker 1: I want a pizza. You typically have to order off 621 00:37:14,400 --> 00:37:16,800 Speaker 1: the menu. Now, you might be able to order something 622 00:37:16,880 --> 00:37:20,440 Speaker 1: off menu if the staff like you and the chefs 623 00:37:20,440 --> 00:37:23,160 Speaker 1: in the back don't mind, but only if the restaurant 624 00:37:23,160 --> 00:37:27,680 Speaker 1: actually has the necessary ingredients on hand. So robots work 625 00:37:27,760 --> 00:37:32,280 Speaker 1: well in environments that have a limited number of variables. 626 00:37:32,320 --> 00:37:35,680 Speaker 1: If there are restrictions on variability, robots do better. 627 00:37:35,680 --> 00:37:38,800 Speaker 1: It's when you start adding more variables that it becomes 628 00:37:38,840 --> 00:37:42,399 Speaker 1: more complicated for a robot to operate. Whether folks would 629 00:37:42,400 --> 00:37:46,320 Speaker 1: ever see robots as being useful or, you know, pleasant 630 00:37:46,360 --> 00:37:49,560 Speaker 1: to interact with on that level, or whether they would 631 00:37:49,600 --> 00:37:53,400 Speaker 1: even make economic sense compared to, say, hiring human wait staff, 632 00:37:53,840 --> 00:37:57,319 Speaker 1: those are other matters. Like, technologically it's probably not the 633 00:37:57,360 --> 00:37:59,680 Speaker 1: most difficult thing in the world to do. The question 634 00:37:59,760 --> 00:38:04,080 Speaker 1: is whether it makes sense financially and socially.
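Just to make that limited-variables idea concrete, here's a toy Python sketch of the kind of closed world a robot waiter gets to live in. The menu, the prices, everything here is made up purely for the example:

    # Toy illustration of a constrained action space: a robot waiter only
    # ever has to handle requests drawn from a fixed menu.
    # The items and prices are invented for this example.

    MENU = {
        "tacos": 8.50,
        "enchiladas": 11.00,
        "agua fresca": 3.25,
    }

    def take_order(item: str) -> str:
        # A finite, known set of inputs means simple, reliable logic.
        if item not in MENU:
            return f"Sorry, we don't serve {item}. Please order from the menu."
        return f"One order of {item}. That's ${MENU[item]:.2f}. Coming right up."

    print(take_order("tacos"))   # on the menu: easy
    print(take_order("pizza"))   # off the menu: one polite refusal branch

Every request the robot can't anticipate gets funneled into that single refusal branch, which is exactly why a dining room with a fixed menu is so much friendlier to automation than an open-ended environment.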
Like Escape from 635 00:38:04,200 --> 00:38:06,640 Speaker 1: L A, we see holograms in Back to the Future Part Two. 636 00:38:07,000 --> 00:38:10,400 Speaker 1: There's a film marquee for Jaws nineteen that has a 637 00:38:10,520 --> 00:38:15,120 Speaker 1: holographic shark emerge from the screen to seemingly attack Marty, 638 00:38:15,200 --> 00:38:18,880 Speaker 1: which startles him. It's a pretty cool effect, and it 639 00:38:18,960 --> 00:38:22,040 Speaker 1: is possible to create a three D effect with a 640 00:38:22,120 --> 00:38:26,960 Speaker 1: screen without the need for three D glasses. Lenticular displays 641 00:38:27,000 --> 00:38:30,360 Speaker 1: can do this. However, this is a pretty limited effect, 642 00:38:30,440 --> 00:38:32,840 Speaker 1: and typically you need to be positioned in a sweet 643 00:38:32,840 --> 00:38:35,480 Speaker 1: spot in order to experience it. If you move a 644 00:38:35,480 --> 00:38:37,480 Speaker 1: little bit to the left or to the right, the 645 00:38:37,560 --> 00:38:41,640 Speaker 1: effect changes. You don't get the proper images that are 646 00:38:41,640 --> 00:38:44,480 Speaker 1: directed towards your eyes, and it will look all messy. 647 00:38:44,600 --> 00:38:47,080 Speaker 1: From personal experience, I can tell you that looking at 648 00:38:47,200 --> 00:38:49,960 Speaker 1: three D displays that are 649 00:38:49,960 --> 00:38:53,600 Speaker 1: glasses free is not fun. It can actually bring on 650 00:38:53,680 --> 00:38:56,080 Speaker 1: some eye strain. It is possible to do. 651 00:38:56,239 --> 00:38:58,360 Speaker 1: It just wouldn't happen the way it does in the movie.
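If you're curious what a lenticular display is actually doing under the hood, here's a tiny Python sketch of the core trick: you interleave two views of a scene column by column, and the lens sheet on top steers the even columns toward one eye and the odd columns toward the other. This is a simplified two-view version; real glasses-free displays typically interlace more views than this:

    import numpy as np

    def interlace_two_views(left_view: np.ndarray, right_view: np.ndarray) -> np.ndarray:
        # Interleave two equal-sized images column by column. A lenticular
        # lens sheet (or parallax barrier) over the result aims even columns
        # at one eye and odd columns at the other, creating the 3D effect,
        # but only from a narrow sweet spot.
        interlaced = np.empty_like(left_view)
        interlaced[:, 0::2] = left_view[:, 0::2]    # even columns: left eye
        interlaced[:, 1::2] = right_view[:, 1::2]   # odd columns: right eye
        return interlaced

    # Two made-up 4x4 grayscale "views" just to show the striping pattern.
    left = np.full((4, 4), 10)
    right = np.full((4, 4), 200)
    print(interlace_two_views(left, right))

Move your head outside the sweet spot and each eye starts catching columns meant for the other one, which is exactly the messy, eye-straining breakdown I was describing.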
652 00:38:58,760 --> 00:39:02,560 Speaker 1: There's also the hydrator oven in the movie that turns 653 00:39:02,600 --> 00:39:05,640 Speaker 1: like a hockey puck sized pizza into a full sized 654 00:39:05,719 --> 00:39:09,080 Speaker 1: cooked pizza in just a matter of moments. The movie 655 00:39:09,080 --> 00:39:11,239 Speaker 1: doesn't bother to explain how this works. But I mean, 656 00:39:11,280 --> 00:39:14,680 Speaker 1: the name hydrator suggests that you're adding water to something 657 00:39:14,719 --> 00:39:17,240 Speaker 1: in order to make it expand to the appropriate size, 658 00:39:17,880 --> 00:39:22,120 Speaker 1: and simultaneously it's somehow heating up the pizza at the 659 00:39:22,160 --> 00:39:25,800 Speaker 1: same time. Not sure how that works. Heat transfer isn't 660 00:39:25,880 --> 00:39:29,560 Speaker 1: magically instantaneous, so I don't know. But they don't bother 661 00:39:29,640 --> 00:39:32,000 Speaker 1: to explain it because it's just a fun 662 00:39:32,000 --> 00:39:35,719 Speaker 1: little accent. It's not meant to be analyzed 663 00:39:35,719 --> 00:39:38,600 Speaker 1: the way I'm doing it. I'm the jerk here. Then 664 00:39:38,680 --> 00:39:42,600 Speaker 1: there are all the flying machines like hoverboards and flying cars. 665 00:39:43,400 --> 00:39:46,040 Speaker 1: There's again not a lot of explanation about how these 666 00:39:46,040 --> 00:39:49,960 Speaker 1: things work. Presumably the hoverboards are generating some sort of 667 00:39:49,960 --> 00:39:55,400 Speaker 1: electromagnetic field that counteracts gravity somehow, although I can't even 668 00:39:55,440 --> 00:39:58,520 Speaker 1: begin to imagine how you can make that be a thing. Now, 669 00:39:58,600 --> 00:40:02,959 Speaker 1: you could use something like superconductivity to magnetically lock 670 00:40:03,120 --> 00:40:07,920 Speaker 1: something into a specific position over a magnetic track, but 671 00:40:08,120 --> 00:40:12,600 Speaker 1: that would require cooling the thing down to temperatures 672 00:40:13,280 --> 00:40:16,680 Speaker 1: around absolute zero to get it superconductive. It's 673 00:40:16,680 --> 00:40:19,759 Speaker 1: not really practical. It's really hard to do, and you 674 00:40:19,760 --> 00:40:22,960 Speaker 1: would be limited to the magnetic surface itself. So, in 675 00:40:22,960 --> 00:40:25,880 Speaker 1: other words, you could have a track and you 676 00:40:25,880 --> 00:40:33,720 Speaker 1: could have a superconductive magnetic hoverboard over that track, 677 00:40:34,120 --> 00:40:36,200 Speaker 1: and it would hover, and you could push it and 678 00:40:36,200 --> 00:40:39,719 Speaker 1: it would just effortlessly slide all the way to the 679 00:40:39,840 --> 00:40:41,960 Speaker 1: end of the track. But it couldn't go off track, 680 00:40:42,360 --> 00:40:44,800 Speaker 1: because it has to have that magnetic base to work, 681 00:40:44,840 --> 00:40:47,080 Speaker 1: and it has to be locked into a magnetic field. 682 00:40:47,360 --> 00:40:49,720 Speaker 1: If you go outside of that, you lose the effect.
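Just for fun, you can put rough numbers on what it would take to float a rider this way. In a very idealized model, a magnetic field pressing on a superconductor exerts a pressure of B squared over two mu naught, so here's a quick Python sketch with made-up rider and board figures. Real flux-pinned levitation is a lot messier than this, so treat it as a ballpark only:

    import math

    MU_0 = 4 * math.pi * 1e-7   # permeability of free space, T*m/A

    # Made-up example numbers: an 80 kg rider-plus-board on a
    # 0.5 m x 0.5 m footprint.
    mass = 80.0    # kilograms (assumption)
    area = 0.25    # square meters (assumption)
    g = 9.81       # m/s^2

    # Idealized balance: magnetic pressure B^2 / (2 * mu_0) must equal
    # the weight spread over the board's footprint.
    pressure_needed = mass * g / area
    b_field = math.sqrt(2 * MU_0 * pressure_needed)

    print(f"Pressure to support the rider: {pressure_needed:.0f} Pa")
    print(f"Field strength needed: {b_field:.3f} T")   # roughly 0.09 T

A tenth of a tesla is not an exotic field strength, which tells you raw magnetic muscle was never the hard part. The hard parts are keeping the board cold and stable, and the fact that the field has to come from a track underneath you.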
683 00:40:50,120 --> 00:40:53,480 Speaker 1: And obviously we don't have those magical flying cars today, 684 00:40:54,000 --> 00:40:57,239 Speaker 1: let alone back in two thousand fifteen. We do have some cars that 685 00:40:57,280 --> 00:41:01,440 Speaker 1: can fly, though, the word cars being a little bit generous. 686 00:41:01,440 --> 00:41:05,680 Speaker 1: The most common variations I see are quad copter like designs. 687 00:41:05,719 --> 00:41:08,360 Speaker 1: So think of, like, a quad copter drone, that is, 688 00:41:08,400 --> 00:41:11,360 Speaker 1: a drone that has four propellers arranged, you know, 689 00:41:11,400 --> 00:41:14,800 Speaker 1: kind of at the four corners around the drone, only 690 00:41:15,040 --> 00:41:17,319 Speaker 1: you know, supersize it, make it big enough so that 691 00:41:17,360 --> 00:41:19,600 Speaker 1: you could have a compartment where a person could sit 692 00:41:19,719 --> 00:41:23,240 Speaker 1: inside of it in the middle. There are tons 693 00:41:23,239 --> 00:41:27,319 Speaker 1: of companies working on making flying cars a reality, mostly with 694 00:41:27,400 --> 00:41:31,040 Speaker 1: the goal of creating a ride hailing service similar to 695 00:41:31,080 --> 00:41:34,719 Speaker 1: Uber or Lyft, in which customers pay to fly 696 00:41:34,880 --> 00:41:38,120 Speaker 1: across town without having to deal with street traffic. But 697 00:41:38,440 --> 00:41:41,160 Speaker 1: that technology is going to hinge not just on making 698 00:41:41,200 --> 00:41:45,080 Speaker 1: the stuff safe and reliable, but also creating the regulations 699 00:41:45,160 --> 00:41:49,000 Speaker 1: that will guide how the tech can interoperate with, you know, 700 00:41:49,040 --> 00:41:52,160 Speaker 1: the rest of the environment, like a city. You're gonna 701 00:41:52,200 --> 00:41:56,000 Speaker 1: have to have rules for that, otherwise the potential for 702 00:41:56,120 --> 00:41:59,160 Speaker 1: disaster is just way too high. And to me, 703 00:41:59,280 --> 00:42:02,120 Speaker 1: the biggest piece of technology from Back to the Future Part Two, 704 00:42:02,239 --> 00:42:04,359 Speaker 1: and really the end of the first Back to the 705 00:42:04,360 --> 00:42:09,400 Speaker 1: Future, is Mr. Fusion. This device presumably uses fusion to 706 00:42:09,520 --> 00:42:13,040 Speaker 1: generate electricity, and it's enough to provide the one point 707 00:42:13,080 --> 00:42:17,280 Speaker 1: twenty one gigawatts of power for the DeLorean's time circuits. 708 00:42:17,960 --> 00:42:23,759 Speaker 1: Fusion involves fusing atoms together. It's the process that the 709 00:42:23,800 --> 00:42:28,080 Speaker 1: Sun goes through, where it fuses hydrogen atoms 710 00:42:28,080 --> 00:42:31,960 Speaker 1: into helium at a temperature of millions of degrees. The 711 00:42:32,040 --> 00:42:34,799 Speaker 1: process requires a lot of energy to get started. Like, 712 00:42:34,840 --> 00:42:38,480 Speaker 1: you know, with the Sun, you've got this intense gravitational pull. 713 00:42:38,560 --> 00:42:43,759 Speaker 1: You've got a really dense system there and incredible temperatures. 714 00:42:43,800 --> 00:42:46,080 Speaker 1: So it's got this amazing amount of energy that can 715 00:42:46,120 --> 00:42:51,239 Speaker 1: sustain this process. But here on Earth, I mean, it 716 00:42:51,360 --> 00:42:54,359 Speaker 1: requires a lot of energy and pressure to get 717 00:42:54,400 --> 00:42:59,520 Speaker 1: this thing started. And yeah, the output is potentially even 718 00:42:59,560 --> 00:43:05,080 Speaker 1: more energy, but sustaining that reaction is very challenging to do. 719 00:43:05,239 --> 00:43:10,400 Speaker 1: Scientists around the world are working on developing practical fusion reactors. 720 00:43:11,080 --> 00:43:13,440 Speaker 1: So far, the amount of energy needed to start and 721 00:43:13,520 --> 00:43:17,000 Speaker 1: sustain a fusion reaction is greater than what we get 722 00:43:17,000 --> 00:43:18,880 Speaker 1: out of it. We might get a net positive 723 00:43:18,960 --> 00:43:24,319 Speaker 1: outcome on an individual reaction, but sustaining it so that 724 00:43:24,680 --> 00:43:27,560 Speaker 1: we can do something useful with it, that's a different matter. 725 00:43:28,239 --> 00:43:32,239 Speaker 1: If we can get through that, that will be a 726 00:43:32,320 --> 00:43:37,040 Speaker 1: transformational change for the world. I don't think, however, we're 727 00:43:37,040 --> 00:43:39,760 Speaker 1: ever going to see fusion reactors that can be small 728 00:43:39,880 --> 00:43:42,960 Speaker 1: enough to be incorporated into the electrical system of a vehicle, 729 00:43:43,320 --> 00:43:46,840 Speaker 1: nor can I imagine a need to do that. And 730 00:43:46,880 --> 00:43:48,919 Speaker 1: to be fair, the tech in Back to the Future Part Two 731 00:43:48,960 --> 00:43:51,880 Speaker 1: was always intended to be whimsical, and some of the 732 00:43:51,920 --> 00:43:55,799 Speaker 1: stuff we've seen in the movie has kind of come 733 00:43:55,840 --> 00:43:58,560 Speaker 1: to pass, just not necessarily in the way that it 734 00:43:58,640 --> 00:44:03,200 Speaker 1: showed up on screen.
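Still, it's fun to run the numbers on Mr. Fusion. A single deuterium-tritium fusion releases about seventeen point six mega electron volts, so here's a quick back-of-the-envelope Python sketch of what one point twenty one gigawatts would demand. I'm assuming D-T fuel and, very generously, perfect conversion of fusion energy into electricity:

    # Ballpark fuel throughput for a 1.21 GW fusion source, assuming
    # deuterium-tritium reactions (about 17.6 MeV each) and perfect
    # conversion of fusion energy to electric power (wildly generous).

    EV_TO_JOULES = 1.602e-19
    ENERGY_PER_REACTION = 17.6e6 * EV_TO_JOULES        # joules per D-T fusion
    ATOMIC_MASS_UNIT = 1.661e-27                       # kilograms
    FUEL_PER_REACTION = (2.014 + 3.016) * ATOMIC_MASS_UNIT  # one D plus one T

    power = 1.21e9   # watts, the famous time-circuit requirement

    reactions_per_second = power / ENERGY_PER_REACTION
    fuel_per_second = reactions_per_second * FUEL_PER_REACTION

    print(f"Reactions per second: {reactions_per_second:.2e}")        # ~4e20
    print(f"Fuel burned per second: {fuel_per_second * 1e6:.1f} mg")  # a few milligrams

A few milligrams of fuel per second is exactly why fusion is so tantalizing. The catch, as I said, is that nobody has demonstrated a reactor that sustains the reaction with a useful net gain, let alone one that fits behind a license plate and runs on banana peels.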
Alright, let's close out this list 735 00:44:03,719 --> 00:44:06,920 Speaker 1: with the movie Blade Runner. Blade Runner came out in 736 00:44:07,080 --> 00:44:10,800 Speaker 1: nineteen eighty two, but it's set in the year twenty nineteen. 737 00:44:11,520 --> 00:44:15,399 Speaker 1: It's a science fiction film noir kind of movie. Now, 738 00:44:15,400 --> 00:44:18,600 Speaker 1: if you've never seen it, you should totally watch it. 739 00:44:18,600 --> 00:44:22,839 Speaker 1: It is an amazing movie. I will warn you, there 740 00:44:22,840 --> 00:44:26,640 Speaker 1: are some very slowly paced moments in that film. Like, 741 00:44:26,920 --> 00:44:29,400 Speaker 1: there might be some bits where you find yourself saying, 742 00:44:29,520 --> 00:44:32,720 Speaker 1: get on with it, as you watch people very slowly 743 00:44:32,760 --> 00:44:37,399 Speaker 1: walk around a building for what feels like an eternity. 744 00:44:37,400 --> 00:44:41,800 Speaker 1: But the premise is really neat. So in this version 745 00:44:41,920 --> 00:44:45,800 Speaker 1: of two thousand nineteen, you are in a densely populated 746 00:44:45,920 --> 00:44:50,440 Speaker 1: and dystopian Los Angeles, and there's this big social problem. 747 00:44:50,719 --> 00:44:57,080 Speaker 1: So humanity has developed a way to bio engineer synthetic humans. 748 00:44:57,120 --> 00:45:01,200 Speaker 1: So they're kind of like androids, but they're not robots. 749 00:45:01,239 --> 00:45:04,400 Speaker 1: They're made out of gooey, fleshy stuff. I mean, I 750 00:45:04,440 --> 00:45:06,520 Speaker 1: guess you could call them robots in the sense that 751 00:45:06,600 --> 00:45:10,400 Speaker 1: they are more like the robots of the original Rossum's 752 00:45:10,480 --> 00:45:14,680 Speaker 1: Universal Robots, which was the Czechoslovakian play from the early 753 00:45:14,760 --> 00:45:17,400 Speaker 1: twentieth century. They have a lot more in common with 754 00:45:17,440 --> 00:45:22,319 Speaker 1: those than with the danger, Will Robinson style robots that 755 00:45:22,360 --> 00:45:25,360 Speaker 1: we think of today. But yeah, we call them replicants 756 00:45:25,360 --> 00:45:27,960 Speaker 1: in the movie. And the reason that humans even 757 00:45:28,000 --> 00:45:31,600 Speaker 1: made replicants is the same reason that we typically make robots. 758 00:45:32,080 --> 00:45:34,520 Speaker 1: It's so that we have something to take care of 759 00:45:34,520 --> 00:45:37,680 Speaker 1: the work that falls under the three Ds, that 760 00:45:37,840 --> 00:45:45,279 Speaker 1: is, dull, dirty, and dangerous. So these synthetics, these replicants, 761 00:45:45,280 --> 00:45:48,359 Speaker 1: are meant to take on jobs that traditionally humans would 762 00:45:48,400 --> 00:45:51,200 Speaker 1: have to do. But it's not very fulfilling work, and 763 00:45:51,239 --> 00:45:54,960 Speaker 1: it can be very dangerous, both physically and mentally, 764 00:45:55,040 --> 00:45:57,160 Speaker 1: and have a negative impact on the people who do 765 00:45:57,239 --> 00:46:00,400 Speaker 1: the work, so the idea is you offload that work to 766 00:46:01,160 --> 00:46:03,799 Speaker 1: a machine. So in this case, the machine is a 767 00:46:03,800 --> 00:46:06,680 Speaker 1: synthetic human, and they are not considered to be quote 768 00:46:06,760 --> 00:46:11,640 Speaker 1: unquote real.
The replicants are meant to be used off world, 769 00:46:12,000 --> 00:46:14,959 Speaker 1: not on Earth, in other words, but four of them 770 00:46:15,239 --> 00:46:18,480 Speaker 1: have escaped and made it to Earth in order to 771 00:46:18,560 --> 00:46:22,160 Speaker 1: experience Earth, and a former police officer who specializes in 772 00:46:22,280 --> 00:46:26,479 Speaker 1: identifying and tracking down replicants, which is a specific job 773 00:46:26,560 --> 00:46:31,520 Speaker 1: called a blade runner, is essentially extorted into eliminating these 774 00:46:31,560 --> 00:46:34,160 Speaker 1: four replicants. Now, I'm not gonna ruin the story. It 775 00:46:34,320 --> 00:46:37,200 Speaker 1: is worth seeing, and it has some of the most 776 00:46:37,480 --> 00:46:41,960 Speaker 1: beautiful imagery in early nineteen eighties science fiction. It also has one 777 00:46:42,000 --> 00:46:44,759 Speaker 1: of the most famous speeches in science fiction films of 778 00:46:44,760 --> 00:46:48,640 Speaker 1: all time. But let's talk about the tech. Clearly, we 779 00:46:48,840 --> 00:46:52,279 Speaker 1: can't create synthetic human beings right now. There's been some 780 00:46:52,360 --> 00:46:56,480 Speaker 1: amazing research and development in synthetic organs, or replicated and 781 00:46:56,600 --> 00:47:01,040 Speaker 1: three D printed organs. That stuff is absolutely amazing. 782 00:47:01,200 --> 00:47:05,960 Speaker 1: Scientists are creating bio friendly scaffolds, and on the scaffolds 783 00:47:05,960 --> 00:47:09,959 Speaker 1: they can then print tissue structure. So that research could 784 00:47:09,960 --> 00:47:12,640 Speaker 1: potentially lead to a future in which we use stuff 785 00:47:12,680 --> 00:47:17,200 Speaker 1: like a patient's stem cells to create three D printed 786 00:47:17,520 --> 00:47:21,359 Speaker 1: synthetic organs and use those for stuff like transplants. So 787 00:47:21,400 --> 00:47:25,799 Speaker 1: if that panned out, it would revolutionize transplant surgery. You 788 00:47:25,840 --> 00:47:28,960 Speaker 1: could potentially cut way down on the risk of 789 00:47:29,000 --> 00:47:33,359 Speaker 1: the recipient's body rejecting the new organ, because if the 790 00:47:33,520 --> 00:47:38,600 Speaker 1: organ is created using tissue from the donor, in this case 791 00:47:38,640 --> 00:47:42,640 Speaker 1: the actual patient, then the body is, at least the 792 00:47:42,640 --> 00:47:46,600 Speaker 1: thought goes, more likely to accept the new organ. But yeah, 793 00:47:46,640 --> 00:47:49,160 Speaker 1: we're not able to make a fully synthetic human being. 794 00:47:50,080 --> 00:47:52,520 Speaker 1: The replicants on the run in the film are classified 795 00:47:52,560 --> 00:47:56,080 Speaker 1: as Nexus six replicants. Google got a little cheeky 796 00:47:56,200 --> 00:48:00,239 Speaker 1: through its Motorola Mobility division, which it no longer has, by the way. 797 00:48:00,280 --> 00:48:03,040 Speaker 1: At the time, it developed an Android phone that 798 00:48:03,080 --> 00:48:05,520 Speaker 1: was codenamed Shamu, but when they released it, they called 799 00:48:05,560 --> 00:48:09,560 Speaker 1: it the Nexus six.
Cute reference. Blade Runner also 800 00:48:09,640 --> 00:48:12,360 Speaker 1: features flying cars, just like Back to the Future Part Two, 801 00:48:12,680 --> 00:48:15,239 Speaker 1: and like Back to the Future Part Two, how they fly 802 00:48:15,560 --> 00:48:18,400 Speaker 1: isn't important, so I can't really comment on the proposed 803 00:48:18,440 --> 00:48:20,799 Speaker 1: methods except to say, obviously, we don't have them in 804 00:48:20,800 --> 00:48:23,799 Speaker 1: real life. The film features stuff that would be out 805 00:48:23,800 --> 00:48:26,320 Speaker 1: of place in the real two thousand nineteen. For instance, 806 00:48:26,360 --> 00:48:30,359 Speaker 1: the main character looks through tons of polaroid photographs, so 807 00:48:30,440 --> 00:48:33,120 Speaker 1: today he would more likely be looking through a folder 808 00:48:33,160 --> 00:48:37,200 Speaker 1: of digital images. He also uses a machine called an Esper, 809 00:48:37,480 --> 00:48:40,880 Speaker 1: which can analyze a two dimensional photograph and then produce 810 00:48:41,000 --> 00:48:44,279 Speaker 1: views of stuff that is in that photograph, as seen 811 00:48:44,320 --> 00:48:47,760 Speaker 1: from other angles. Like, imagine you've taken a still photo 812 00:48:48,000 --> 00:48:50,319 Speaker 1: of a table. So you're standing on one side 813 00:48:50,320 --> 00:48:52,520 Speaker 1: of the table. You take a picture, you use this thing, 814 00:48:52,840 --> 00:48:55,799 Speaker 1: and you could theoretically look at the table from a 815 00:48:55,840 --> 00:48:59,719 Speaker 1: one eighty degree change in view, like you were standing on 816 00:48:59,760 --> 00:49:02,840 Speaker 1: the opposite side. Now, computers can do some pretty cool stuff, 817 00:49:03,280 --> 00:49:05,319 Speaker 1: but doing that in real time isn't something we can 818 00:49:05,360 --> 00:49:08,840 Speaker 1: easily manage. And also, you know, we would really just 819 00:49:08,880 --> 00:49:11,480 Speaker 1: be looking at a best guess scenario, like the computer 820 00:49:11,520 --> 00:49:13,960 Speaker 1: would be guessing what the other side looked like. It 821 00:49:13,960 --> 00:49:18,319 Speaker 1: wouldn't really be useful.
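The closest everyday cousin to the Esper that I can think of is perspective warping. If the thing in the photo happens to be a flat surface, you genuinely can synthesize a new viewpoint of it from a single picture with a homography. Here's a minimal Python sketch using OpenCV; the file name and the corner coordinates are made-up placeholders you'd swap for real ones:

    import cv2
    import numpy as np

    # Corners of some flat surface (say, a tabletop) as they appear in the
    # photo. These pixel coordinates are invented for the example; in
    # practice you would click them by hand or detect them automatically.
    src = np.float32([[120, 80], [520, 95], [540, 400], [100, 380]])

    # Where those corners should land in the synthesized, straightened view.
    dst = np.float32([[100, 100], [500, 100], [500, 400], [100, 400]])

    homography = cv2.getPerspectiveTransform(src, dst)

    img = cv2.imread("photo.jpg")   # placeholder file name
    new_view = cv2.warpPerspective(img, homography, (640, 480))
    cv2.imwrite("new_view.jpg", new_view)

The catch is the same one I just mentioned: this only re-projects pixels the camera actually captured, and it only works for flat things. Anything on the far side of the table never made it into the photo, so the best a computer can do there is guess.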
All right. That is just a 822 00:49:18,400 --> 00:49:22,200 Speaker 1: quick rundown of some science fiction movies and the predictions 823 00:49:22,239 --> 00:49:26,080 Speaker 1: that have not quite turned out the way people envisioned. 824 00:49:26,400 --> 00:49:30,800 Speaker 1: I hope you enjoyed this rambling discussion of science fiction. 825 00:49:31,080 --> 00:49:34,480 Speaker 1: I like doing episodes about sci fi occasionally. It's always 826 00:49:34,480 --> 00:49:37,719 Speaker 1: fun. If you have any movies specifically you would like 827 00:49:37,800 --> 00:49:40,279 Speaker 1: me to really dive into and talk about from a 828 00:49:40,320 --> 00:49:44,240 Speaker 1: technical level, like the tech that either went into making 829 00:49:44,239 --> 00:49:46,920 Speaker 1: the movie or the tech that's displayed within the movie itself, 830 00:49:47,040 --> 00:49:50,759 Speaker 1: let me know, or any other topics. I'm eager to 831 00:49:50,840 --> 00:49:53,120 Speaker 1: hear your thoughts. The best way to get in touch 832 00:49:53,239 --> 00:49:55,399 Speaker 1: is with Twitter. The handle for the show is Tech 833 00:49:55,480 --> 00:50:00,319 Speaker 1: Stuff H S W, and I'll talk to you again really soon. 834 00:50:04,360 --> 00:50:07,399 Speaker 1: Tech Stuff is an I Heart Radio production. For more 835 00:50:07,480 --> 00:50:10,840 Speaker 1: podcasts from I Heart Radio, visit the i Heart Radio app, 836 00:50:11,000 --> 00:50:14,160 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.