1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to TechStuff, a production from I Heart Radio. 2 00:00:12,080 --> 00:00:14,840 Speaker 1: Hey there, and welcome to TechStuff. I'm your host 3 00:00:14,960 --> 00:00:17,680 Speaker 1: Jonathan Strickland. I'm an executive producer with I Heart 4 00:00:17,760 --> 00:00:20,159 Speaker 1: Radio and I love all things tech, and it is 5 00:00:20,200 --> 00:00:24,599 Speaker 1: time for a TechStuff classic episode. This episode originally 6 00:00:24,600 --> 00:00:29,280 Speaker 1: published back in two thousand fourteen, September to 7 00:00:29,360 --> 00:00:33,480 Speaker 1: be precise. It is titled Five Technologies to End All 8 00:00:33,560 --> 00:00:37,840 Speaker 1: Wars That Didn't, and I think the title pretty much 9 00:00:37,960 --> 00:00:41,960 Speaker 1: sets it all up. Let's dive in. People who 10 00:00:42,000 --> 00:00:47,600 Speaker 1: are real innovators, really forward thinkers, for lack of 11 00:00:47,600 --> 00:00:51,400 Speaker 1: a better word, often get kind of idealistic and optimistic, 12 00:00:51,680 --> 00:00:56,560 Speaker 1: sometimes perhaps unrealistically so. Right, and so we have some 13 00:00:56,600 --> 00:01:01,440 Speaker 1: examples here of people who very confidently at the time 14 00:01:01,560 --> 00:01:05,240 Speaker 1: proclaimed that the technology either they invented or that they 15 00:01:05,280 --> 00:01:08,759 Speaker 1: were an advocate for would be the end of war 16 00:01:09,080 --> 00:01:11,200 Speaker 1: for one reason or another. And they fall into different 17 00:01:11,280 --> 00:01:15,120 Speaker 1: kinds of categories. So we're gonna talk about all of them. Yeah, 18 00:01:15,160 --> 00:01:17,479 Speaker 1: so there are several different ways I guess you could 19 00:01:17,760 --> 00:01:21,959 Speaker 1: imagine that war between humans could end. And on one hand, 20 00:01:22,000 --> 00:01:24,080 Speaker 1: it's kind of hard to imagine that because it's just 21 00:01:24,160 --> 00:01:27,280 Speaker 1: such a fundamental part of human nature and human history, 22 00:01:27,319 --> 00:01:29,480 Speaker 1: and obviously it would be a fantastic thing for us 23 00:01:29,520 --> 00:01:31,959 Speaker 1: to not violently kill each other in great numbers at 24 00:01:32,080 --> 00:01:35,760 Speaker 1: intervals of time. But there are a few ways you 25 00:01:35,800 --> 00:01:38,839 Speaker 1: could look at how this might happen. So one would 26 00:01:38,880 --> 00:01:43,240 Speaker 1: be to sort of make war too risky, so that 27 00:01:43,319 --> 00:01:47,000 Speaker 1: it's just not in your self-interest to pursue it. Gotcha. 28 00:01:47,120 --> 00:01:49,680 Speaker 1: So the idea being that even if you feel 29 00:01:49,680 --> 00:01:53,880 Speaker 1: you have an advanced military, that to wage war of 30 00:01:53,920 --> 00:01:57,760 Speaker 1: any type would incur such losses as to nullify 31 00:01:57,920 --> 00:02:02,280 Speaker 1: any positive effect that that war might have, so it 32 00:02:02,400 --> 00:02:05,280 Speaker 1: just cannot be of a net advantage to you. The 33 00:02:05,360 --> 00:02:08,360 Speaker 1: only way to win is not to play. I guess 34 00:02:08,400 --> 00:02:11,119 Speaker 1: another way, though this is kind of harder to imagine 35 00:02:11,120 --> 00:02:13,520 Speaker 1: how it would be done, would be to say that 36 00:02:13,560 --> 00:02:19,000 Speaker 1: you would make war completely impracticable or physically impossible.
So 37 00:02:19,240 --> 00:02:22,360 Speaker 1: it's just you somehow create a technology that makes it 38 00:02:22,400 --> 00:02:25,400 Speaker 1: so that people cannot actually do it. You want to 39 00:02:25,400 --> 00:02:28,520 Speaker 1: shoot somebody, but your gun doesn't work. It's just filled 40 00:02:28,560 --> 00:02:31,919 Speaker 1: with crayons. Yeah, that would be a more difficult kind 41 00:02:31,919 --> 00:02:34,560 Speaker 1: of technology to imagine, but some people have sort of 42 00:02:34,600 --> 00:02:37,160 Speaker 1: gone down that road, and it will blur together with 43 00:02:37,200 --> 00:02:41,280 Speaker 1: the category I just mentioned. Another way would probably 44 00:02:41,280 --> 00:02:46,320 Speaker 1: be to sort of make war irrelevant, like to eliminate 45 00:02:46,520 --> 00:02:49,480 Speaker 1: the motivations that would drive people to war. So you 46 00:02:49,520 --> 00:02:51,680 Speaker 1: imagine that you might come up with a list of 47 00:02:51,720 --> 00:02:55,960 Speaker 1: different reasons people would declare war on one another, and 48 00:02:56,000 --> 00:02:59,880 Speaker 1: if you can eliminate all of those reasons for people 49 00:03:00,040 --> 00:03:03,680 Speaker 1: wanting to go, hypothetically they won't go. Gotcha. So this 50 00:03:03,720 --> 00:03:06,320 Speaker 1: would be kind of the Star Trek future where you 51 00:03:06,360 --> 00:03:11,040 Speaker 1: have created a world where you've eliminated need. Therefore 52 00:03:11,480 --> 00:03:14,120 Speaker 1: war is a thing of the past. Yeah, got you. 53 00:03:14,360 --> 00:03:16,880 Speaker 1: The other one would be to sort of change us, 54 00:03:17,080 --> 00:03:20,720 Speaker 1: to change our outlook or to change our nature, 55 00:03:21,400 --> 00:03:23,679 Speaker 1: or at least give us some kind of perspective that 56 00:03:23,720 --> 00:03:27,320 Speaker 1: would make war ridiculous to where we just realize it's 57 00:03:27,320 --> 00:03:30,440 Speaker 1: no good and we don't want to do it. And 58 00:03:30,760 --> 00:03:33,920 Speaker 1: we've got some actual examples that fall into these 59 00:03:34,000 --> 00:03:37,240 Speaker 1: various categories, some in more than one. Well, one 60 00:03:37,240 --> 00:03:39,440 Speaker 1: of the funny things is you don't have to go 61 00:03:39,600 --> 00:03:43,720 Speaker 1: digging into the annals of crank history to find people 62 00:03:43,760 --> 00:03:46,120 Speaker 1: who thought world peace would come about by one of 63 00:03:46,280 --> 00:03:49,480 Speaker 1: these methods. In fact, you can find really 64 00:03:49,520 --> 00:03:54,920 Speaker 1: smart, famous, powerful, influential people who thought technology could get 65 00:03:55,000 --> 00:03:57,320 Speaker 1: us down one of these roads to world peace. And 66 00:03:57,360 --> 00:03:59,720 Speaker 1: I think the first one we should talk about is 67 00:03:59,800 --> 00:04:03,840 Speaker 1: a guy who's very Internet famous these days, Nikola Tesla. 68 00:04:04,040 --> 00:04:06,720 Speaker 1: Although you did say, you know, you don't have to 69 00:04:06,720 --> 00:04:10,600 Speaker 1: look at cranks. Well, oh, you're getting on your 70 00:04:10,640 --> 00:04:16,440 Speaker 1: anti-Tesla high horse. Look, I just... Electrical horse.
The episode that 71 00:04:16,640 --> 00:04:19,520 Speaker 1: just published was a rerun of our episode on Tesla 72 00:04:19,600 --> 00:04:22,440 Speaker 1: where I talked about advocating for Tesla and the Tesla 73 00:04:22,560 --> 00:04:25,400 Speaker 1: versus Edison debate. Well, I mean, it's not that 74 00:04:25,440 --> 00:04:28,520 Speaker 1: I'm against Tesla, but it's pretty true that he had 75 00:04:28,560 --> 00:04:32,520 Speaker 1: some mental health issues. But at any rate, let's talk 76 00:04:32,560 --> 00:04:35,799 Speaker 1: about this idea, this idea of an invention 77 00:04:36,279 --> 00:04:40,960 Speaker 1: that negates war. Right, Tesla thought that you could build 78 00:04:41,040 --> 00:04:45,920 Speaker 1: something that would make war impossible. What was that thing? Well, 79 00:04:46,000 --> 00:04:50,080 Speaker 1: it's Tesla's famous death ray, which immediately, as soon as 80 00:04:50,080 --> 00:04:52,600 Speaker 1: you hear the name, you think, yeah, that sounds peaceful. 81 00:04:52,720 --> 00:04:54,760 Speaker 1: I don't know if he actually called it a death ray. 82 00:04:55,080 --> 00:04:57,320 Speaker 1: I don't think those were his words. The New York Times called 83 00:04:57,360 --> 00:04:59,559 Speaker 1: it a death ray. Yeah, so we've got a quote 84 00:04:59,600 --> 00:05:03,080 Speaker 1: from Nikola Tesla. This is from an article from nineteen 85 00:05:03,240 --> 00:05:06,440 Speaker 1: thirty seven called A Machine to End War, which featured 86 00:05:06,480 --> 00:05:10,919 Speaker 1: an extended interview with Nikola Tesla, where he gives the 87 00:05:10,960 --> 00:05:14,760 Speaker 1: following quote. We cannot abolish war by outlawing it. We 88 00:05:14,839 --> 00:05:18,000 Speaker 1: cannot end it by disarming the strong. War can be 89 00:05:18,080 --> 00:05:21,240 Speaker 1: stopped not by making the strong weak, but by making 90 00:05:21,360 --> 00:05:25,840 Speaker 1: every nation, weak or strong, able to defend itself. Hitherto, 91 00:05:26,000 --> 00:05:29,080 Speaker 1: all devices that could be used for defense could also 92 00:05:29,120 --> 00:05:33,159 Speaker 1: be utilized to serve for aggression. This nullified the value 93 00:05:33,279 --> 00:05:36,839 Speaker 1: of the improvement for purposes of peace. But I was 94 00:05:36,920 --> 00:05:40,400 Speaker 1: fortunate enough to evolve a new idea and to perfect 95 00:05:40,480 --> 00:05:43,960 Speaker 1: means which can be used chiefly for defense. If it 96 00:05:44,120 --> 00:05:48,160 Speaker 1: is adopted, it will revolutionize the relations between nations. It 97 00:05:48,200 --> 00:05:53,160 Speaker 1: will make any country, large or small, impregnable against armies, airplanes, 98 00:05:53,200 --> 00:05:57,080 Speaker 1: and other means for attack. My invention requires a large plant, 99 00:05:57,880 --> 00:06:00,640 Speaker 1: but once it is established, it will be possible to 100 00:06:00,760 --> 00:06:04,520 Speaker 1: destroy anything, men or machines, approaching within a radius of 101 00:06:04,560 --> 00:06:07,919 Speaker 1: two hundred miles. It will, so to speak, provide a 102 00:06:08,040 --> 00:06:13,840 Speaker 1: wall of power, offering an insuperable obstacle against any effective aggression. 103 00:06:14,320 --> 00:06:17,640 Speaker 1: So when he says plant, he of course means power plant. 104 00:06:17,800 --> 00:06:20,760 Speaker 1: He does not mean a literal plant. Right. Yeah, not yet, not an enormous 105 00:06:20,800 --> 00:06:24,840 Speaker 1: redwood or something.
So this death ray, as 106 00:06:24,880 --> 00:06:27,680 Speaker 1: it was referred to in the article, was never actually fleshed out. 107 00:06:27,720 --> 00:06:29,800 Speaker 1: As far as we know. Tesla, first of all, was 108 00:06:29,839 --> 00:06:32,120 Speaker 1: famous for not writing a lot of stuff down. 109 00:06:32,520 --> 00:06:34,520 Speaker 1: Or at least that's what he claimed. He claimed he 110 00:06:34,560 --> 00:06:39,320 Speaker 1: could envision inventions completely fully formed in his head and 111 00:06:39,360 --> 00:06:42,960 Speaker 1: even take them apart virtually in his head and examine 112 00:06:43,000 --> 00:06:45,400 Speaker 1: them to see how they worked, and then eventually build 113 00:06:45,480 --> 00:06:47,880 Speaker 1: the things, and they worked exactly the way they were 114 00:06:47,920 --> 00:06:51,800 Speaker 1: supposed to. That's part of the Tesla story. Whether 115 00:06:51,920 --> 00:06:54,800 Speaker 1: or not that's true, I don't know. But at any rate, 116 00:06:55,520 --> 00:06:58,880 Speaker 1: we don't have any evidence that Tesla had anything remotely 117 00:06:58,920 --> 00:07:03,800 Speaker 1: resembling a ray or what the actual mechanism would have been. Well, 118 00:07:03,839 --> 00:07:07,680 Speaker 1: and he says, my apparatus projects particles which may be 119 00:07:07,800 --> 00:07:11,640 Speaker 1: relatively large or of microscopic dimensions, enabling us to convey 120 00:07:11,680 --> 00:07:14,600 Speaker 1: to a small area at a great distance trillions of 121 00:07:14,640 --> 00:07:17,760 Speaker 1: times more energy than is possible with rays of any kind, 122 00:07:18,640 --> 00:07:22,800 Speaker 1: which doesn't really sound like it means anything. But at 123 00:07:22,800 --> 00:07:26,080 Speaker 1: any rate, during this time of Tesla's life, he 124 00:07:26,120 --> 00:07:29,800 Speaker 1: was in his seventies, and this was when he was 125 00:07:29,880 --> 00:07:34,200 Speaker 1: really kind of, it appeared, mentally breaking down. 126 00:07:34,760 --> 00:07:40,040 Speaker 1: He had already shown some signs of obsessive compulsive behaviors, 127 00:07:40,400 --> 00:07:44,160 Speaker 1: possibly even some paranoid schizophrenia, because it was around this 128 00:07:44,240 --> 00:07:48,160 Speaker 1: time also when he claimed that he had received transmissions 129 00:07:48,200 --> 00:07:51,840 Speaker 1: from people, either from Venus or Mars. So you know, 130 00:07:52,000 --> 00:07:54,520 Speaker 1: things were a little rough for Tesla. He was 131 00:07:54,560 --> 00:07:57,440 Speaker 1: also in the process of moving from one hotel to 132 00:07:57,480 --> 00:08:00,960 Speaker 1: the other because he would get evicted due to incurring 133 00:08:01,160 --> 00:08:03,720 Speaker 1: enormous debts. But because he had such a rock star 134 00:08:03,800 --> 00:08:07,920 Speaker 1: status as a physicist and an electrical engineer, he 135 00:08:07,920 --> 00:08:10,600 Speaker 1: would be invited to go live in another hotel until 136 00:08:10,600 --> 00:08:14,000 Speaker 1: he had run up the debts there. So he may 137 00:08:14,120 --> 00:08:17,560 Speaker 1: very well have just been trying to earn as much 138 00:08:17,560 --> 00:08:20,720 Speaker 1: money as he possibly could selling this idea. And 139 00:08:20,760 --> 00:08:23,160 Speaker 1: you know, not necessarily. I don't mean that he didn't 140 00:08:23,200 --> 00:08:25,280 Speaker 1: think it was real.
He may very well have believed that 141 00:08:25,360 --> 00:08:28,680 Speaker 1: he could in fact produce this device, but he didn't. 142 00:08:28,960 --> 00:08:32,800 Speaker 1: So. But this definitely falls into the realm of 143 00:08:32,920 --> 00:08:36,360 Speaker 1: let's make war impossible by creating something that 144 00:08:36,400 --> 00:08:39,800 Speaker 1: would prevent the very act of aggression from reaching the 145 00:08:39,880 --> 00:08:44,520 Speaker 1: intended target. Once another country knows that you have this 146 00:08:44,640 --> 00:08:48,440 Speaker 1: capability to stop any incoming attack, there's no reason they 147 00:08:48,480 --> 00:08:52,480 Speaker 1: would ever attack you, because it wouldn't work. Right, of course, 148 00:08:52,520 --> 00:08:54,840 Speaker 1: I mean that's the logic behind it, and 149 00:08:54,920 --> 00:08:59,120 Speaker 1: the idea, actually... Now, whether you're talking about a ray or 150 00:08:59,160 --> 00:09:01,160 Speaker 1: a thing that he doesn't want to call a ray 151 00:09:01,320 --> 00:09:04,959 Speaker 1: but is instead projecting particles or whatever it is, this 152 00:09:05,040 --> 00:09:08,760 Speaker 1: idea has not necessarily died, the basic idea of creating 153 00:09:08,880 --> 00:09:14,280 Speaker 1: a technological infrastructure that would repel automatically all incoming attacks 154 00:09:14,320 --> 00:09:18,680 Speaker 1: and make them pointless. How about the Star Wars initiative? Yeah, 155 00:09:18,679 --> 00:09:22,400 Speaker 1: the Strategic Defense Initiative in the nineteen eighties, often 156 00:09:22,480 --> 00:09:26,000 Speaker 1: referred to derisively as the Star Wars program. Yeah, I've 157 00:09:26,040 --> 00:09:28,840 Speaker 1: really wanted to do a full episode about this, to 158 00:09:28,920 --> 00:09:33,240 Speaker 1: really explain what the concept was, what the motivations were, 159 00:09:33,480 --> 00:09:37,720 Speaker 1: because this ends up being a very political 160 00:09:37,760 --> 00:09:40,839 Speaker 1: story, not just a technology story. In fact, 161 00:09:40,920 --> 00:09:44,280 Speaker 1: it's more political than technological in many ways. The whole 162 00:09:44,360 --> 00:09:47,439 Speaker 1: Strategic Defense Initiative was fueled by the Cold War between 163 00:09:47,440 --> 00:09:50,440 Speaker 1: the United States and the then Soviet Union, and the 164 00:09:50,520 --> 00:09:53,559 Speaker 1: idea was that if you have a system in place 165 00:09:53,880 --> 00:09:58,840 Speaker 1: that can block any incoming missile attack, then your country 166 00:09:59,000 --> 00:10:02,040 Speaker 1: is going to be at an advantage and be safe 167 00:10:02,080 --> 00:10:05,920 Speaker 1: from aggression. This was during an era that we'll 168 00:10:05,960 --> 00:10:11,520 Speaker 1: talk about very shortly, the whole idea of mutually assured destruction, 169 00:10:11,720 --> 00:10:14,000 Speaker 1: but we'll get to that in a little bit. So 170 00:10:14,600 --> 00:10:17,920 Speaker 1: very similar idea. Yeah, there's another idea that's a lot 171 00:10:18,000 --> 00:10:21,880 Speaker 1: like this, which is actually in use today, Israel's Iron 172 00:10:21,960 --> 00:10:23,840 Speaker 1: Dome system. Have you read about this? It's been in 173 00:10:23,840 --> 00:10:26,600 Speaker 1: the news lately. Yeah, it went into effect in two 174 00:10:26,600 --> 00:10:31,360 Speaker 1: thousand eleven. This is an anti-rocket defense system.
175 00:10:31,440 --> 00:10:34,920 Speaker 1: So it's meant to intercept rockets that are fired from 176 00:10:34,960 --> 00:10:38,520 Speaker 1: outside of Israel into Israel, and it does this by 177 00:10:38,679 --> 00:10:43,680 Speaker 1: firing off interceptor missiles. They're called Tamir surface-to-air missiles. 178 00:10:44,320 --> 00:10:47,440 Speaker 1: And you've got a radar system that first detects 179 00:10:47,440 --> 00:10:50,679 Speaker 1: an incoming rocket, and then you have computers running predictive 180 00:10:51,200 --> 00:10:54,959 Speaker 1: software that will look at the pathway of the rocket, 181 00:10:55,000 --> 00:10:58,640 Speaker 1: predict which way it's going, and then send an interceptor 182 00:10:58,720 --> 00:11:02,640 Speaker 1: missile to destroy that rocket before it reaches its intended target. 183 00:11:02,960 --> 00:11:06,439 Speaker 1: It's even supposed to only focus on rockets that are 184 00:11:06,480 --> 00:11:09,200 Speaker 1: aimed at populated areas. So if a rocket were to 185 00:11:09,960 --> 00:11:13,080 Speaker 1: be aimed at an open area where there's not likely 186 00:11:13,120 --> 00:11:17,360 Speaker 1: anyone to be harmed, it won't target that rocket. The 187 00:11:17,400 --> 00:11:21,040 Speaker 1: reason being that interceptor missiles are expensive. We're talking like 188 00:11:22,160 --> 00:11:25,120 Speaker 1: just shy of a hundred thousand dollars a pop. So 189 00:11:25,400 --> 00:11:28,600 Speaker 1: also I'm imagining that the system at large must just 190 00:11:28,679 --> 00:11:32,720 Speaker 1: be incredibly expensive and complicated, because if it's actually able 191 00:11:32,720 --> 00:11:36,520 Speaker 1: to calculate the trajectory of an incoming rocket to figure 192 00:11:36,520 --> 00:11:40,320 Speaker 1: out whether it needs to intercept or not, I mean wow. 193 00:11:40,640 --> 00:11:44,040 Speaker 1: And there's been some criticism of the system. One is 194 00:11:44,120 --> 00:11:49,560 Speaker 1: based on just a skepticism that it's really effective, the 195 00:11:49,600 --> 00:11:53,240 Speaker 1: idea being that it's possible to fake a display of 196 00:11:53,320 --> 00:11:56,800 Speaker 1: its effectiveness. Because just recently there was a news 197 00:11:56,840 --> 00:11:59,880 Speaker 1: story where the system was able to shoot down fifty 198 00:12:00,120 --> 00:12:04,440 Speaker 1: different incoming rockets that were fired simultaneously, and 199 00:12:04,440 --> 00:12:07,839 Speaker 1: that's really impressive. But there are some critics who suggest, 200 00:12:07,880 --> 00:12:11,280 Speaker 1: kind of in a conspiracy theory way, that it could 201 00:12:11,360 --> 00:12:14,480 Speaker 1: just be the Israeli government firing off the 202 00:12:14,480 --> 00:12:17,680 Speaker 1: interceptor missiles and then having them detonate, because you can't 203 00:12:17,720 --> 00:12:20,040 Speaker 1: see these rockets with the naked eye. When they're flying 204 00:12:20,480 --> 00:12:23,000 Speaker 1: through the air, they're too small for you to notice, 205 00:12:23,000 --> 00:12:24,760 Speaker 1: and they fly too high for you to notice. So 206 00:12:25,040 --> 00:12:29,320 Speaker 1: it's possible that you could detonate interceptor missiles 207 00:12:29,360 --> 00:12:32,800 Speaker 1: and then say, oh, that's a successful interception.
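[Editor's aside: purely as an illustration of the detect, predict, and selectively intercept pipeline described above, here is a minimal Python sketch. It assumes a drag-free ballistic trajectory and a made-up list of populated zones; every name, number, and threshold is hypothetical and has nothing to do with the actual Iron Dome software.]

```python
import math

# Hypothetical populated zones as (x, y, radius) in meters; illustrative values only.
POPULATED_ZONES = [(10_000.0, 2_000.0, 1_500.0), (4_000.0, 8_000.0, 2_000.0)]
GRAVITY = 9.81  # m/s^2

def predicted_impact_point(x, y, z, vx, vy, vz):
    """Predict where a detected rocket lands, assuming simple drag-free
    ballistic flight, from its radar-estimated position (m) and velocity (m/s)."""
    # Solve z + vz*t - 0.5*g*t^2 = 0 for the positive time of impact.
    disc = vz ** 2 + 2 * GRAVITY * z
    t_impact = (vz + math.sqrt(disc)) / GRAVITY
    return x + vx * t_impact, y + vy * t_impact

def should_intercept(track):
    """Fire an interceptor only if the predicted impact point falls inside a
    populated zone, mirroring the 'don't waste an expensive interceptor' rule."""
    ix, iy = predicted_impact_point(*track)
    return any(math.hypot(ix - zx, iy - zy) <= r for zx, zy, r in POPULATED_ZONES)

# Example radar track: position (x, y, altitude) in meters, velocity in m/s.
track = (0.0, 0.0, 3_000.0, 120.0, 25.0, 40.0)
print("fire interceptor" if should_intercept(track) else "let it land in open ground")
```

[The point of the sketch is just the shape of the decision the hosts describe: detect the rocket, project its path forward, and only spend an interceptor when the projected impact point threatens a populated area.]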
I'm not 208 00:12:33,000 --> 00:12:34,760 Speaker 1: going to go so far as to say that that 209 00:12:34,800 --> 00:12:39,160 Speaker 1: conspiracy theory holds merit. I'm more inclined to believe that 210 00:12:39,200 --> 00:12:43,320 Speaker 1: this is in fact an actual demonstration of it working. 211 00:12:43,360 --> 00:12:46,520 Speaker 1: But there are also critics who say that any 212 00:12:46,559 --> 00:12:49,880 Speaker 1: system like this, whether it's the Strategic Defense Initiative 213 00:12:50,040 --> 00:12:54,960 Speaker 1: or the Iron Dome initiative, can create an unhealthy 214 00:12:55,040 --> 00:12:59,160 Speaker 1: political environment, meaning that if you have this kind of 215 00:12:59,160 --> 00:13:02,640 Speaker 1: technology at your disposal, you may feel that you are 216 00:13:02,920 --> 00:13:06,840 Speaker 1: to some degree invulnerable. It's weird to say to some degree, 217 00:13:06,960 --> 00:13:10,120 Speaker 1: but at any rate that you are largely invulnerable to 218 00:13:10,200 --> 00:13:13,880 Speaker 1: incoming attack, which some argue means that you have less 219 00:13:13,880 --> 00:13:17,160 Speaker 1: of an incentive to pursue peace. Well, yeah, if 220 00:13:17,200 --> 00:13:19,600 Speaker 1: we want to transition to the next item on our 221 00:13:19,640 --> 00:13:23,000 Speaker 1: list here, some people might say that it's very important 222 00:13:23,120 --> 00:13:27,920 Speaker 1: that we all feel vulnerable in order to maintain peace. Yeah. Yeah, 223 00:13:27,960 --> 00:13:30,200 Speaker 1: this is where we get into the idea of mutually 224 00:13:30,280 --> 00:13:33,600 Speaker 1: assured destruction. So this is, it doesn't have a 225 00:13:33,679 --> 00:13:36,120 Speaker 1: very friendly name, but believe it or not, for the 226 00:13:36,120 --> 00:13:38,000 Speaker 1: past century, there have been a lot of people who 227 00:13:38,120 --> 00:13:41,200 Speaker 1: thought that this was one of the best technological routes 228 00:13:41,200 --> 00:13:43,520 Speaker 1: to world peace. Yeah. And in fact, this 229 00:13:43,600 --> 00:13:46,840 Speaker 1: is kind of piggybacked onto Tesla's idea, right, the idea 230 00:13:46,920 --> 00:13:50,880 Speaker 1: that if you are in possession of enough firepower of 231 00:13:50,960 --> 00:13:55,240 Speaker 1: whatever sort, that that's enough to deter people or 232 00:13:55,280 --> 00:13:59,200 Speaker 1: other countries from attacking you. This ends up being repeated 233 00:13:59,240 --> 00:14:01,560 Speaker 1: over and over again, up through the 234 00:14:01,559 --> 00:14:04,400 Speaker 1: Cold War and before the Cold War. Before the Cold War, 235 00:14:04,520 --> 00:14:08,360 Speaker 1: really? Yeah, Hiram Maxim, who invented the machine gun, said 236 00:14:08,480 --> 00:14:10,720 Speaker 1: this device is going to end all war because it's 237 00:14:10,760 --> 00:14:13,880 Speaker 1: so dangerous that no one would dare attack. Yeah, and 238 00:14:13,920 --> 00:14:17,079 Speaker 1: that ended up being very much wrong, as the 239 00:14:17,360 --> 00:14:19,400 Speaker 1: whole story of World War One and World War Two 240 00:14:19,560 --> 00:14:23,600 Speaker 1: certainly proves. Orville Wright believed that the airplane would 241 00:14:23,720 --> 00:14:27,480 Speaker 1: end war.
The airplane would be such an incredibly superior 242 00:14:27,600 --> 00:14:31,320 Speaker 1: vehicle that there'd be no reason to ever declare war 243 00:14:31,720 --> 00:14:35,120 Speaker 1: for fear of what would rain down upon you. Obviously, 244 00:14:35,200 --> 00:14:37,640 Speaker 1: that became a very important tool in war. In fact, 245 00:14:37,680 --> 00:14:39,520 Speaker 1: I think there are some people who would look at 246 00:14:39,560 --> 00:14:42,840 Speaker 1: those two examples you just cited and say that those 247 00:14:42,960 --> 00:14:46,200 Speaker 1: might have been some of the leading causes of World 248 00:14:46,200 --> 00:14:49,680 Speaker 1: War One, or at least the leading tools. Well, I 249 00:14:49,720 --> 00:14:52,080 Speaker 1: mean causes in the sense that obviously there was a 250 00:14:52,080 --> 00:14:55,640 Speaker 1: lot of international political tension. I mean, does anybody really 251 00:14:55,680 --> 00:14:57,960 Speaker 1: know what caused World War One? I think a lot 252 00:14:57,960 --> 00:15:01,920 Speaker 1: of confusion. But certainly an opinion that I've heard before, 253 00:15:01,960 --> 00:15:04,000 Speaker 1: and I don't know enough to dismiss it out of hand, 254 00:15:04,080 --> 00:15:06,480 Speaker 1: is to say that a major factor in what led 255 00:15:06,520 --> 00:15:09,640 Speaker 1: to World War One was the acquiring of new war-making 256 00:15:09,760 --> 00:15:14,920 Speaker 1: technology and people building up their military stockpiles and basically 257 00:15:14,960 --> 00:15:17,440 Speaker 1: looking for a way to test this new stuff out. Well, 258 00:15:17,480 --> 00:15:20,080 Speaker 1: that's part of it. I mean, during the late nineteenth 259 00:15:20,080 --> 00:15:23,160 Speaker 1: and early twentieth centuries, you had all these different countries 260 00:15:23,320 --> 00:15:26,320 Speaker 1: in Europe, including Germany, which had just formed as an 261 00:15:26,320 --> 00:15:29,440 Speaker 1: actual country. I mean a lot of people forget that 262 00:15:29,520 --> 00:15:33,200 Speaker 1: Germany as a unified country didn't really exist until the 263 00:15:33,280 --> 00:15:37,000 Speaker 1: mid nineteenth century, you know, since it had 264 00:15:37,040 --> 00:15:41,800 Speaker 1: previously been part of various empires. Before we talk 265 00:15:41,880 --> 00:15:44,800 Speaker 1: about some more technologies that were supposed to end wars 266 00:15:44,960 --> 00:15:56,200 Speaker 1: but didn't, let's take a quick break. These different countries 267 00:15:56,200 --> 00:16:00,280 Speaker 1: were all trying to make sure they cemented their security, 268 00:16:00,480 --> 00:16:03,000 Speaker 1: that they made sure they were safe from other nations, 269 00:16:03,360 --> 00:16:07,200 Speaker 1: which meant investing in militaries. For Germany and for the 270 00:16:07,280 --> 00:16:10,640 Speaker 1: United Kingdom that was largely naval. They were investing heavily 271 00:16:10,640 --> 00:16:13,920 Speaker 1: in their navies. But what it also 272 00:16:14,000 --> 00:16:18,240 Speaker 1: meant was that you had these incredibly powerful armies 273 00:16:18,360 --> 00:16:22,920 Speaker 1: around Europe that were ready to go at any moment 274 00:16:23,160 --> 00:16:27,120 Speaker 1: with no particular opponent.
But you also have these incredibly 275 00:16:27,160 --> 00:16:32,320 Speaker 1: complex treaties between countries, so when there would be a 276 00:16:32,560 --> 00:16:36,800 Speaker 1: precipitating event, it wasn't, at least in hindsight, 277 00:16:36,920 --> 00:16:39,880 Speaker 1: a huge surprise to see it kind of escalate 278 00:16:39,960 --> 00:16:45,520 Speaker 1: into the major conflict it became. That historical Onion headline 279 00:16:45,560 --> 00:16:47,720 Speaker 1: about World War One, I think it's something like War 280 00:16:47,840 --> 00:16:52,200 Speaker 1: Declared by All. That's fairly accurate. I mean, 281 00:16:52,360 --> 00:16:56,040 Speaker 1: it certainly didn't all happen simultaneously, but again, 282 00:16:56,160 --> 00:16:58,960 Speaker 1: in hindsight, you can totally see how it happened. But 283 00:16:59,280 --> 00:17:01,960 Speaker 1: at the time I'm sure people just thought it was 284 00:17:02,640 --> 00:17:07,359 Speaker 1: unimaginable. But that philosophy continued beyond World 285 00:17:07,440 --> 00:17:10,280 Speaker 1: War One. Yeah. Well, so it's quite obvious that machine 286 00:17:10,320 --> 00:17:13,840 Speaker 1: guns and airplanes have not made war obsolete. But there's 287 00:17:13,880 --> 00:17:19,840 Speaker 1: a more questionable proposal, which is that maybe nuclear 288 00:17:19,880 --> 00:17:25,000 Speaker 1: weapons have. Yeah. Alright, so you've heard of the 289 00:17:25,160 --> 00:17:31,119 Speaker 1: scientist Edward Teller? He was born in Hungary and immigrated 290 00:17:31,119 --> 00:17:34,080 Speaker 1: to the United States. And he had this to say 291 00:17:34,720 --> 00:17:38,639 Speaker 1: about scientists in general, which was, the scientist is not 292 00:17:38,760 --> 00:17:41,560 Speaker 1: responsible for the laws of nature. It is his job 293 00:17:41,640 --> 00:17:45,040 Speaker 1: to find out how these laws operate. It is the scientist's 294 00:17:45,200 --> 00:17:47,600 Speaker 1: job to find the ways in which these laws can 295 00:17:47,640 --> 00:17:51,320 Speaker 1: serve the human will. However, it is not the scientist's 296 00:17:51,400 --> 00:17:55,000 Speaker 1: job to determine whether a hydrogen bomb should be constructed, 297 00:17:55,280 --> 00:17:58,200 Speaker 1: whether it should be used, or how it should be used. Now, 298 00:17:58,320 --> 00:18:01,040 Speaker 1: the reason he said that was that he worked on 299 00:18:01,080 --> 00:18:03,439 Speaker 1: the Manhattan Project, all right. He was one of the 300 00:18:03,440 --> 00:18:05,680 Speaker 1: scientists. In fact, he was one of the three scientists 301 00:18:05,960 --> 00:18:09,240 Speaker 1: who convinced Albert Einstein that he should tell the president, 302 00:18:09,640 --> 00:18:13,080 Speaker 1: the then President of the 303 00:18:13,119 --> 00:18:16,280 Speaker 1: United States, yes, about the powers of nuclear fission, in order to 304 00:18:17,040 --> 00:18:20,800 Speaker 1: precipitate the development of the atomic bomb. And then there 305 00:18:20,840 --> 00:18:25,080 Speaker 1: became another discussion within the same group of the Manhattan 306 00:18:25,080 --> 00:18:28,400 Speaker 1: Project about the development of what they were calling a superbomb, 307 00:18:28,480 --> 00:18:33,119 Speaker 1: a thermonuclear weapon, and Teller was very much on 308 00:18:33,160 --> 00:18:36,520 Speaker 1: the side of we should make this thing.
This 309 00:18:36,560 --> 00:18:39,200 Speaker 1: is something we should invest in. And then there 310 00:18:39,200 --> 00:18:43,760 Speaker 1: were other scientists like Oppenheimer who said this is a 311 00:18:43,760 --> 00:18:45,600 Speaker 1: bad idea, we should not do this thing. And there 312 00:18:45,680 --> 00:18:49,240 Speaker 1: was a lot of disagreement, which led to some pretty 313 00:18:49,240 --> 00:18:51,840 Speaker 1: controversial stuff down the line. But at any rate, you 314 00:18:51,920 --> 00:18:54,320 Speaker 1: had Teller advocating for this. Now, why did he 315 00:18:54,320 --> 00:18:57,760 Speaker 1: advocate for this? Well, he grew up in Europe during 316 00:18:57,800 --> 00:19:02,480 Speaker 1: a very tumultuous time and developed a distinct dislike for 317 00:19:02,640 --> 00:19:08,639 Speaker 1: fascism and communism before leaving to come to America. Yeah, 318 00:19:08,720 --> 00:19:13,399 Speaker 1: he saw Europe being torn apart. I mean it 319 00:19:13,440 --> 00:19:16,440 Speaker 1: had already gone through World War One 320 00:19:16,480 --> 00:19:19,199 Speaker 1: and was entering World War Two, or was about to 321 00:19:19,320 --> 00:19:21,159 Speaker 1: enter World War Two, by the time he was leaving, 322 00:19:21,680 --> 00:19:25,200 Speaker 1: and things just got worse from there. So in his mind, 323 00:19:25,280 --> 00:19:27,879 Speaker 1: one of the worst things in the world was the 324 00:19:27,920 --> 00:19:31,560 Speaker 1: formation of the Soviet Union. He saw communism as being 325 00:19:32,000 --> 00:19:36,080 Speaker 1: probably the greatest danger to the human race, and his 326 00:19:36,160 --> 00:19:40,480 Speaker 1: argument was that if we don't pursue this sort of 327 00:19:40,520 --> 00:19:45,560 Speaker 1: weapons program, this arms race, the Soviet Union certainly will, 328 00:19:46,000 --> 00:19:48,360 Speaker 1: so if we don't do it, we're gonna be left 329 00:19:48,359 --> 00:19:51,800 Speaker 1: behind and we'll be vulnerable. The only way to ensure 330 00:19:51,880 --> 00:19:54,840 Speaker 1: that we're safe is to also engage in an arms 331 00:19:54,960 --> 00:19:59,240 Speaker 1: race to develop these incredibly destructive weapons and thus be 332 00:19:59,560 --> 00:20:01,840 Speaker 1: a big enough threat that the Soviet Union would never 333 00:20:01,920 --> 00:20:05,240 Speaker 1: attack us. I mean, we would surely never attack anyone 334 00:20:05,280 --> 00:20:08,520 Speaker 1: else unprovoked. So this is just for us to make 335 00:20:08,520 --> 00:20:11,199 Speaker 1: sure that they don't attack us. That 336 00:20:11,280 --> 00:20:14,520 Speaker 1: was the general philosophy. So build up your arms to 337 00:20:14,560 --> 00:20:17,560 Speaker 1: the point where you would be such a destructive force 338 00:20:17,720 --> 00:20:20,800 Speaker 1: that it would be crazy to attack you, which is 339 00:20:21,040 --> 00:20:23,880 Speaker 1: again very similar to what we've already talked about. Right, 340 00:20:23,960 --> 00:20:28,800 Speaker 1: So now we have gigantic nuclear stockpiles on Earth as 341 00:20:28,840 --> 00:20:32,439 Speaker 1: a result of this doctrine. But this is 342 00:20:32,520 --> 00:20:35,360 Speaker 1: interesting because I feel like, while on one hand, it's 343 00:20:35,440 --> 00:20:39,480 Speaker 1: not a very friendly sounding doctrine, some people might still 344 00:20:39,600 --> 00:20:43,400 Speaker 1: argue for the wisdom of MAD. Well, it's there.
I mean, 345 00:20:43,440 --> 00:20:45,159 Speaker 1: there would be a lot of people who would. I think it 346 00:20:45,320 --> 00:20:49,600 Speaker 1: is completely wrong to say that it would make war obsolete. 347 00:20:49,720 --> 00:20:52,520 Speaker 1: I mean, there have been tons of wars since nuclear 348 00:20:52,520 --> 00:20:56,960 Speaker 1: weapons were invented, you know, involving nuclear powers. But 349 00:20:56,960 --> 00:20:59,600 Speaker 1: some people, I think, might still say that, well, maybe 350 00:20:59,640 --> 00:21:02,760 Speaker 1: the presence of all these nuclear weapons did prevent all-out 351 00:21:02,960 --> 00:21:06,240 Speaker 1: war between, say, the United States and the Soviet Union. 352 00:21:07,320 --> 00:21:09,080 Speaker 1: We just have to keep in mind that there have 353 00:21:09,160 --> 00:21:12,760 Speaker 1: been tons of proxy wars throughout the years, fought by 354 00:21:12,800 --> 00:21:16,120 Speaker 1: these powers sort of through other countries. Sure. So, 355 00:21:16,119 --> 00:21:19,720 Speaker 1: so essentially what you're saying is the existence of thermonuclear 356 00:21:19,760 --> 00:21:24,560 Speaker 1: weapons has prevented a thermonuclear war. That without the existence 357 00:21:24,560 --> 00:21:28,520 Speaker 1: of these thermonuclear weapons in one or other of the parties, 358 00:21:28,720 --> 00:21:32,480 Speaker 1: the possibility of thermonuclear war rises. But because you have 359 00:21:32,680 --> 00:21:35,880 Speaker 1: this balance where you have people, you know, who 360 00:21:35,920 --> 00:21:39,560 Speaker 1: realized what the implications of starting such a war would be, 361 00:21:40,240 --> 00:21:44,400 Speaker 1: it hasn't happened. Yeah. Even if you were to agree 362 00:21:44,400 --> 00:21:47,160 Speaker 1: with this idea, you might not necessarily think it's 363 00:21:47,160 --> 00:21:49,800 Speaker 1: good for the world as a whole. It might be 364 00:21:49,800 --> 00:21:52,359 Speaker 1: better for the people living in the United States and 365 00:21:52,400 --> 00:21:56,959 Speaker 1: the Soviet Union and not so much for their allies. 366 00:21:57,000 --> 00:21:59,119 Speaker 1: But I'm not even saying I agree with this. I 367 00:21:59,200 --> 00:22:01,639 Speaker 1: just wanted to say, well, this is one that we 368 00:22:01,680 --> 00:22:04,119 Speaker 1: can't put in the totally ridiculous camp, because I 369 00:22:04,119 --> 00:22:07,159 Speaker 1: think there are still people who think that mutually assured 370 00:22:07,200 --> 00:22:11,439 Speaker 1: destruction had some kind of effectiveness. Now, this has also 371 00:22:11,560 --> 00:22:16,320 Speaker 1: led to the very popular trope of creating a doomsday 372 00:22:16,320 --> 00:22:20,000 Speaker 1: device that is so dangerous that that's what protects you 373 00:22:20,080 --> 00:22:23,520 Speaker 1: from attack, right, right. The idea of a fail-deadly 374 00:22:23,640 --> 00:22:27,040 Speaker 1: device as depicted in the movie Dr. Strangelove. 375 00:22:27,080 --> 00:22:30,840 Speaker 1: How would something like a fail-deadly device work? Well, 376 00:22:31,080 --> 00:22:33,640 Speaker 1: in Dr. Strangelove,
the way it works is that it's 377 00:22:33,680 --> 00:22:37,359 Speaker 1: a device that, once initiated, cannot be canceled. In 378 00:22:37,359 --> 00:22:40,040 Speaker 1: Dr. Strangelove, it's the Soviet Union that has built 379 00:22:40,040 --> 00:22:45,479 Speaker 1: a device that will automatically activate if any sort of 380 00:22:45,560 --> 00:22:50,240 Speaker 1: incoming attack, a bombing attack, were to target the Soviet Union, 381 00:22:50,720 --> 00:22:53,760 Speaker 1: and once it detects such an attack, it then initiates 382 00:22:53,760 --> 00:22:57,159 Speaker 1: this device, which you can't turn off, and 383 00:22:57,160 --> 00:23:02,000 Speaker 1: it's designed to kill essentially everything. It's automatic. Yeah, 384 00:23:02,040 --> 00:23:04,359 Speaker 1: and because you can't turn it off, then the idea 385 00:23:04,359 --> 00:23:07,240 Speaker 1: is that's the perfect deterrent, because you just tell everyone, hey, 386 00:23:07,359 --> 00:23:10,399 Speaker 1: if you attack us, this thing goes off, it kills everybody, 387 00:23:10,480 --> 00:23:13,439 Speaker 1: and we can't stop it. Even if we want to, 388 00:23:13,560 --> 00:23:16,960 Speaker 1: we cannot stop this thing. And the 389 00:23:17,080 --> 00:23:20,240 Speaker 1: title character of the movie, Dr. Strangelove, 390 00:23:20,280 --> 00:23:21,879 Speaker 1: he's not the main character, but he is the title 391 00:23:21,960 --> 00:23:25,159 Speaker 1: character, points out that this particular kind of device is 392 00:23:25,240 --> 00:23:29,960 Speaker 1: only useful if the world knows about it, not if it's secret, 393 00:23:30,920 --> 00:23:32,960 Speaker 1: as it is in the movie. But then it turns 394 00:23:32,960 --> 00:23:37,480 Speaker 1: out the Russian ambassador was like, well, 395 00:23:37,640 --> 00:23:41,480 Speaker 1: we were going to have a big event next week. Unfortunately 396 00:23:41,560 --> 00:23:45,240 Speaker 1: you jumped the gun, type of thing. And supposedly 397 00:23:45,920 --> 00:23:50,800 Speaker 1: Dr. Strangelove himself was based off of Teller. Yeah, that's 398 00:23:50,840 --> 00:23:53,359 Speaker 1: the rumor. Although whether or not that's true, 399 00:23:53,359 --> 00:23:56,960 Speaker 1: I do not know, but it's certainly interesting 400 00:23:57,200 --> 00:24:00,360 Speaker 1: that, you know, this kind of idea has 401 00:24:00,440 --> 00:24:05,120 Speaker 1: filtered into the fiction as well as into reality. This 402 00:24:05,200 --> 00:24:07,480 Speaker 1: is also what fed into that Star Wars program we 403 00:24:07,520 --> 00:24:10,800 Speaker 1: talked about earlier, the idea that, well, if the Soviet 404 00:24:10,880 --> 00:24:14,080 Speaker 1: Union also has these major, massive weapons, and we have 405 00:24:14,160 --> 00:24:18,040 Speaker 1: these massive weapons, do we really feel comfortable that just 406 00:24:18,200 --> 00:24:20,720 Speaker 1: the presence of those weapons is enough to deter a 407 00:24:20,840 --> 00:24:23,800 Speaker 1: thermonuclear war? What if we could end up creating a 408 00:24:23,840 --> 00:24:26,879 Speaker 1: system that would shoot down enemy weapons so that we 409 00:24:26,960 --> 00:24:30,680 Speaker 1: remained safe? And that was what really led to 410 00:24:31,080 --> 00:24:34,000 Speaker 1: that Strategic Defense Initiative, and Teller was a major 411 00:24:34,119 --> 00:24:39,240 Speaker 1: proponent of that.
He really wanted to see this enacted. Ultimately, 412 00:24:39,280 --> 00:24:43,400 Speaker 1: that technology did not prove to be very reliable and 413 00:24:43,600 --> 00:24:46,159 Speaker 1: it didn't really seem like it would actually do what 414 00:24:46,200 --> 00:24:49,400 Speaker 1: it was supposed to do at the time. We 415 00:24:49,480 --> 00:24:52,439 Speaker 1: probably could develop much better technology now, but it's a 416 00:24:52,560 --> 00:24:55,360 Speaker 1: very different world now because the Cold War is over. Sure. 417 00:24:56,119 --> 00:25:00,440 Speaker 1: So even if we accept that, okay, what if we 418 00:25:00,560 --> 00:25:05,640 Speaker 1: believe that mutually assured destruction prevented all-out nuclear war 419 00:25:05,960 --> 00:25:10,280 Speaker 1: between the United States and the Soviet Union, it still 420 00:25:10,320 --> 00:25:14,000 Speaker 1: didn't prevent all kinds of other smaller wars during that 421 00:25:14,080 --> 00:25:18,120 Speaker 1: time period. And we can't say that it will necessarily 422 00:25:18,160 --> 00:25:21,160 Speaker 1: always work in the future. I mean, it's something that 423 00:25:21,440 --> 00:25:26,240 Speaker 1: depends on everybody being sort of rational and self-interested, 424 00:25:26,400 --> 00:25:31,639 Speaker 1: and on your technology being reliable and not breaking. 425 00:25:32,080 --> 00:25:34,719 Speaker 1: There are a lot of contingencies that go 426 00:25:34,800 --> 00:25:38,840 Speaker 1: into this being a good strategy for avoiding major warfare. Right, 427 00:25:38,880 --> 00:25:41,679 Speaker 1: not mistaking a flock of geese for an incoming missile attack. 428 00:25:41,720 --> 00:25:46,040 Speaker 1: Yeah. I mean you could argue that 429 00:25:46,280 --> 00:25:49,000 Speaker 1: saying mutually assured destruction is the reason we haven't 430 00:25:49,000 --> 00:25:52,440 Speaker 1: had a nuclear war is equivalent to saying, hey, I've 431 00:25:52,480 --> 00:25:54,760 Speaker 1: got this magic rock that keeps tigers away. Do you 432 00:25:54,760 --> 00:25:57,720 Speaker 1: see any tigers here? That proves it works. Right? I 433 00:25:58,040 --> 00:26:01,119 Speaker 1: think that's pretty apt. Yeah, so we don't want to 434 00:26:01,160 --> 00:26:03,560 Speaker 1: just be doom and gloom here. I mean, no, let's 435 00:26:03,560 --> 00:26:06,560 Speaker 1: stop talking about weapons and look at other ways technology 436 00:26:06,880 --> 00:26:10,240 Speaker 1: could in fact prevent all war in the future and 437 00:26:10,320 --> 00:26:12,679 Speaker 1: lead to a happy, peaceful flower time. Yeah. This is 438 00:26:12,680 --> 00:26:18,320 Speaker 1: more of the idealistic version of people who came up 439 00:26:18,359 --> 00:26:21,600 Speaker 1: with technologies, and their reasoning behind why they 440 00:26:21,640 --> 00:26:24,520 Speaker 1: thought the technology they had either developed or advocated for 441 00:26:24,760 --> 00:26:27,359 Speaker 1: would end war. And one of them goes back to 442 00:26:27,560 --> 00:26:32,239 Speaker 1: a fellow named Marconi. You know he didn't just 443 00:26:32,440 --> 00:26:36,720 Speaker 1: play the mamba. No, he did not. Listen to the radio. 444 00:26:37,000 --> 00:26:39,959 Speaker 1: Uh, he did listen to the radio. He made the radio 445 00:26:40,280 --> 00:26:44,840 Speaker 1: make noise. So Marconi, often credited as the inventor 446 00:26:44,880 --> 00:26:47,960 Speaker 1: of the radio. Can you pronounce his first name?
Can I? 447 00:26:47,960 --> 00:26:52,680 Speaker 1: I could try. Guglielmo. Googly Elmo? Guglielmo. Yeah, I would say, 448 00:26:52,720 --> 00:26:57,200 Speaker 1: but my, sorry, I imagine Elmo from Sesame Street, 449 00:26:57,280 --> 00:27:00,480 Speaker 1: but with googly eyes. My Italian is worse than 450 00:27:00,520 --> 00:27:02,960 Speaker 1: any of the other foreign languages I don't speak. It's 451 00:27:03,000 --> 00:27:04,760 Speaker 1: probably the worst out of all of them. Well, the 452 00:27:04,880 --> 00:27:08,840 Speaker 1: Greek is maybe worse. Please enjoy our ignorance at any rate. Yeah. 453 00:27:08,840 --> 00:27:11,400 Speaker 1: So Marconi, who is credited as the inventor of the radio. 454 00:27:11,440 --> 00:27:14,480 Speaker 1: I know the Tesla fans out there are up in arms, 455 00:27:14,480 --> 00:27:18,040 Speaker 1: and I agree. Marconi used a lot of Tesla's patents. 456 00:27:18,080 --> 00:27:20,280 Speaker 1: According to Tesla, it's like, he's a good fellow. He's using 457 00:27:20,320 --> 00:27:23,440 Speaker 1: seventeen of my patents. But he is the first 458 00:27:23,440 --> 00:27:27,879 Speaker 1: person to transmit an encoded letter in Morse code 459 00:27:27,920 --> 00:27:31,320 Speaker 1: across the Atlantic, and that's why he is often referred 460 00:27:31,320 --> 00:27:33,520 Speaker 1: to as the inventor of the radio. So he's into 461 00:27:33,600 --> 00:27:36,920 Speaker 1: wireless. Very much so. He believed that we were going 462 00:27:36,960 --> 00:27:39,600 Speaker 1: to enter a wireless age where we wouldn't just have 463 00:27:39,640 --> 00:27:44,760 Speaker 1: wireless communication, we'd have wireless power, which goes back to 464 00:27:44,960 --> 00:27:47,640 Speaker 1: what Tesla believed too. He was very much an advocate 465 00:27:47,640 --> 00:27:49,840 Speaker 1: for that as well. But he also thought we'd have 466 00:27:50,200 --> 00:27:56,280 Speaker 1: wireless commerce and wireless fertilization. I don't know what that means, 467 00:27:57,080 --> 00:28:01,240 Speaker 1: but yeah. There was an article where a reporter had 468 00:28:01,280 --> 00:28:06,520 Speaker 1: interviewed Marconi in Technical World magazine in October nineteen twelve. Yeah, 469 00:28:06,560 --> 00:28:09,679 Speaker 1: so this is way back in nineteen twelve, and Marconi 470 00:28:09,880 --> 00:28:15,119 Speaker 1: was kind of just thinking out loud about the possibilities 471 00:28:15,240 --> 00:28:16,840 Speaker 1: of the future. I mean, think about this. This is 472 00:28:16,840 --> 00:28:21,280 Speaker 1: an era where we had just really mastered the 473 00:28:21,359 --> 00:28:26,720 Speaker 1: harnessing of electromagnetic radiation for the purposes of communication. It 474 00:28:26,800 --> 00:28:31,200 Speaker 1: seemed at this point like anything could potentially be possible. Well, 475 00:28:31,240 --> 00:28:34,119 Speaker 1: and let's go right ahead and say it. I believe 476 00:28:34,400 --> 00:28:38,120 Speaker 1: radio certainly did change the world. Absolutely. I mean it 477 00:28:38,240 --> 00:28:40,920 Speaker 1: changed the world, even, you would say, 478 00:28:41,000 --> 00:28:43,920 Speaker 1: almost definitely for the better in lots and lots of ways.
479 00:28:44,680 --> 00:28:48,520 Speaker 1: But when he says, quote, the coming of the wireless 480 00:28:48,560 --> 00:28:53,640 Speaker 1: era will make war impossible because it will make war ridiculous, yeah, 481 00:28:53,800 --> 00:28:57,760 Speaker 1: it turned out a little bit wrong. Yeah. Now, I 482 00:28:57,840 --> 00:29:02,320 Speaker 1: greatly admire the reasoning behind what he said, you know, 483 00:29:02,400 --> 00:29:07,160 Speaker 1: because, now granted, it assumes that people will try 484 00:29:07,200 --> 00:29:11,960 Speaker 1: to comport themselves with compassion and rationality and also let 485 00:29:12,040 --> 00:29:18,760 Speaker 1: go of things that are culturally ingrained for generations, sometimes 486 00:29:19,160 --> 00:29:24,160 Speaker 1: millennia in some areas, and that is a lot to 487 00:29:24,200 --> 00:29:27,160 Speaker 1: ask for. But his idea was that this wireless era 488 00:29:27,720 --> 00:29:30,600 Speaker 1: would result in a world where we are able to 489 00:29:30,680 --> 00:29:34,040 Speaker 1: understand one another and communicate with one another so freely 490 00:29:34,640 --> 00:29:38,760 Speaker 1: that we would end up resolving disagreements before it would 491 00:29:38,800 --> 00:29:41,760 Speaker 1: ever get to a point where warfare would even be 492 00:29:41,800 --> 00:29:46,560 Speaker 1: a consideration. I mean, I don't want to say this 493 00:29:46,600 --> 00:29:49,560 Speaker 1: about someone who's obviously a brilliant man, much smarter than me. 494 00:29:49,640 --> 00:29:53,000 Speaker 1: But that's so naive. It is. But it's so 495 00:29:53,040 --> 00:29:55,440 Speaker 1: sweet you want it to be true. Yeah, the 496 00:29:55,520 --> 00:29:59,680 Speaker 1: idea that, well, maybe we just have a lack of communication, 497 00:30:00,360 --> 00:30:02,600 Speaker 1: we're not getting through to each other, but if we 498 00:30:02,680 --> 00:30:06,880 Speaker 1: have wireless radio going from every country to every other country, 499 00:30:07,160 --> 00:30:11,200 Speaker 1: we can just talk it out. Now, the truly ironic 500 00:30:11,280 --> 00:30:14,600 Speaker 1: part of this is that radio would play an instrumental 501 00:30:14,760 --> 00:30:19,720 Speaker 1: role in warfare, everything from communication to radar and all 502 00:30:19,760 --> 00:30:22,160 Speaker 1: sorts of other applications. I mean, not to mention just 503 00:30:22,800 --> 00:30:26,360 Speaker 1: stoking anger around the world with talk radio hosts. Yeah, well, 504 00:30:26,560 --> 00:30:29,960 Speaker 1: propaganda is a huge part of it. I mean, and 505 00:30:30,000 --> 00:30:32,600 Speaker 1: in fact, propaganda, you could argue, would be the opposite 506 00:30:32,640 --> 00:30:35,560 Speaker 1: of what Marconi was envisioning. Instead, it's this 507 00:30:35,680 --> 00:30:38,520 Speaker 1: kind of nationalistic approach where, you know, it's a 508 00:30:38,600 --> 00:30:43,120 Speaker 1: very simple us versus them story where you make us 509 00:30:43,400 --> 00:30:46,920 Speaker 1: as noble as possible, and them as evil and 510 00:30:46,960 --> 00:30:50,640 Speaker 1: wicked as possible, in your narrative. I mean, 511 00:30:50,680 --> 00:30:54,880 Speaker 1: that's, hell, that's probably... It seems to me exactly 512 00:30:54,920 --> 00:30:57,920 Speaker 1: the opposite of what Marconi's idealistic vision of the future 513 00:30:57,960 --> 00:31:01,560 Speaker 1: would have been.
He also thought with this wireless age 514 00:31:01,640 --> 00:31:04,840 Speaker 1: that we would probably have more access to resources than 515 00:31:04,880 --> 00:31:08,080 Speaker 1: we do now, which would help alleviate the reasons for 516 00:31:08,200 --> 00:31:10,720 Speaker 1: going to war in the first place, not just communication 517 00:31:10,880 --> 00:31:14,120 Speaker 1: but resources. Well, he's not the only person in history. 518 00:31:14,120 --> 00:31:17,520 Speaker 1: In fact, he's not the only famous, brilliant person in 519 00:31:17,640 --> 00:31:22,600 Speaker 1: history to have predicted that changes in access to resources 520 00:31:22,640 --> 00:31:26,320 Speaker 1: would be able to obviate the need for war. I mean, 521 00:31:26,520 --> 00:31:29,880 Speaker 1: there is an idea that, okay, at least a large 522 00:31:30,080 --> 00:31:33,560 Speaker 1: number of the struggles that we experience in our lives 523 00:31:33,680 --> 00:31:37,280 Speaker 1: are over resources. We need food, we need water, we 524 00:31:37,320 --> 00:31:40,480 Speaker 1: need space and shelter, and so we are. I mean, 525 00:31:40,600 --> 00:31:43,760 Speaker 1: life is a struggle. We are competing for things that 526 00:31:43,800 --> 00:31:48,480 Speaker 1: we need. I wonder how much you can really chalk 527 00:31:48,600 --> 00:31:50,600 Speaker 1: war up to this. But let's take a look at 528 00:31:50,640 --> 00:31:53,880 Speaker 1: somebody who thought that you basically could. So you're talking 529 00:31:53,920 --> 00:32:00,600 Speaker 1: about the chemist Pierre Eugène Marcellin Berthelot. Yeah, Berthelot, important chemist, 530 00:32:01,000 --> 00:32:04,760 Speaker 1: very important chemist, brilliant man, absolutely brilliant man. And 531 00:32:05,040 --> 00:32:10,360 Speaker 1: he had some pretty amazing things to say, some amazing predictions 532 00:32:10,360 --> 00:32:13,680 Speaker 1: that he made. He was also interviewed for a magazine article. 533 00:32:13,760 --> 00:32:18,040 Speaker 1: This was in McClure's Magazine in September of eighteen ninety four, and the 534 00:32:18,200 --> 00:32:21,440 Speaker 1: article was titled Foods in the Year Two Thousand: Professor 535 00:32:21,440 --> 00:32:26,360 Speaker 1: Berthelot's Theory That Chemistry Will Displace Agriculture. Now, some of 536 00:32:26,400 --> 00:32:29,160 Speaker 1: his predictions in here, while they haven't exactly come true, 537 00:32:29,280 --> 00:32:33,240 Speaker 1: are kind of perceptive of some of the food innovations 538 00:32:33,280 --> 00:32:35,560 Speaker 1: we might see coming down the road. I mean, obviously 539 00:32:35,560 --> 00:32:38,000 Speaker 1: not by the year two thousand, but still on the way, 540 00:32:38,120 --> 00:32:42,680 Speaker 1: and not solely through chemistry, which was his 541 00:32:42,680 --> 00:32:46,040 Speaker 1: vision. We do now have lab-grown beef. 542 00:32:46,240 --> 00:32:49,920 Speaker 1: Absolutely, yes. So he was looking at the world through 543 00:32:49,920 --> 00:32:52,040 Speaker 1: the eyes of a chemist, and this is right at 544 00:32:52,080 --> 00:32:58,520 Speaker 1: the era where synthesizing chemicals was starting to really become, 545 00:32:58,600 --> 00:33:03,200 Speaker 1: you know, an amazing industry.
And he foresaw an era 546 00:33:03,320 --> 00:33:06,760 Speaker 1: where we'd be able to synthesize organic compounds as easily 547 00:33:06,800 --> 00:33:09,440 Speaker 1: as anything else, to the point where we could synthesize 548 00:33:09,440 --> 00:33:13,040 Speaker 1: in the lab anything. We could synthesize meat, we could 549 00:33:13,080 --> 00:33:18,560 Speaker 1: synthesize vegetables, we could synthesize alcohol and tobacco. And his 550 00:33:18,640 --> 00:33:21,520 Speaker 1: idea was that once we get to this world, which 551 00:33:21,520 --> 00:33:23,880 Speaker 1: he thought would be around the year two thousand, I've 552 00:33:23,880 --> 00:33:26,600 Speaker 1: got more of his quote in a second that 553 00:33:26,840 --> 00:33:30,200 Speaker 1: will really kind of pull in his idea of 554 00:33:30,320 --> 00:33:34,960 Speaker 1: why this would change the world. He was sure that 555 00:33:35,160 --> 00:33:38,520 Speaker 1: this kind of development would mean that, one, you would 556 00:33:38,600 --> 00:33:41,080 Speaker 1: end up eliminating a lot of the problems of the 557 00:33:41,120 --> 00:33:43,840 Speaker 1: world because you would have a surplus of resources, no 558 00:33:43,920 --> 00:33:46,640 Speaker 1: longer scarcity, which would be a good thing on its own, 559 00:33:46,680 --> 00:33:50,680 Speaker 1: of course. And keep in mind this is late nineteenth 560 00:33:50,760 --> 00:33:54,360 Speaker 1: century Europe, a time when there were some serious famines 561 00:33:54,440 --> 00:33:58,760 Speaker 1: going on around Europe that were leading to lots of 562 00:33:58,760 --> 00:34:02,800 Speaker 1: strife, and so hunger was definitely one of the 563 00:34:02,840 --> 00:34:05,320 Speaker 1: big motivators, and he thought, well, if you can eliminate hunger, 564 00:34:05,360 --> 00:34:08,560 Speaker 1: you've eliminated a large reason why countries go to war. 565 00:34:09,120 --> 00:34:13,160 Speaker 1: Take care of that resources problem. He also thought that 566 00:34:13,239 --> 00:34:18,080 Speaker 1: we would develop food that is so delicious and nutritious 567 00:34:18,640 --> 00:34:23,200 Speaker 1: and edifying that it would improve the moral nature of 568 00:34:23,400 --> 00:34:26,600 Speaker 1: mankind itself. So in other words, you would eat this 569 00:34:26,680 --> 00:34:29,759 Speaker 1: food and you would become a better person. And that 570 00:34:29,800 --> 00:34:32,000 Speaker 1: would also help lead to the end of war. So not 571 00:34:32,040 --> 00:34:34,600 Speaker 1: only would we have a surplus of resources, but we'd 572 00:34:34,640 --> 00:34:38,080 Speaker 1: be better people and therefore we would not go to 573 00:34:38,120 --> 00:34:40,759 Speaker 1: war because we would have compassion for our fellow men, 574 00:34:41,800 --> 00:34:45,640 Speaker 1: which is again an interesting concept. The idea that, 575 00:34:45,760 --> 00:34:47,560 Speaker 1: you know, make sure you get your fruits and 576 00:34:47,640 --> 00:34:51,000 Speaker 1: veggies, otherwise you'll go to war with Spain. I mean, 577 00:34:51,040 --> 00:34:53,960 Speaker 1: that's kind of... Well, there were a lot of weird 578 00:34:54,040 --> 00:34:57,319 Speaker 1: ideas floating around at the time about how nutrition, like, 579 00:34:57,480 --> 00:35:01,600 Speaker 1: created the personality of a nation and stuff like that. Yeah, 580 00:35:01,800 --> 00:35:04,160 Speaker 1: they thought weird stuff in Europe in the nineteenth century.
581 00:35:04,239 --> 00:35:07,040 Speaker 1: This is true. Here is one of his longer quotes, 582 00:35:07,080 --> 00:35:10,879 Speaker 1: and I think it's absolutely charming. Again, you probably would 583 00:35:10,920 --> 00:35:13,160 Speaker 1: have to chalk this up as being naive, but still 584 00:35:13,239 --> 00:35:17,520 Speaker 1: very charming. Man should grow in sweetness and nobility because 585 00:35:17,560 --> 00:35:20,640 Speaker 1: he will have done with war, with existence based upon 586 00:35:20,680 --> 00:35:24,799 Speaker 1: the slaughter of beasts. Perhaps this is only a dream. Remember, 587 00:35:25,080 --> 00:35:29,320 Speaker 1: synthetic chemistry, or something that we might call spiritual chemistry, 588 00:35:29,400 --> 00:35:33,160 Speaker 1: will develop means to as profoundly alter man's moral nature 589 00:35:33,440 --> 00:35:37,000 Speaker 1: as material chemistry will change the conditions of his environment. 590 00:35:37,360 --> 00:35:40,120 Speaker 1: There is no fear that art, beauty, and the charm 591 00:35:40,200 --> 00:35:43,400 Speaker 1: of human existence are destined to disappear. If the surface 592 00:35:43,440 --> 00:35:46,160 Speaker 1: of the earth ceases to be divided, and I may 593 00:35:46,239 --> 00:35:50,919 Speaker 1: say disfigured, by the geometrical devices of agriculture, it will 594 00:35:50,960 --> 00:35:55,880 Speaker 1: regain its natural verdure of woods and flowers. Man, becoming 595 00:35:55,920 --> 00:35:59,560 Speaker 1: familiar with the principles and responsibilities of self-government, will 596 00:35:59,600 --> 00:36:03,000 Speaker 1: be more easily governed. The favored portions of the earth 597 00:36:03,040 --> 00:36:05,880 Speaker 1: will become vast gardens in which the human race will 598 00:36:05,960 --> 00:36:09,480 Speaker 1: dwell amid a peace, a luxury, and an abundance recalling 599 00:36:09,520 --> 00:36:14,080 Speaker 1: the golden age of legendary lore. These are dreams, of course, 600 00:36:14,480 --> 00:36:18,719 Speaker 1: but science may surely be permitted to dream sometimes. If 601 00:36:18,719 --> 00:36:20,799 Speaker 1: it were not for our dreams, where would be 602 00:36:20,880 --> 00:36:24,879 Speaker 1: our impulse to progress? Which I think is a beautiful thought. Yeah, sure, 603 00:36:25,040 --> 00:36:28,960 Speaker 1: I I wish it had turned out that way. Yeah. Well, 604 00:36:29,000 --> 00:36:33,560 Speaker 1: I mean this is, like Tesla's case, this is one 605 00:36:33,600 --> 00:36:38,200 Speaker 1: where we haven't actually achieved the technological advance that he 606 00:36:38,239 --> 00:36:41,920 Speaker 1: says is required to bring about this future. So unlike 607 00:36:42,000 --> 00:36:47,520 Speaker 1: mutually assured destruction or unlike wireless radio, it's not one 608 00:36:47,560 --> 00:36:50,279 Speaker 1: we can look at and say, well, the technology is here 609 00:36:50,280 --> 00:36:52,759 Speaker 1: and your prediction fell flat. The technology is just not 610 00:36:52,960 --> 00:36:55,600 Speaker 1: here yet, and maybe it never will be. So well, yeah, 611 00:36:55,640 --> 00:36:57,920 Speaker 1: and even if it does get here, and even if 612 00:36:57,960 --> 00:37:00,280 Speaker 1: we get to a point where it's incredibly nutritious, 613 00:37:00,400 --> 00:37:04,439 Speaker 1: I think this, this moral improvement is probably based upon 614 00:37:04,680 --> 00:37:08,880 Speaker 1: more nineteenth century philosophy than than our current understanding.
Also, 615 00:37:09,800 --> 00:37:13,640 Speaker 1: you know, unless you just take it off the menu, 616 00:37:14,719 --> 00:37:17,800 Speaker 1: you're not gonna stop me from eating some of the horrible, 617 00:37:17,960 --> 00:37:21,960 Speaker 1: horrible food I love. The other thing I would say 618 00:37:22,000 --> 00:37:26,240 Speaker 1: about this is, obviously, at the ground level, the struggle 619 00:37:26,280 --> 00:37:29,399 Speaker 1: over resources does matter very much. But we're talking about 620 00:37:29,400 --> 00:37:33,120 Speaker 1: war here. I mean we're talking about nations mobilizing vast 621 00:37:33,400 --> 00:37:38,160 Speaker 1: organized forces and superior weaponry. I mean, at that level, 622 00:37:38,320 --> 00:37:41,160 Speaker 1: how many wars are started by people who aren't getting 623 00:37:41,280 --> 00:37:43,680 Speaker 1: enough to eat? Yeah, a lot of those. A lot 624 00:37:43,760 --> 00:37:47,760 Speaker 1: of reasons for war are outside of resources. Resources often 625 00:37:47,800 --> 00:37:51,000 Speaker 1: play a very important part. Oh sure, sure, but there 626 00:37:51,080 --> 00:37:54,040 Speaker 1: might be resources beyond what we need to survive 627 00:37:54,120 --> 00:37:57,160 Speaker 1: and be healthy. You know, we might. I can see 628 00:37:57,160 --> 00:38:01,880 Speaker 1: a world where everybody has complete access to nutritious pills 629 00:38:02,000 --> 00:38:05,319 Speaker 1: that are delicious and fill you with happiness and butterflies 630 00:38:05,360 --> 00:38:07,160 Speaker 1: and all that, and that would be a great 631 00:38:07,200 --> 00:38:10,919 Speaker 1: thing in itself, but I still can see in that 632 00:38:11,000 --> 00:38:14,960 Speaker 1: world people going to war over other things, over boundaries 633 00:38:15,000 --> 00:38:20,239 Speaker 1: of national territory, over ideologies certainly, whether religious, or 634 00:38:20,280 --> 00:38:24,879 Speaker 1: over ethnic division, and just hatred. I mean, there are 635 00:38:24,920 --> 00:38:28,120 Speaker 1: lots of reasons that people do the horrible thing we 636 00:38:28,239 --> 00:38:30,600 Speaker 1: call war, and not all of them have to do 637 00:38:30,840 --> 00:38:33,080 Speaker 1: with competing for resources, right, There are a lot of 638 00:38:33,160 --> 00:38:36,680 Speaker 1: other fundamental issues that would have to be addressed, and 639 00:38:36,760 --> 00:38:40,440 Speaker 1: this particular approach would not necessarily address those. So how 640 00:38:40,840 --> 00:38:44,280 Speaker 1: do we get to a point where we're able to 641 00:38:44,440 --> 00:38:48,239 Speaker 1: get a different perspective, be able to to expand our 642 00:38:48,320 --> 00:38:53,040 Speaker 1: minds and and see what's really important? I mean, what 643 00:38:53,120 --> 00:38:56,319 Speaker 1: does it take? It's time for another break, but we 644 00:38:56,400 --> 00:38:59,040 Speaker 1: will be right back to talk about technologies that were 645 00:38:59,080 --> 00:39:11,040 Speaker 1: intended to end war but just didn't turn out that way. Maybe 646 00:39:11,560 --> 00:39:17,200 Speaker 1: once we colonize space. Like Sid Meier's Civilization? Is 647 00:39:17,239 --> 00:39:18,960 Speaker 1: that what we're doing now? No? No, I want to 648 00:39:18,960 --> 00:39:21,560 Speaker 1: tap into one more idea here.
And this isn't so 649 00:39:21,680 --> 00:39:25,160 Speaker 1: much a specific prediction of one person, but there's a 650 00:39:25,200 --> 00:39:29,759 Speaker 1: general idea that's been propagated by several people, including some 651 00:39:29,880 --> 00:39:33,520 Speaker 1: former astronauts, known as the overview effect. Now, that was 652 00:39:33,640 --> 00:39:38,080 Speaker 1: coined by Frank White, right, that particular term, Yeah, that's right. 653 00:39:38,440 --> 00:39:43,360 Speaker 1: Uh So. The idea is when astronauts go up into, 654 00:39:43,520 --> 00:39:46,760 Speaker 1: say, the International Space Station, or to a space capsule 655 00:39:47,040 --> 00:39:49,560 Speaker 1: orbiting the Earth, or they're traveling to the moon, whatever 656 00:39:49,560 --> 00:39:52,640 Speaker 1: it is, and they look back down on the Earth. 657 00:39:53,520 --> 00:40:00,760 Speaker 1: It's striking that many astronauts have independently reported this feeling 658 00:40:01,040 --> 00:40:06,920 Speaker 1: of euphoria and connectedness and togetherness with all of humanity, 659 00:40:07,040 --> 00:40:11,719 Speaker 1: where national boundaries seem to fade away, right, and the 660 00:40:11,840 --> 00:40:17,960 Speaker 1: idea of human strife suddenly seems very ridiculous and pointless 661 00:40:18,400 --> 00:40:21,400 Speaker 1: because we're all in it together, right. I mean, it's 662 00:40:21,440 --> 00:40:25,520 Speaker 1: when you're from that distance and you see that everybody, 663 00:40:25,600 --> 00:40:29,239 Speaker 1: every single human being that is alive, with the exception 664 00:40:29,440 --> 00:40:33,560 Speaker 1: of less than a dozen people, are in your field 665 00:40:33,600 --> 00:40:37,720 Speaker 1: of view right then, because they're all on that planet, 666 00:40:38,200 --> 00:40:41,919 Speaker 1: it's hard to say that, you know, why are, why 667 00:40:41,960 --> 00:40:45,279 Speaker 1: are there divisions? I mean, why isn't there more? Why 668 00:40:45,320 --> 00:40:48,359 Speaker 1: isn't there more of a connectedness? We're clearly all in 669 00:40:48,440 --> 00:40:51,400 Speaker 1: the same place. We're all on this one planet. This 670 00:40:51,480 --> 00:40:53,560 Speaker 1: is also where you get this idea that a lot 671 00:40:53,600 --> 00:40:58,120 Speaker 1: of astronauts report feeling that the world is is ultimately 672 00:40:58,160 --> 00:41:02,400 Speaker 1: a fragile place. It's this tiny blue speck, the tiny 673 00:41:02,400 --> 00:41:05,560 Speaker 1: blue dot that, you know, you heard Carl Sagan talk 674 00:41:05,640 --> 00:41:09,320 Speaker 1: about. Another phrase you often hear: hanging in space. I 675 00:41:09,360 --> 00:41:13,440 Speaker 1: mean it is, it's, it's floating out there, just 676 00:41:13,600 --> 00:41:17,080 Speaker 1: vulnerable to the universe, and this is where we all 677 00:41:17,160 --> 00:41:21,760 Speaker 1: have to live, and that this results in this cognitive shift. 678 00:41:22,040 --> 00:41:24,760 Speaker 1: That that is a phrase that's come up, cognitive shift, 679 00:41:24,840 --> 00:41:28,400 Speaker 1: because it's not just reported as a sort of momentary 680 00:41:28,520 --> 00:41:33,440 Speaker 1: feeling of euphoria or revelation, but something that stays with astronauts 681 00:41:33,480 --> 00:41:37,680 Speaker 1: after they return to Earth. And so if it really 682 00:41:37,760 --> 00:41:41,440 Speaker 1: does happen to everybody.
Now, obviously not all astronauts 683 00:41:41,440 --> 00:41:43,799 Speaker 1: have talked about this, but not all astronauts have been 684 00:41:43,840 --> 00:41:46,359 Speaker 1: asked about it. So it may very well be that 685 00:41:46,880 --> 00:41:51,600 Speaker 1: this is a universal or near universal experience, but some 686 00:41:51,680 --> 00:41:53,640 Speaker 1: have chosen to talk about it and some may not have, 687 00:41:54,000 --> 00:41:57,239 Speaker 1: right. Now, sure, yeah, so we don't know yet, but 688 00:41:58,320 --> 00:42:02,200 Speaker 1: the fact that so many of them experience this does make 689 00:42:02,280 --> 00:42:06,400 Speaker 1: someone wonder, well, if we become a space faring species 690 00:42:06,520 --> 00:42:10,600 Speaker 1: and everybody can have that profound moment of looking back 691 00:42:10,640 --> 00:42:14,600 Speaker 1: at the Earth and realizing the togetherness that we really 692 00:42:14,680 --> 00:42:17,279 Speaker 1: must feel in the face of the vast universe that 693 00:42:17,360 --> 00:42:20,800 Speaker 1: wants to kill us all, right, then maybe it would 694 00:42:20,880 --> 00:42:27,799 Speaker 1: become widespread enough in humanity that war just couldn't happen anymore. Yeah, 695 00:42:28,080 --> 00:42:31,200 Speaker 1: that's a tough solution, right? Yeah. Getting a lot of people, 696 00:42:31,239 --> 00:42:33,799 Speaker 1: I mean we're talking. I like the idea of it. 697 00:42:34,120 --> 00:42:37,360 Speaker 1: I mean, if you if you narrow it down to say, 698 00:42:37,520 --> 00:42:40,200 Speaker 1: let's get the people who would be the ones responsible 699 00:42:40,280 --> 00:42:42,399 Speaker 1: for waging war in the first place, like the ones 700 00:42:42,400 --> 00:42:44,960 Speaker 1: who would be the ones to initiate war, take all 701 00:42:45,000 --> 00:42:48,120 Speaker 1: the executives and generals and everybody up into space, then 702 00:42:48,160 --> 00:42:53,480 Speaker 1: maybe that's a little more manageable than, say, everybody. Um. Yeah. 703 00:42:53,880 --> 00:42:56,320 Speaker 1: This is one of the things where I don't doubt 704 00:42:56,960 --> 00:43:02,200 Speaker 1: that there is a truly profound moment that a person 705 00:43:02,320 --> 00:43:05,600 Speaker 1: experiences when they are able to look back on the 706 00:43:05,640 --> 00:43:11,480 Speaker 1: Earth and see it as this hanging globe in space. 707 00:43:11,920 --> 00:43:14,560 Speaker 1: I don't doubt it at all. I wish I could 708 00:43:14,560 --> 00:43:16,839 Speaker 1: experience it. It's one of those things that I think 709 00:43:16,880 --> 00:43:20,960 Speaker 1: would really mean a lot to me personally. But because 710 00:43:21,000 --> 00:43:23,480 Speaker 1: I seriously doubt that this is ever going to become 711 00:43:23,760 --> 00:43:27,600 Speaker 1: something, within the near future anyway, that the average person 712 00:43:27,800 --> 00:43:31,640 Speaker 1: or even the quote unquote important people would be able 713 00:43:31,680 --> 00:43:37,120 Speaker 1: to do, it's, it's one that I fear is moot 714 00:43:37,480 --> 00:43:40,520 Speaker 1: in this discussion, at least for you know, the the 715 00:43:40,640 --> 00:43:44,839 Speaker 1: near future, like the next twenty to fifty years. Um, 716 00:43:44,840 --> 00:43:48,080 Speaker 1: maybe I'm wrong, which would be 717 00:43:48,080 --> 00:43:49,800 Speaker 1: the best mistake I ever made. I would love to 718 00:43:49,840 --> 00:43:54,440 Speaker 1: be wrong about that. Well.
I mean, as with what 719 00:43:54,480 --> 00:43:56,960 Speaker 1: we were just talking about with Berthelot, having widespread 720 00:43:57,040 --> 00:44:00,360 Speaker 1: nutritious food, this is something that would be good in 721 00:44:00,440 --> 00:44:02,960 Speaker 1: its own right, and we can see reasons that it 722 00:44:02,960 --> 00:44:05,320 Speaker 1: would be great to get lots of people into space 723 00:44:05,400 --> 00:44:09,800 Speaker 1: for exploration and scientific discovery, even if it didn't cause 724 00:44:09,880 --> 00:44:12,440 Speaker 1: this shift. I mean, if this is a side effect 725 00:44:12,440 --> 00:44:16,080 Speaker 1: of something that would be good anyway, that that's a 726 00:44:16,160 --> 00:44:19,719 Speaker 1: sort of double plus, unlike something like mutually assured destruction, 727 00:44:19,800 --> 00:44:22,319 Speaker 1: where you're just hoping it works out and it's not 728 00:44:22,400 --> 00:44:25,680 Speaker 1: a side effect of something that's nice. Yeah, So Joe, 729 00:44:25,719 --> 00:44:28,640 Speaker 1: what if, Um, what if during this era of space exploration, 730 00:44:30,040 --> 00:44:32,560 Speaker 1: the astronauts go up, they look back on Earth, they 731 00:44:32,600 --> 00:44:35,200 Speaker 1: have this profound moment, they go to Mars and then 732 00:44:35,239 --> 00:44:37,640 Speaker 1: declare war on each other because they're like this is Mars. 733 00:44:37,800 --> 00:44:40,560 Speaker 1: It's totally different from... Well, and it's Mars, the planet 734 00:44:40,560 --> 00:44:45,560 Speaker 1: of war. Yeah, it's, it's, it's named Mars. Mars, 735 00:44:46,480 --> 00:44:50,319 Speaker 1: I think it was the Mars Bar candy company, if 736 00:44:50,320 --> 00:44:53,000 Speaker 1: I'm not mistaken. You know, they have to 737 00:44:53,040 --> 00:44:56,160 Speaker 1: renew their contract with NASA or it's gonna expire and 738 00:44:56,239 --> 00:44:59,000 Speaker 1: become the Snickers planet. Yeah, I haven't. I haven't researched that. 739 00:44:59,040 --> 00:45:01,440 Speaker 1: I mean, I know Snickers is owned by Mars. I 740 00:45:01,480 --> 00:45:07,000 Speaker 1: don't know, did Disney name Pluto? So um yeah, I don't know. 741 00:45:07,160 --> 00:45:12,640 Speaker 1: I have to... they did not. Okay, look, this is Tech Stuff. 742 00:45:12,760 --> 00:45:17,080 Speaker 1: This isn't Stuff to Blow Your Mind. I don't know science. No, no, obviously, 743 00:45:17,120 --> 00:45:19,880 Speaker 1: I'm I'm just having a little goofy fun here. But 744 00:45:19,960 --> 00:45:24,400 Speaker 1: this is really like, On one hand, you could say 745 00:45:24,520 --> 00:45:26,960 Speaker 1: we're kind of bumming everybody out because we're talking about 746 00:45:26,960 --> 00:45:29,279 Speaker 1: these technologies that were meant to end war but haven't, 747 00:45:29,400 --> 00:45:31,680 Speaker 1: and at least in a couple of cases the 748 00:45:31,760 --> 00:45:34,480 Speaker 1: jury could still be out. But I like to think 749 00:45:34,480 --> 00:45:38,719 Speaker 1: of it as there's no reason why we can't truly 750 00:45:39,160 --> 00:45:46,200 Speaker 1: examine the the concept of war and really work towards 751 00:45:46,320 --> 00:45:50,719 Speaker 1: eliminating it, whether technologically or otherwise.
I think probably otherwise, 752 00:45:50,760 --> 00:45:53,600 Speaker 1: because a lot of the technological solutions pretty much end 753 00:45:53,680 --> 00:45:57,280 Speaker 1: up with, well, we won't have war because we'll wipe 754 00:45:57,320 --> 00:46:00,560 Speaker 1: everybody out. Yeah, I mean, there are also plenty of 755 00:46:00,560 --> 00:46:04,400 Speaker 1: doomsday science fiction scenarios about that. Right, You create the 756 00:46:04,520 --> 00:46:08,720 Speaker 1: perfect artificial intelligence that's super smart, it's super humanly smart, 757 00:46:09,040 --> 00:46:11,440 Speaker 1: and you ask the computer, you say, I want you 758 00:46:11,520 --> 00:46:14,520 Speaker 1: to end all war on Earth, and the computer then 759 00:46:14,600 --> 00:46:17,040 Speaker 1: runs through all the various scenarios and says that the 760 00:46:17,080 --> 00:46:19,600 Speaker 1: most realistic one is to wipe out all of humanity. 761 00:46:19,640 --> 00:46:24,280 Speaker 1: Therefore you cannot have war anymore. And then the humans go, whoops, 762 00:46:25,480 --> 00:46:28,000 Speaker 1: that's not a great, it's not a great outcome. So 763 00:46:28,440 --> 00:46:31,160 Speaker 1: but you know I am, I am a peaceful kind 764 00:46:31,200 --> 00:46:34,920 Speaker 1: of person myself. I really hope that that we continue. 765 00:46:35,560 --> 00:46:38,360 Speaker 1: I like the optimism and the idealism, even if it 766 00:46:38,400 --> 00:46:41,440 Speaker 1: does border on the naive. It appeals to me that 767 00:46:41,480 --> 00:46:43,920 Speaker 1: people who are really sincere in that and who really 768 00:46:44,480 --> 00:46:49,960 Speaker 1: uh pursue that are also these folks who are far 769 00:46:50,040 --> 00:46:52,319 Speaker 1: more intelligent than I am. I mean, not one of 770 00:46:52,360 --> 00:46:56,040 Speaker 1: these people is a dummy. No, they're all like absolutely brilliant. 771 00:46:56,120 --> 00:47:01,239 Speaker 1: I mean, their discoveries, their contributions to science. You know, 772 00:47:01,400 --> 00:47:04,319 Speaker 1: I talk about Teller, and he's often called the 773 00:47:04,320 --> 00:47:09,360 Speaker 1: father of the hydrogen bomb. He made so many different contributions 774 00:47:09,400 --> 00:47:11,920 Speaker 1: to science that have nothing to do with war, but 775 00:47:12,000 --> 00:47:14,680 Speaker 1: that's what he's known for. But all of these people 776 00:47:14,840 --> 00:47:19,799 Speaker 1: had made amazing contributions to the world's knowledge and to 777 00:47:19,880 --> 00:47:23,520 Speaker 1: technology and science. And so I certainly hope that we 778 00:47:23,719 --> 00:47:28,440 Speaker 1: see more idealistic innovators out there. Uh, like I feel 779 00:47:28,440 --> 00:47:32,960 Speaker 1: that, uh, in some ways, Musk comes across as 780 00:47:33,000 --> 00:47:38,359 Speaker 1: that, Elon Musk. He's certainly also an entrepreneur, but he 781 00:47:38,560 --> 00:47:42,719 Speaker 1: does seem to genuinely believe in a lot of the 782 00:47:42,760 --> 00:47:45,640 Speaker 1: idealistic things he talks about. Well, he's one of those big thinkers. 783 00:47:45,960 --> 00:47:49,120 Speaker 1: He's certainly, he's certainly more intelligent than I am 784 00:47:49,120 --> 00:47:51,520 Speaker 1: as well. I have no problems. I don't mean like he 785 00:47:51,560 --> 00:47:54,200 Speaker 1: has a big brain. I mean like he has big projects.
786 00:47:54,719 --> 00:47:58,560 Speaker 1: He's he's willing to take on things that might seem 787 00:47:58,680 --> 00:48:01,160 Speaker 1: ridiculous at first glance. And I think it's good to 788 00:48:01,160 --> 00:48:03,960 Speaker 1: have people like that, absolutely, because, like we say this 789 00:48:04,000 --> 00:48:06,160 Speaker 1: on Forward Thinking all the time, even if you fail 790 00:48:06,480 --> 00:48:10,080 Speaker 1: in your efforts along the way, you learn, and that 791 00:48:10,160 --> 00:48:14,319 Speaker 1: learning is probably going to be useful too. That wraps up 792 00:48:14,360 --> 00:48:17,120 Speaker 1: this classic episode. One of the things I love about 793 00:48:17,120 --> 00:48:20,840 Speaker 1: technology is how it inspires optimism. But one of the 794 00:48:20,840 --> 00:48:25,360 Speaker 1: issues I have with unfettered optimism is that it ignores 795 00:48:25,640 --> 00:48:28,759 Speaker 1: other aspects of reality that you really should take into 796 00:48:28,800 --> 00:48:32,799 Speaker 1: account before you start proclaiming this is going to end 797 00:48:32,840 --> 00:48:36,239 Speaker 1: all war, or that conflict will no longer be 798 00:48:36,320 --> 00:48:41,320 Speaker 1: a thing after this happens. I feel like optimism tinged 799 00:48:41,520 --> 00:48:45,880 Speaker 1: with realism is important because it means that you know 800 00:48:45,960 --> 00:48:50,040 Speaker 1: you're you're working toward a worthy goal, but you're also 801 00:48:50,120 --> 00:48:54,000 Speaker 1: acknowledging that there are going to be challenges in the way, 802 00:48:54,280 --> 00:48:56,560 Speaker 1: and I think that if you do that, you have 803 00:48:56,640 --> 00:48:59,920 Speaker 1: a better chance of success. That's just my general 804 00:49:00,000 --> 00:49:03,000 Speaker 1: philosophy. However, I consider myself an optimist. I just 805 00:49:03,040 --> 00:49:08,280 Speaker 1: consider myself a pragmatic optimist. If you have suggestions for topics 806 00:49:08,280 --> 00:49:10,640 Speaker 1: I should cover in future episodes of tech Stuff, let 807 00:49:10,719 --> 00:49:12,719 Speaker 1: me know. The best way to do that is over 808 00:49:12,760 --> 00:49:15,800 Speaker 1: on Twitter. The handle for the show is tech stuff 809 00:49:16,040 --> 00:49:20,080 Speaker 1: H S W, and I'll talk to you again really soon. 810 00:49:25,120 --> 00:49:28,080 Speaker 1: Tech Stuff is an I Heart Radio production. For more 811 00:49:28,200 --> 00:49:31,240 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio 812 00:49:31,320 --> 00:49:34,880 Speaker 1: app, Apple Podcasts, or wherever you listen to your favorite shows.