Speaker 1: Get in touch with technology with TechStuff from howstuffworks dot com.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer and I love all things tech, and we are continuing the story of John von Neumann. Now, when we left off in our last episode, von Neumann, the mathematician and all-around smarty pants, had just gotten married for a second time, this time to a woman named Klara Dán. Klara, who hadn't had any real formal advanced training in mathematics, would prove herself to be an incredibly adept mathematician and computer programmer in her own right, a remarkable woman. But back to von Neumann and the world as it was when this was all going on. At this point, Europe was on the verge of war. This was nineteen thirty-seven, so still two years before Germany would invade Poland, but tensions in Europe were mounting, it's safe to say. It seemed like a foregone conclusion that there was going to be some sort of war, and that if war were to break out on a large scale, the United States would probably get involved eventually. And so von Neumann made another decision. This time, he decided to become a naturalized citizen of the United States. He had been living in the US since nineteen thirty, but he had not yet applied for citizenship. However, it was clear that if the United States were to go to war, he would need to be a citizen to guarantee his employment. He was worried that if he did not become a citizen, he might be barred from working in his field if he were seen as a European just living in the United States. According to one story about his application process, von Neumann was coached by his collaborator Oskar Morgenstern, who in the future would write that book on game theory I mentioned in the last episode. That book wouldn't come out till nineteen forty-four, but the two of them were already working closely together.
Speaker 1: He and another mathematician and logician, named Kurt Gödel, were both on their way to their immigration interviews. That is, von Neumann and Gödel together were on their way to these immigration interviews when Morgenstern asked if they had any questions about the Constitution. Reportedly, Kurt Gödel said that he had no questions, but that he had seen several inconsistencies in the wording of the Constitution, and he wondered if perhaps he should point those out to the immigration officers to make them aware of those inconsistencies. Morgenstern reportedly discouraged that. It sounds a little too cute, like a punchline, to me, but I like the story, so I decided I wanted to share it. It lightens things up a little bit anyway.

Speaker 1: Also around this time, John von Neumann received a request from a British mathematician from Cambridge. The request concerned a Procter Fellowship at Princeton University. That mathematician's name was Alan Turing. This would be the same Alan Turing who would later propose what we now call the Turing test. This is the same Alan Turing who would go on to work on an early computer and help the British military crack Germany's secret codes during World War Two. Von Neumann supported Turing's request. He had observed Turing at work on several occasions. He had seen Turing when he visited Cambridge, and he said that he had also observed Turing at Princeton when Turing had visited for a while. Turing and von Neumann would both be instrumental in the early days of computer science, two of the really important pioneers in that world.

Speaker 1: Now, speaking of early computer science, in the early nineteen forties, von Neumann was said to have had a funny conversation with Claude Shannon, and I've done an episode about Claude Shannon in the past. He was a mathematician who would go on to found information theory, so another very important person in this world. Anyway, Shannon was preparing to present the results of his postdoctoral research work, and he wasn't sure what he should call it. He was torn between referring to his logarithmic statistical formulation of data as either information or uncertainty, and he didn't know which way he should go. So he asked von Neumann's opinion on the matter, and John von Neumann reportedly said something to the effect that what Claude Shannon should call it is entropy, because the mathematical mechanics in his logarithmic statistical formulation were the same as those used in entropy equations. And more importantly, no one really knew what the heck entropy was in the first place, so there'd be very little chance of anyone challenging his thesis, which I thought was a pretty clever way of getting around it. So if you have to defend your ideas, just go with something so obscure that no one has enough expertise to contradict you. At least that seems to be the key of the matter.
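To make that story a little more concrete, the quantity Shannon settled on is the now-standard entropy formula, the average information per symbol of a source. Here's a minimal sketch in Python; the example probability distributions are made up purely for illustration.

```python
import math

def shannon_entropy(probabilities):
    """Shannon's H = -sum(p * log2(p)): the average information,
    in bits, carried per symbol by a source whose symbols occur
    with these probabilities. Zero-probability symbols add nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin: 1.0 bit of uncertainty per flip
print(shannon_entropy([0.9, 0.1]))  # a heavily biased coin: about 0.47 bits
```

The more unpredictable the source, the higher the entropy, which is why Shannon was torn between calling it "information" and "uncertainty" in the first place.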
Speaker 1: John von Neumann never wanted to sit around without something to think about. He began to look into the question of logical design, and eventually what would become computer science. He was also studying the mathematics of explosives. He became an expert on shaped charges, which are explosives shaped in a way to focus and direct the explosive energy in a very specific direction. You've probably heard about this; these are different ways of designing explosive devices for different purposes. It was likely this work that would lead to him being tapped to join a top secret research project for the United States military: the development of the atomic bomb, also known as the Manhattan Project. Work had already been going strong on the Manhattan Project by the time John von Neumann was tapped to join it. J. Robert Oppenheimer managed the laboratory, with lots of top-notch scientists working alongside him, like Ernest Lawrence, Stanislaw Ulam, Niels Bohr, and Seth Neddermeyer. Tons of very intelligent, very influential scientists and engineers were working on this project, all of them trying to develop working atomic bombs.
Speaker 1: There were three main areas of research in the Manhattan Project. There was using uranium as the nuclear material that would be at the heart of the atomic bomb, there was using plutonium, and then there was the hydrogen bomb, or fusion bomb. That would be the most powerful of the three, and it was also the most complex and the one on the longest timeline for development. For the uranium atomic bomb, the team had decided to go with a firing mechanism inside the bomb to detonate the actual nuclear payload. This was referred to in general as a gadget, the gadget that would make the bomb go off. And what it would do is shoot one mass of subcritical uranium, essentially a hollow uranium bullet, into another mass of subcritical uranium. In this sense, subcritical means that, individually, the two masses would not have enough uranium-235 atoms to sustain a nuclear reaction. I've talked about this in the nuclear power episodes, but essentially what's happening is that uranium, when it decays, gives off some high-speed neutrons, and if those neutrons were to collide with another unstable uranium atom, they could induce another split, another fission, and the reaction would continue. In a nuclear power plant, you do this in the hopes of creating a contained and sustainable nuclear reaction. With a nuclear bomb, you want something that's going to escalate to explosive force. So the challenge was that you've got to find a way of doing this in a predictable and controllable manner, in the sense that you don't want the bomb to go off prematurely. That's why you're using subcritical masses of uranium. That way, if an atom were to decay, it would not set off a chain reaction that would cause the bomb to go off prematurely.
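As a toy illustration of the subcritical-versus-supercritical idea, here's a bare multiplication model in Python. The factor k and the starting numbers are invented purely for illustration; this is just the arithmetic behind the explanation above, not real weapons physics.

```python
def neutron_population(k, generations, start=100):
    """Toy chain-reaction model: each fission generation multiplies the
    free-neutron population by an effective factor k. With k < 1 the
    assembly is subcritical and the reaction fizzles out; with k > 1
    it is supercritical and the reaction grows on its own."""
    population = start
    history = [population]
    for _ in range(generations):
        population *= k
        history.append(round(population))
    return history

print(neutron_population(0.9, 10))  # subcritical: dwindles toward zero
print(neutron_population(1.5, 10))  # supercritical: runaway growth
```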
Speaker 1: So when you fired this hollow uranium bullet at this mass of uranium, those two subcritical masses would combine into a supercritical mass, and that would start off the reaction that would perpetuate, releasing lots of energy. And in the grand scheme of things, that particular method would be easier to achieve than what would be required for a plutonium bomb. The gun method was not going to cut it with a plutonium-based bomb. For one thing, they had determined that they would need more purified plutonium than they would be capable of producing. It would require too much uranium to go through the purification process to get the purified plutonium. It just wasn't feasible. But the team did determine that they could surround a subcritical mass of plutonium with a chemical explosive, so it's got a chemical explosive coating around it, essentially, and then ignite that chemical explosive so that it would direct the explosion inward. Instead of it being an explosion, it's an implosion. It all goes inward and creates a shock wave that compresses the subcritical plutonium mass so that it becomes supercritical, and then, boom, there's your atomic bomb. But Seth Neddermeyer, who was working on the implosion approach, was having a heck of a time getting a symmetrical implosion, which would be necessary to make this work. His implosions were coming off asymmetrical, and he wasn't sure how to fix that. So enter John von Neumann. He took a look at the problem and suggested changes that would make the implosion method viable, and his ideas encouraged Oppenheimer, who dedicated more resources to testing out von Neumann's calculations. It would become the operating principle that made the Fat Man atomic bomb work. The atomic bomb Little Boy would use the uranium gun approach. There was another one, called Thin Man, but that one never saw real operation because of problems with its design. Work continued for a while to try and make plutonium work with this gun method.
Speaker 1: But they also discovered that if you had enough plutonium to set off the explosion, it would by itself release enough high-speed neutrons to make a pre-detonation reaction a near certainty. So, in other words, if you went with the gun approach and you were holding onto that nuclear bomb for any length of time, there was almost a hundred percent chance that it was going to go off prematurely, just because the plutonium would be giving off enough neutrons to set everything off. And if there's one thing you don't want with your atomic bombs, it's the tendency for them to go off prematurely.

Speaker 1: Now, I've talked about the Manhattan Project before and how complicated it was, not just from a technological standpoint, but also from an ethical standpoint. When the Army took over the atomic bomb project in the early forties, experts determined that it would take about three years to produce enough uranium and plutonium to serve as the material in an atomic bomb. That was in nineteen forty-two, and the projection ended up being correct, so it put the possibility of launching an attack with an atomic bomb in nineteen forty-five. So even when people were working on these projects in forty-two and forty-three and forty-four, they knew that nineteen forty-five would be the earliest that they would be able to detonate an atomic bomb. And this puts you in a really weird position, I would imagine, because if you're working on something and you're hoping to prove that it's a viable weapon, then part of you might be hoping for the war to stretch on long enough for you to be able to use it, which seems pretty dark to me. It doesn't just seem dark, it is super dark. But anyway, the production of the nuclear material was just one of the challenges that the group faced. Everything else would have to be designed and produced to make the atomic bomb a possibility in that tight time frame. So getting enough nuclear material was one challenge.
Speaker 1: Putting together the actual physical bombs, the structures that would make this weapon viable, that was a separate challenge that also had to be completed in that same time frame. On July sixteenth, nineteen forty-five, the researchers were able to conduct the first full-scale test in New Mexico. A little more than a decade later, in nineteen fifty-seven, John von Neumann's cause of death would be determined to be bone cancer, and there have been more than a few researchers who have suggested that his presence at atomic tests, at Bikini Atoll actually, and his proximity to radioactive material likely contributed to the development of that cancer. So while he played an instrumental role in the development of atomic bombs, at least the Fat Man variety of atomic bombs, that work would ultimately, at least apparently, contribute to his eventual demise. I have a lot more to say, but first let's take a quick break to thank our sponsor.

Speaker 1: John von Neumann would work on the mathematics side of the Manhattan Project, so he wasn't actually building bombs. He was working out calculations, figuring out through math how an explosion would happen, essentially creating simulations in this way. This was incredibly important, because it was not practical to do lots of real-world tests with these designs. So he was using mathematics to test these ideas and say, well, based upon the various components, will this work? The project had access to IBM tabulating machines, which von Neumann would work with. These would use punch cards to read and perform operations on data, and then produce output on yet more punch cards. Von Neumann felt that a general purpose machine similar to the ones he worked with at Los Alamos would be useful for all sorts of scientific applications, so he immediately saw the potential for computers. But he wanted something that would be much more flexible, capable of running lots of different types of calculations, not something that's set up to run one specific type of calculation and that's all it can do.
Speaker 1: During the Manhattan Project, he was also selected to serve on the Target Selection Committee in nineteen forty-five. He would be one of the people to recommend the future target sites for the atomic bombs being built in the Manhattan Project, and the selection process for where the bombs would be dropped gets super duper squicky for me. The committee decided that for the atomic bomb to be seen as effective and to act as a deterrent, it should be dropped on an area where it would have the largest impact as far as devastation was concerned, and so that started to create the parameters that they were using to make their choices. It had to be someplace within bombing range. They wanted someplace that would have closely built frame buildings, because that would really show off the destructive power of the bomb, so it would have to be a city. And they also wanted to try and find a place that had not been bombed previously, so that it would be evident how destructive this atomic bomb would be. The atomic bomb would create most of its damage from the primary blast of the weapon, and then the fires that would follow would create a lot more damage in a wider area beyond the blast range. So the committee selected Hiroshima and Nagasaki. Nagasaki would become the target of Fat Man, the bomb with the implosion detonation gadget that von Neumann had worked on. According to Stanford, von Neumann also contributed directly in this effort by calculating the optimal flight path for the bombers to take to minimize the risk of being shot down en route to the target sites. That was another example of the minimax theory von Neumann had worked on nearly two decades earlier, when he was developing game theory. John von Neumann would go on to consult for the RAND Corporation, which at the time was operating as a think tank dedicated to running nuclear war scenarios. Von Neumann would argue for a concept called preventive war.
Speaker 1: You could refer to this as a preemptive strike, or maybe more appropriately a nuclear sucker punch, because a preemptive strike is typically a strategy we associate with two nuclear powers. It's the concept of one nuclear power launching an initial attack in an effort to knock out the second nuclear power's nuclear capabilities as much as possible, so that no retaliation is available. Von Neumann was going further than that. He was actually suggesting that the United States use nuclear weapons against the Soviet Union before the USSR could become a nuclear power at all. So essentially he's saying, launch a full-scale attack on Moscow, because sooner or later they're going to build nuclear weapons, and it's likely going to be sooner, so let's do it, let's wipe them out before they can build these nuclear weapons. His argument was that, assuming nuclear war is inevitable once the Soviet Union developed the ability to make nuclear weapons, the best thing would be to launch that nuclear attack against Moscow. But it was based on that assumption that nuclear war is in fact inevitable once enough superpowers have nuclear capabilities. So far, that hasn't proven to be the case, so yay for that. But von Neumann wasn't convinced that that was necessarily going to hold true. He was actually quoted as having said, "If you say why not bomb them tomorrow, I say why not today? If you say today at five o'clock, I say why not one o'clock?" So he was gung ho on this at the time. The United States, suffice it to say, did not follow up on von Neumann's suggestion, and at least so far, nuclear war has not happened, though I'm personally not a fan of the mutually assured destruction strategy that various countries have taken either. But that's for a different podcast.

Speaker 1: Back at Princeton, von Neumann would become the manager of the Electronic Computer Project. The goal was to design and build an electronic computer capable of using a stored program.
Speaker 1: Von Neumann had served as a consultant on an earlier project out of the University of Pennsylvania. He had by chance met the leaders of that project, which was called the Electronic Numerical Integrator and Computer, better known as ENIAC. ENIAC was built as a general purpose programmable electronic computer. It was funded by the Army Ordnance Department, which wanted a computer capable of calculating complicated ballistics tables. Earlier computers were mostly electromechanical devices, which meant they had real moving parts that operated as switches. But that also meant those computers were subject to wear and tear, and worse, in terms of running lots of calculations, their speed was limited, because they had to rely on the mechanical action of these various components. An electronic computer would eliminate all those moving parts and speed things up considerably. While ENIAC was still being constructed, von Neumann would end up working with the ENIAC creators, J. Presper Eckert and John W. Mauchly, to design the successor to ENIAC, and this would be EDVAC. That stood for Electronic Discrete Variable Automatic Computer. So what was different about EDVAC? What made it different from ENIAC? That would be the computer's internal memory. EDVAC had it; ENIAC didn't. EDVAC would have enough internal memory to store a program as well as the data that the program would work on. So you could take a program and the associated data, feed them to the computer, and the computer would hold it all in what we would now call RAM, or random access memory, and the computer's processor would follow the instructions in the program and perform operations on the associated data. Now, essentially that's how all computers, or most computers anyway, work today. But at the time it was revolutionary, and the name people used to describe it was the von Neumann architecture. Before EDVAC, the program a computer would run was essentially part of the machine itself.
Speaker 1: So, like I said, with the tabulating machines in some cases, and then with specific purpose computers, they could run one program, because that's physically what they were capable of doing. The actual design of the computer itself was, in part, the program. With other computers, you could technically change the program, but it would require physically rewiring the computer, like removing plugs from a plugboard and plugging them into different outlets. That was a laborious process, and it was really easy to mess up. If you plug a plug into the wrong place, suddenly you've got errors in all of your operations, because the plugging was the program.

Speaker 1: Now, while we call this approach to design the von Neumann architecture, it's important to remember this was actually a group effort. It wasn't just coming from John von Neumann. He's the guy who wrote about it and who popularized it, and he already had a celebrity status, so his name was associated with it, but he was not the sole contributor. He actually had lengthy discussions with lots of other computer pioneers, like Eckert and Mauchly, and also Arthur Burks and Herman Goldstine. Together they published the formal explanation for what people would later call the von Neumann architecture back in nineteen forty-six, in a paper titled Preliminary Discussion of the Logical Design of an Electronic Computing Instrument. And just a note here: I'm not saying this to diminish John von Neumann's role in this. He was absolutely pivotal in the development of how computers work today. But it's also irresponsible to ignore the other contributors, so we have to make sure we take time to acknowledge their work as well. While the internal memory element of EDVAC's design tends to get the most attention, technically the von Neumann architecture includes more than just that.
Speaker 1: So in addition to internal memory, the von Neumann architecture would describe a computer that would also have an arithmetic logic unit, which would kind of evolve into the central processing unit; a control unit, which was in charge of sending information to different parts of the computer; an interface for input and an interface for output; and a bus, the pathway that would allow data to transfer. So you had the control unit that was telling data where to go, essentially, and you had the buses that would allow the data to go from one part of the computer to another. Then you had the arithmetic logic unit that would actually execute the operations. This computer would follow specific steps. So a typical program would be: fetch an instruction from internal memory according to the address designated by the program counter; add the length of the instruction to the program counter; use the control unit to decode the instruction and then direct the computer to execute whatever that instruction might be; and then go back to step one and do the next step in the instructions. So that was your basic computer program, which again sounds super primitive. There would never really be a pure von Neumann architecture computer. There were a lot of variations on that, essentially improvements, really, to do stuff like error checking, which the original architecture did not account for. But it was the foundation for modern computing as we know it.
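Here's a minimal sketch of that fetch-decode-execute loop in Python. The tiny instruction set is invented for illustration and looks nothing like EDVAC's actual order code; the point is simply that instructions and data sit in the same memory, and a program counter walks through them.

```python
def run(memory):
    """Fetch-decode-execute loop over a single shared memory."""
    pc = 0   # program counter: address of the next instruction
    acc = 0  # a single accumulator register
    while True:
        opcode, operand = memory[pc]  # 1. fetch the instruction at pc
        pc += 1                       # 2. advance the program counter
        if opcode == "LOAD":          # 3. decode, then 4. execute
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc
        elif opcode == "HALT":
            return memory

# Program and data coexist in one memory: instructions live in cells
# 0 through 3, data in cells 4 through 6. That coexistence is the
# heart of the stored-program idea.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None),
    2, 3, 0,
]
print(run(memory)[6])  # prints 5, the sum left in data cell 6
```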
That honor would go 400 00:25:30,400 --> 00:25:34,200 Speaker 1: to the Manchester Mark one in England, which performed its 401 00:25:34,240 --> 00:25:39,080 Speaker 1: first operations in nineteen EDVAC was part of the Ballistics 402 00:25:39,119 --> 00:25:42,080 Speaker 1: Research Laboratory and it would run operations for several hours 403 00:25:42,080 --> 00:25:47,240 Speaker 1: a day, every day until nineteen sixty two. Meanwhile, over 404 00:25:47,280 --> 00:25:51,960 Speaker 1: at the Institute for Advanced Study in Princeton, which von 405 00:25:52,040 --> 00:25:55,120 Speaker 1: Neumann was heading up, he was he was a part 406 00:25:55,160 --> 00:25:58,800 Speaker 1: of that for all of his life. The I A 407 00:25:59,119 --> 00:26:04,440 Speaker 1: S Machine was under construction. This was another von Neumann 408 00:26:04,560 --> 00:26:07,600 Speaker 1: architecture computer and it also was ready to go in 409 00:26:07,680 --> 00:26:11,640 Speaker 1: nineteen fifty one. Like d VAC, it was a binary computer, 410 00:26:11,760 --> 00:26:14,639 Speaker 1: It had internal memory, and it weighed half a ton. 411 00:26:15,119 --> 00:26:18,240 Speaker 1: It was a big, big machine. This is well before 412 00:26:18,280 --> 00:26:21,280 Speaker 1: the age of manaturization, when you're using stuff like vacuum 413 00:26:21,320 --> 00:26:25,560 Speaker 1: tubes for your instead of transistors. Another area of computer 414 00:26:25,640 --> 00:26:30,560 Speaker 1: science that von Neuman would pioneer was cellular automata. So 415 00:26:31,560 --> 00:26:34,440 Speaker 1: what the heck is that? Well, cellular in this case 416 00:26:34,600 --> 00:26:37,560 Speaker 1: means cells within a grid. So imagine you have a 417 00:26:37,560 --> 00:26:42,159 Speaker 1: sheet of grid paper and now imagine that every square 418 00:26:42,720 --> 00:26:45,919 Speaker 1: on that sheet of grid paper is a cell. So 419 00:26:46,000 --> 00:26:49,280 Speaker 1: what does the concept of automata come in? Well, now, 420 00:26:49,359 --> 00:26:52,879 Speaker 1: Imagine that each of those cells in that grid paper 421 00:26:53,280 --> 00:26:58,240 Speaker 1: has certain rules associated with that cell, and those rules 422 00:26:58,280 --> 00:27:03,280 Speaker 1: relate to the states of the neighboring cells. So let's 423 00:27:03,280 --> 00:27:05,640 Speaker 1: imagine you've got a sheet of grid paper in front 424 00:27:05,680 --> 00:27:07,760 Speaker 1: of you. You pick a grid somewhere in the middle 425 00:27:07,760 --> 00:27:11,280 Speaker 1: of the page. Now, in this case, it would mean 426 00:27:11,320 --> 00:27:14,280 Speaker 1: that the rules for that cell that you're looking at 427 00:27:14,440 --> 00:27:19,000 Speaker 1: depend in part on the state of the four cells 428 00:27:19,040 --> 00:27:22,679 Speaker 1: that border it. In the cell above, below, and to 429 00:27:22,760 --> 00:27:25,280 Speaker 1: either side of it. This is what we would call 430 00:27:25,359 --> 00:27:29,840 Speaker 1: a von Neumann neighborhood. By the way, Now, let's say 431 00:27:29,840 --> 00:27:33,199 Speaker 1: that all the rules have to do with the color 432 00:27:33,400 --> 00:27:37,800 Speaker 1: of the cell. So maybe the central cell that you've picked, 433 00:27:38,040 --> 00:27:40,359 Speaker 1: you you create a rule for that set that says, 434 00:27:40,800 --> 00:27:44,720 Speaker 1: if the cell above me is red, then I should 435 00:27:44,760 --> 00:27:49,280 Speaker 1: turn blue. 
Speaker 1: Now, that approach has been used in a number of really interesting applications. I gave a very abstract, simple version, but there are real-world applications for this, and it would also become one of the components in something von Neumann theorized about that I mentioned in the podcast about MakerBot, and that would be the universal constructor. Von Neumann's universal constructor was an abstract notion of a self-replicating machine that he was able to prove out through this cellular automata approach. So when you would run this machine that was a universal constructor, it would be able to make a copy of itself. It would have blueprints for its own design, a method of following those blueprints, and a way of creating a copy of them, producing a copy of itself, including a copy of the blueprints, so that the copy of itself could then go on to make a copy of itself.
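There's a classic software cousin of a machine that carries and copies its own blueprint: a quine, a program whose output is exactly its own source code. This is not von Neumann's construction, just a compact illustration of the self-description idea, here in Python.

```python
# A quine: running this prints its own source. The string s acts as
# the "blueprint," and printing (s % s) both follows the blueprint
# and reproduces it. (These comment lines are not part of the quine;
# only the two lines below reproduce themselves.)
s = 's = %r\nprint(s %% s)'
print(s % s)
```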
Speaker 1: So he could program each cell in one of these grids so that it would follow a sequence within a certain number of time steps in order to create a specific pattern, to initiate a copy of the original. And in the next cycle of time steps, that copy could make a copy of itself, and so on and so on, over and over and over again. Each cell in von Neumann's original proposal could have one of twenty-nine different states at any given time step, based on the rules he had created. So, you know, I was saying red or blue, or maybe it's clear or red, or whatever. That's obviously binary, that's two states. You're either clear or you're red, or whatever the two states may be. He had twenty-nine different states that each cell could be in. Now, this did not create an actual physical machine capable of doing work, but it acted kind of like a simulator for such a device. And he could make this work. He could make a set of rules that applied to a grid, such that if you were to initiate the action, it would actually create copies of a design over and over and over again, within a certain number of time steps per copy. So it acted as kind of a starting point for a lot of other work in this field, ranging from stuff like the RepRap project, where I was talking about a 3D printer capable of printing its own copy, to the proposal of nanotechnology devices like molecular assemblers. This is kind of a science-fiction-y sort of concept. These would be microscopic assemblers that could construct material molecule by molecule, or maybe even atom by atom, and that would allow us to make all sorts of stuff. If we could get down to construction on the molecular level and scale it so that you could produce something in a reasonable amount of time, you could make all sorts of things.
Speaker 1: This is kind of the basis of the Star Trek concept of having the replicator. Von Neumann wrote a book on the subject, but it was not published until after he died. His health was beginning to fail in the mid nineteen fifties. He received the Enrico Fermi Award from the Atomic Energy Commission in nineteen fifty-six, and at that time he already knew he had cancer and that his time was short. Towards the end of his life, von Neumann, who up to that point had been agnostic, meaning he didn't hold any real belief in the existence or non-existence of God, actually converted to Christianity. The prevailing thought is that he did so because he was terrified of death, and according to one biographer, he was said to have entertained the notion that Blaise Pascal was onto something in the form of Pascal's wager. This is a philosophical argument that states human beings should really believe in God, because they're betting on the existence of God with their lives. So the argument goes that a rational person should behave as if God definitely exists, and should seek to believe in God, because if God does not exist, you don't really lose much when you die; you encounter the exact same fate as everybody else. However, if God does exist, and believing in God and behaving in a certain way is a prerequisite to going to Heaven versus going to Hell, then you stand everything to gain if you go by that philosophy. So this falls in line with von Neumann's game theory, that whole idea of minimizing your losses in the event of a worst-case scenario.
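You can lay the wager out as a tiny decision table and compare the worst case of each choice, which is exactly the minimize-your-losses flavor being described. The payoff values below are invented purely for illustration.

```python
# Pascal's wager as a worst-case decision table. Rows are choices,
# columns are possible worlds; the numbers are made up for illustration.
payoffs = {
    ("believe",     "god_exists"): float("inf"),   # everything to gain
    ("believe",     "no_god"):     -1,             # a minor cost in life
    ("not_believe", "god_exists"): float("-inf"),  # everything to lose
    ("not_believe", "no_god"):     0,              # nothing lost
}

for choice in ("believe", "not_believe"):
    worst = min(payoffs[(choice, world)]
                for world in ("god_exists", "no_god"))
    print(choice, "-> worst case:", worst)
# "believe" has the better worst case, which is the wager's whole argument.
```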
Speaker 1: John von Neumann died on February eighth, nineteen fifty-seven. He truly was a genius, and he made numerous contributions to our understanding of mathematics, not to mention the foundations of modern computing. And I did say at the beginning of these episodes that I would also address a few traits that some people have fondly described as quirks, but that I feel are actually much deeper flaws. One was his love of fast cars. Now, that in itself isn't a flaw. There are a lot of people who love fast cars. Scott Benjamin is a good friend, and he loves cars. But von Neumann would drive so fast and so carelessly that he earned a reputation for wrecking cars, for totaling them. It happened so regularly that one intersection at Princeton was called von Neumann Corner, because he had wrecked more than one car in that location. He was said to often drive while he was distracted, including while he was reading. So that's not ideal, as it shows a level of irresponsibility that could end in the injury or death of someone, whether it's von Neumann or someone completely unconnected to the event at all, other than, you know, being in a collision. Von Neumann was also something of a hedonist. He was known to eat and drink to excess. His love of parties would continue throughout his life, and again, that's not necessarily a flaw unless it is taken to extremes, and from what I've read, it sounds like there might have been a few extreme cases in von Neumann's life. And he was also known to be kind of creepy. He liked being around young women, and he liked looking at their legs, a lot. Apparently he would go so far as to lean down to look underneath desks in order to get a look at legs, and maybe even pick up the skirt of a woman, which is absolutely despicable. It got to a point where he had such a reputation for doing this that some of the secretaries at Los Alamos during the Manhattan Project actually would have cardboard sheets that they could slide in front of their legs under their desks to block his view, which is pretty darn creepy. Not cool. Anyway, there's no question that he was brilliant, a genius.
But I'm also glad that, 571 00:36:13,640 --> 00:36:15,360 Speaker 1: even though he was a genius, not all of 572 00:36:15,360 --> 00:36:18,040 Speaker 1: his ideas were adopted readily, because his argument that the 573 00:36:18,120 --> 00:36:20,920 Speaker 1: United States should launch a preventive strike against the Soviet 574 00:36:21,000 --> 00:36:24,520 Speaker 1: Union to wipe out that country before it could develop 575 00:36:24,560 --> 00:36:30,200 Speaker 1: its own nuclear weapons seems particularly horrific to me. Millions 576 00:36:30,520 --> 00:36:33,480 Speaker 1: of innocent people who had no say in the nuclear 577 00:36:33,520 --> 00:36:37,320 Speaker 1: aspirations of the country they lived in, and no contribution 578 00:36:37,640 --> 00:36:40,919 Speaker 1: toward the development of nuclear weapons, would have died. It 579 00:36:40,920 --> 00:36:43,960 Speaker 1: would have been a massive 580 00:36:44,200 --> 00:36:50,279 Speaker 1: slaughter. I find this notion impossible to justify. 581 00:36:50,440 --> 00:36:56,520 Speaker 1: And while von Neumann argued that a nuclear war would 582 00:36:56,560 --> 00:36:59,279 Speaker 1: be inevitable should the Soviets develop the capability to manufacture 583 00:36:59,320 --> 00:37:02,680 Speaker 1: such weapons, and that a nuclear war would wreak devastation 584 00:37:02,680 --> 00:37:07,160 Speaker 1: much, much larger than a preventive strike against the 585 00:37:07,360 --> 00:37:11,120 Speaker 1: Soviet Union, it turns out that didn't happen, at 586 00:37:11,160 --> 00:37:15,120 Speaker 1: least it hasn't happened yet. So maybe one day there 587 00:37:15,200 --> 00:37:18,880 Speaker 1: will be a nuclear war, which would in fact be 588 00:37:18,880 --> 00:37:21,719 Speaker 1: absolutely terrible. But it may be that it's not a 589 00:37:21,880 --> 00:37:26,360 Speaker 1: foregone conclusion the way von Neumann believed. So we can 590 00:37:26,520 --> 00:37:32,319 Speaker 1: look back and say, at least so far, it seems 591 00:37:32,360 --> 00:37:35,520 Speaker 1: like not bombing a country out of existence for 592 00:37:35,560 --> 00:37:41,960 Speaker 1: fear of it developing nuclear weapons was the right choice. 593 00:37:42,000 --> 00:37:46,200 Speaker 1: The Soviets developed nuclear weapons, but they never launched a 594 00:37:46,360 --> 00:37:50,759 Speaker 1: nuclear attack against us, and we haven't had a war, 595 00:37:50,840 --> 00:37:53,879 Speaker 1: so a preventive strike would have killed millions of people 596 00:37:54,840 --> 00:37:59,759 Speaker 1: without preventing a war, because the war never 597 00:38:00,000 --> 00:38:04,640 Speaker 1: happened anyway. But still, 598 00:38:05,520 --> 00:38:09,759 Speaker 1: all of that being said, we have to remember von 599 00:38:09,840 --> 00:38:18,640 Speaker 1: Neumann made incalculable contributions to multiple disciplines, and the world 600 00:38:18,680 --> 00:38:20,960 Speaker 1: would be a very different place if he had not 601 00:38:21,239 --> 00:38:25,680 Speaker 1: made them. So for that I am thankful, and I 602 00:38:25,719 --> 00:38:28,520 Speaker 1: think it is important that we take into consideration 603 00:38:28,920 --> 00:38:33,240 Speaker 1: all of a person's traits: their strengths, their weaknesses, their virtues, 604 00:38:33,360 --> 00:38:38,280 Speaker 1: and their flaws.
We should not just idolize people without 605 00:38:38,360 --> 00:38:42,600 Speaker 1: critical thought. That is not a responsible thing to do, 606 00:38:42,880 --> 00:38:47,040 Speaker 1: nor should we dismiss all their contributions. We have to 607 00:38:47,080 --> 00:38:52,560 Speaker 1: weigh everything in kind and try to take a human perspective. 608 00:38:53,160 --> 00:38:57,640 Speaker 1: All of us have contributions, all of us have flaws. Ah, 609 00:38:57,680 --> 00:39:01,760 Speaker 1: but yeah, that sums up my thoughts on von Neumann. 610 00:39:02,560 --> 00:39:07,840 Speaker 1: He wrote a ton of very informative, very educational, 611 00:39:07,960 --> 00:39:13,759 Speaker 1: very interesting papers on numerous subjects. I urge you to 612 00:39:13,800 --> 00:39:16,640 Speaker 1: go out and find some of those if you are 613 00:39:16,719 --> 00:39:20,120 Speaker 1: really interested in the various topics I've talked about. They 614 00:39:20,160 --> 00:39:23,680 Speaker 1: are very academic, so if you don't have the schooling 615 00:39:23,920 --> 00:39:28,279 Speaker 1: or the expertise in those areas, you may rapidly find it 616 00:39:29,320 --> 00:39:33,279 Speaker 1: really challenging to understand what is going on. 617 00:39:33,360 --> 00:39:36,120 Speaker 1: I know I did. It's been a long time 618 00:39:36,160 --> 00:39:41,399 Speaker 1: since I've taken calculus, for example. But they are 619 00:39:41,440 --> 00:39:46,839 Speaker 1: incredibly important papers, so if you want to learn more, 620 00:39:46,920 --> 00:39:50,360 Speaker 1: you can seek those out. I'm still looking for a 621 00:39:50,480 --> 00:39:55,000 Speaker 1: really good biography about von Neumann. I've read a couple, 622 00:39:55,600 --> 00:39:58,160 Speaker 1: and neither of the ones I've read 623 00:39:58,640 --> 00:40:02,600 Speaker 1: has been exactly what I'm looking for. So if you 624 00:40:02,719 --> 00:40:05,720 Speaker 1: happen to know of a really good biography about von Neumann, 625 00:40:05,840 --> 00:40:08,520 Speaker 1: I'm interested to learn more about that too, so hit 626 00:40:08,600 --> 00:40:10,920 Speaker 1: me up. You can go to our website, that is 627 00:40:11,280 --> 00:40:15,279 Speaker 1: tech Stuff podcast dot com. There you're gonna find all 628 00:40:15,320 --> 00:40:17,760 Speaker 1: the different ways to contact me, and you'll learn more about 629 00:40:17,800 --> 00:40:21,080 Speaker 1: the show. You can visit our merchandise store that's at 630 00:40:21,120 --> 00:40:23,920 Speaker 1: t public dot com slash tech Stuff. That's where you're 631 00:40:23,920 --> 00:40:29,400 Speaker 1: gonna find links to things like T-shirts, tote bags, stickers, 632 00:40:29,560 --> 00:40:32,680 Speaker 1: coffee mugs, all with different designs on them. We've got 633 00:40:32,719 --> 00:40:34,759 Speaker 1: some great ones up there, go check them out. Every 634 00:40:34,760 --> 00:40:36,600 Speaker 1: purchase you make goes to help the show. We greatly 635 00:40:36,640 --> 00:40:40,520 Speaker 1: appreciate it, and I'll talk to you again really 636 00:40:40,520 --> 00:40:48,120 Speaker 1: soon. For more on this and thousands of other topics, 637 00:40:48,160 --> 00:40:59,480 Speaker 1: visit how stuff works dot com.