Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. It is a Friday, so it's time for a TechStuff classic episode. This episode originally published August sixth, two thousand fourteen, and it's one about an important person in tech, an important person who, at least at the time, not a lot of people outside of certain tech spheres really knew a lot about. So this episode is titled Who Was Claude Shannon? The father of information theory, right, also known as the father of the electronic communication age, and his full name: Claude Elwood Shannon. Very important person. He's been compared to, you know, some pretty impressive people, like Einstein. Yeah, Einstein being one of them. And you might say, well, whoa, you know, Einstein? Like, Einstein's name has become synonymous with just the concept of genius, to the point where we use it in phrases where we're being, you know, a little condescending.
Speaker 1: Yeah, "way to go, Einstein," that kind of thing. But as you'll see when we go through this episode and explain what Claude Shannon did and his contributions to technology, as well as just kind of his wacky personality, you'll really kind of see how that applies. So exactly who was he, and what did he do? When was this guy born? He was born in nineteen sixteen in Petoskey, Michigan. Yeah. Yeah, his father was a probate judge and his mother was a high school principal. He also did have some mildly famous family. A very distant cousin of his kind of made a name for himself, yeah, for killing an elephant with electricity: Thomas Edison. He did a few other things too. Yeah, that's the requisite dig from the internet. Thomas Edison obviously did many, many important things, some of them not remotely involving putting an animal to death with electricity. Yeah, the large majority of which. You kill an elephant once... Yeah, I know, it just sticks with you, right? Well, as a boy, Claude Shannon became interested in electronics and began experimenting with different stuff.
Speaker 1: He was just curious about how things work and how to build them himself. He built a working model of an airplane. Pretty impressive. I mean, he was born in nineteen sixteen; you hadn't had airplanes for very long. They were pretty new. Yeah, they were brand new back in the early twentieth century. And he also reportedly made a working telegraph system that they set up between his bedroom and a friend's bedroom. His friend lived half a mile away, and it was all made out of fencing wire. Yeah, well, I mean, the wire itself was. Yeah, he could actually end up sending messages to his friend half a mile away. He was also really into radio circuits and built a radio-controlled model boat. Yeah, so very much interested. Yeah, yeah, this is the growing world of radio technology and the growing world of communications technology. So he was interested in it as a kid. Now, a little bit later on, when he was a teenager, he got work as a basic mechanic in a drugstore, running a fix-it shop in a drugstore, because that was like the center of town.
Speaker 1: Yeah, where you go and get your chocolate malt and your fan fixed. You know, it's a one-stop shop. He attended the University of Michigan in Ann Arbor, where he studied mathematics and electrical engineering. He graduated in nineteen thirty-six and then went on to enroll in graduate-level study at the Massachusetts Institute of Technology. And he decided upon MIT because he saw this work-study ad, like, pinned onto a physical bulletin board on his college campus, that was advertising for someone interested in working on Vannevar Bush's differential analyzer, which was an analog computer that used these physical mechanical connections to make calculations. The deal here was that he would spend half his time working towards his degree and the other half in the lab with Bush, who was then MIT's vice president and also their dean of engineering. So this was kind of, sort of, a big deal. And this machine was huge. It was this system of gears and pulleys and rods that calculated with an entire range of values that were based on the physical rotation of the rods.
Speaker 1: And you could program it by physically rearranging all of these mechanical bits to correspond with different equations. The control circuit, I mean, this is how early this was in computing technology, the control circuit itself was a system of some hundred electromagnetic switches. Yeah, this is kind of the evolution of what Charles Babbage created way back in the day, the Difference Engine. So, TechStuff has done episodes about Ada Lovelace, who was the first computer programmer. She kind of saw that computers could be things that could do more than just crunch numbers; they could analyze any kind of data. Yeah, they could represent stuff that isn't numbers as numbers. She had this brilliant idea of, oh, a computer might be able to represent something like a piece of music and be able to, you know, replicate it in some way, years and years ahead of her time. And the computers of those days were these giant analog, actual machines. Yeah, sometimes man-powered. Sometimes they had this electromechanical element to it. So we're predating the time of the electronic computer at this point.
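For the technically curious: the differential analyzer solved differential equations by chaining mechanical integrators, shafts whose accumulated rotation represented an integral. A rough software analogue of that idea (purely an illustration of the math it embodied, not a model of the actual machine) might look like this:

```python
# A rough software analogue of what Bush's differential analyzer did
# mechanically: solve a differential equation by accumulating many
# tiny increments (the machine accumulated tiny rotations). This
# models y'' = -y, whose exact solution starting at y = 1, y' = 0
# is cos(t). Illustrative sketch only, not the real mechanism.

import math

def analyze(t_end, dt=1e-4):
    y, v = 1.0, 0.0          # position and velocity "shaft" values
    t = 0.0
    while t < t_end:
        a = -y               # the equation being "wired up": y'' = -y
        v += a * dt          # first integrator stage
        y += v * dt          # second integrator stage
        t += dt
    return y

print(analyze(math.pi))      # the exact answer would be cos(pi) = -1
```

The point of the sketch is that the "program" is the wiring of one stage's output into the next stage's input, which is exactly what rearranging the machine's rods and gears accomplished.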
Speaker 1: So, as Claude Shannon began to work on this machine, you know, now that he had enrolled at MIT, he noticed something interesting. He saw that the switches corresponded with a concept he had started studying first as an undergraduate, and that he was now really focusing on, which was symbolic logic. Now, I took symbolic logic in college. I loved it, because the basic idea of symbolic logic is you reduce logical statements to mathematical statements. Actually, I took a similar class. It was basically the least mathematical math class I could get away with as an English major. Well, the neat thing about it is that if you could prove that it mathematically made sense, then you could say that the statement is true, right? Exactly. So you could start to listen to your friends argue, sketch it out, and then say, look, here's where you went wrong. But at any rate, while he was at MIT, he started really studying the work of a thinker named George Boole, who was from the nineteenth century.
Speaker 1: Back in eighteen fifty-four, George Boole published An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities, sometimes known as The Laws of Thought. We usually shorten it to just Laws of Thought. So this discussion about the mathematical theories of logic had Boole using algebraic equations to represent logical forms and syllogisms, which is exactly what, you know, I experienced when I was in college. In this work, he also said that the only idempotent numbers, which are numbers that can be put through a certain operation multiple times without changing the result, are zero and one. For example, one times one equals one, and no matter how many times you multiply by one, it will always be one. Right. So if you take the product of that equation and then multiply it by itself, you still stay with one. Same thing with zero, although also with zero you can add and subtract and still end up with zero. So zero plus zero is zero, zero times zero is zero. So Boole used zero and one for the values of the symbols.
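Boole's idempotence observation is easy to verify for yourself. Solving x squared equals x gives x times (x minus 1) equals zero, so only zero and one qualify; a quick check over a range of candidates (our illustration, not anything from Boole's book):

```python
# Boole's observation: the only numbers unchanged by squaring
# (x * x == x, i.e. idempotent under multiplication) are 0 and 1.
# Algebraically, x*x = x means x*(x - 1) = 0, so x is 0 or 1.

idempotent = [x for x in range(-5, 6) if x * x == x]
print(idempotent)  # [0, 1]
```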
Speaker 1: In his algebraic logic, he said an argument held in logic if, when reduced to an algebraic equation, it held in common algebra with the zero-one restriction on the possible interpretations of the symbols, meaning that if you could replace the symbols with a zero or a one and it still made sense, it still worked, then it held true. So Claude Shannon looked at this and he was thinking, this is a really cool idea. I love this approach to logic. And hey, you know, a switch has two positions, on and off, so, sort of like a one and a zero. Yeah, I mean, what if we were to, you know, kind of, oh, play with that whole switch process? And that became something that percolated in the back of his head for a while. In fact, it percolated so long that people suspect that he had fully formed this whole idea of applying Boolean logic to electronic devices for years before writing it down. And once he wrote it out and presented it... well, we'll get there. We'll get there.
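Before we do get there, the switch idea he was toying with can be sketched in modern code (our illustration, not Shannon's own notation). Treating a closed switch as 1 and an open switch as 0, switches wired in series behave like AND, and switches wired in parallel behave like OR:

```python
# Shannon's insight, sketched in software: with a closed switch as 1
# and an open switch as 0, switches in series act as AND (current
# flows only if both are closed), switches in parallel act as OR
# (either path completes the circuit), and a relay wired to open
# when energized acts as NOT. Illustrative only.

def series(a, b):    # AND: both switches must be closed
    return a & b

def parallel(a, b):  # OR: either switch completes the circuit
    return a | b

def invert(a):       # NOT: relay opens when energized
    return 1 - a

# Truth table for the series (AND) arrangement:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, series(a, b))
# Only the row where both switches are closed passes current.
```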
Speaker 1: I also do want to note that around this time, Shannon became interested in juggling, I think originally for, like, physical mathematical purposes. He started showing up at the MIT Juggling Club (juggling club, I see what you did there) and asking some of its members if he could, like, measure their juggling, and thereby sort of got involved with them. And this would be a lifelong interest, as we will get into a little bit later on. A little bit of trivia: a certain podcaster by the name of Jonathan Strickland was a founding member of the University of Georgia Juggling Club. So that's about all I share in common with him. I loved symbolic logic and I enjoyed juggling. There the comparison ends, for he was far more intelligent than I can ever hope to aspire to. But yeah... it's fine, sorry, man. It's fine, I have come to grips with it. Okay. If you told me, hey, Jonathan, you're never going to be as smart as, say, Claude Shannon or Albert Einstein, that's all right; most people won't be, so, I guess.
Speaker 1: Claude Shannon writes a thesis applying Boole's approach to circuitry, by equating the zero-one restriction with the off and on positions of a switch within a circuit. He was twenty-two years old. This had never been done; this was the first time anyone had ever said this, certainly out loud. And other thinkers have said that it would have taken decades for anyone else to have come to this kind of conclusion. Right, we could have been sort of groping around with other approaches for years before someone came up with this particular version. And not only did he come up with this idea, but the way he presented it in his thesis was very elegant, and he would expand upon it a little bit later, to the point where people said, this is why he gets compared to Einstein. It's like Einstein saying not just "I figured out this one component to how the universe works," but being able to express it elegantly and have a whole picture. Right, like, it's not just a fact, it's a whole host of facts that all support one another.
Speaker 1: And it's like, they say, it's like you came up with a fundamental theory of science and unfolded it all at once, just so. His thesis also laid out how logical functions such as AND, OR, and NOT could be implemented within a physical circuit, so, the building of logic gates. Now, keep in mind, this is all in a hypothetical slash theoretical approach, right? It's not like he was... he wasn't building this, mechanically or electronically. That's the case, maybe... exactly, yeah. He was laying out how this could be possible, not actually building them himself. Claude Shannon leaves MIT after earning a doctorate in mathematics to teach for one year at Princeton. And here's a story that has a couple of different... that has some alternate endings. We will present you with the two that we know of. But the story goes that he was teaching at Princeton, and while he was teaching a class, he was holding a lecture.
Speaker 1: Albert Einstein himself opened the door and stepped inside, and Claude Shannon kept going on with the lecture, but obviously was very much impressed with the fact that this genius had walked into his classroom. He sees Einstein bend over and whisper something to one of the students in the back. He sees that the student replies, and then he sees that Einstein quietly leaves the room. He continues on with his lecture. At the end of the lecture, he holds the student back and with great anticipation asks the student, what did this brilliant man have to say about my lecture? And my version of the story was that Einstein had very quietly asked the student, where are they currently serving tea? I've heard that he asked where the men's room was. So maybe "where are they currently allowing you to pee" could possibly have been it. At any rate, apparently that became one of Claude Shannon's favorite stories. He would love to tell the story about how Albert Einstein walked into his classroom and asked something completely not connected with what he had to say, and that, like, it just tickled him.
Speaker 1: It tickled him. And I thought, well, that also tells you a lot about his personality, that he did not take himself too seriously. In nineteen forty-one he joined a company famous for its research and development, Bell Telephone Labs, and his work mostly focused on things that had to do with the war effort, and in this case that's World War Two. It included anti-aircraft devices that could calculate and target counter-missiles, which came pretty seriously in handy during the German blitz on England. Yeah. Yeah, it turns out if your enemy is blasting you with missiles, counter-missiles are a high priority. He also got to work in cryptography. So here's something where he's got, you know, a connection with people like Alan Turing, who was working on cracking the Enigma machine back over in England. Now, Claude Shannon was designing devices used by Allied powers to send messages back and forth, so he was looking at keeping Allied messages safe, rather than cracking German messages, or Axis power messages. He later wrote a paper, Communication Theory of Secrecy Systems, which according to MIT is generally credited with transforming cryptography from an art to a science. It was a mathematical proof that an encryption scheme called the one-time pad, or the Vernam cipher, is unbreakable. And that cipher is the basic idea of encoding a message with a random series of digits, a key, as we have talked about on the show before, which both parties communicating have a copy of. But you know, this is a very simple concept in cryptography. But having the mathematical proof that it is in fact unbreakable, if the system is used correctly, well, then that's really awesome. And when we talked about the Enigma machine, that was one of those systems that could have been unbreakable had people actually been able to follow the rules properly. But two things really fell apart for the Enigma machine, and I know this is a bit of a tangent, but it relates to this.
Speaker 1: First, the Enigma machine was designed so that the same letter would never light up as the letter that you had pressed. So knowing that meant that you could remove one variable from all the possible outcomes. Secondly, people were not as careful with their log books, with their code books, as they needed to be, and that led to the code being broken. But everyone seems to agree that had the Germans, had the Axis powers, been incredibly careful, then that would have been an unbreakable code. Of course, in times of war, you can't really... human error being what it is. Yeah, I mean, that's the difference between the ideal and reality. Meanwhile, Claude Shannon began to develop theories on how to apply his ideas about Boolean logic and circuitry to telephone switching lines. We have more episode to go, but first let's take a quick break.

Speaker 1: In the meantime, something else not involving Claude Shannon happened at Bell Labs: the development of the transistor. Now, the transistor was a huge breakthrough.
Speaker 1: It meant that the world of electronics could move away from things like vacuum tubes and allow this other device to take their place, essentially, which ultimately led to the miniaturization of electronics. But it wouldn't be until Claude Shannon published his concepts about information theory that the transistor would become the functional item that it did. Yeah. Yeah, it was really this idea of digitizing information that Shannon had that made this a practical device, especially given that early transistor. It's enormous, if you ever see a picture of it. I mean, if you consider that billions of transistors can now fit on a microprocessor chip and then you look at the first one, it's an enormous difference, obviously. Now, this idea of digitizing information was pretty much what would allow the transistor to become useful, and also it's what would lead to things like encoding information onto storage media, like a compact disc. This is what would make not just processing data possible, but storing it.
Speaker 1: Yeah, and right, it's kind of a really beautiful coincidence that both of these technologies were being developed at Bell Labs within a year of each other, as it turns out, because nineteen forty-eight is when Claude Shannon actually published his paper, A Mathematical Theory of Communication. Yes, and that's available in PDF form; we'll share the link, because you can actually read his paper on information theory. And this is the one that I said earlier that, you know, people who are information theory experts, they say, like, this is like Einstein coming out with the theories of relativity. This idea of a complete picture, not just an idea, but a complete picture of an approach that laid the groundwork for digitizing information so it can be transmitted and stored. Now, again, he was a theorist. He did not build this. He explained how it is mathematically possible, right? And so it left it up to engineers and computer scientists to figure out, okay, if this is theoretically possible, how do we make it real? What do we do to actually put this stuff into reality and have it work for us?
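For readers who want a small taste of what's in that paper: one of its central quantities is the entropy of a source, the average number of bits needed per symbol. A minimal sketch of that calculation (the example probabilities are ours, not from the paper):

```python
# Shannon entropy, H = -sum(p * log2(p)): the average number of bits
# per symbol needed to encode a source whose symbols occur with the
# given probabilities. Example values chosen for illustration.

import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss...
print(entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries much less.
print(entropy([0.9, 0.1]))   # about 0.469
```

That second number is the kind of result that makes compression possible: a predictable source needs fewer bits than its raw symbol count suggests.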
Now, that's when 321 00:19:26,000 --> 00:19:29,200 Speaker 1: it was published, but there are people who have looked 322 00:19:29,200 --> 00:19:31,320 Speaker 1: into Claude Shannon's life who say that he may have 323 00:19:31,440 --> 00:19:35,040 Speaker 1: had this fully formed as early as nineteen forty three, and 324 00:19:35,080 --> 00:19:36,840 Speaker 1: he thought that it was a really cool idea, but 325 00:19:36,920 --> 00:19:39,199 Speaker 1: just didn't think, you know, no one else is going 326 00:19:39,240 --> 00:19:42,760 Speaker 1: to care about this. I would argue, I mean, 327 00:19:42,800 --> 00:19:44,640 Speaker 1: from what I've read, it sounded to me more 328 00:19:44,680 --> 00:19:46,840 Speaker 1: like he kind of had it brewing and just didn't 329 00:19:46,840 --> 00:19:49,760 Speaker 1: want to present it until it was done. He did 330 00:19:49,880 --> 00:19:52,119 Speaker 1: seem like the kind of person who wanted to 331 00:19:52,160 --> 00:19:56,840 Speaker 1: make sure that he had as complete a picture of 332 00:19:56,960 --> 00:19:59,840 Speaker 1: an idea as possible before presenting it to anyone else. 333 00:19:59,840 --> 00:20:03,200 Speaker 1: He did not want to have the experience of coming 334 00:20:03,320 --> 00:20:06,920 Speaker 1: forward with just half an idea. So yeah, he's kind 335 00:20:06,920 --> 00:20:10,760 Speaker 1: of a perfectionist in that sense. And it really is 336 00:20:11,040 --> 00:20:15,240 Speaker 1: a challenge to explain to an average person exactly 337 00:20:15,280 --> 00:20:19,040 Speaker 1: how important this theory was. But you know, in a 338 00:20:19,080 --> 00:20:21,119 Speaker 1: practical sense, at the time that he was 339 00:20:21,160 --> 00:20:23,480 Speaker 1: coming up with this, it was necessary to create a 340 00:20:23,480 --> 00:20:27,479 Speaker 1: better telephone system.
So in the old analog telephone system, 341 00:20:27,720 --> 00:20:31,520 Speaker 1: you've got some pretty big limitations, some barriers you've 342 00:20:31,520 --> 00:20:34,520 Speaker 1: got to get across due to signal loss or noise. 343 00:20:34,880 --> 00:20:38,320 Speaker 1: An analog telephone signal gets weaker the longer the 344 00:20:38,320 --> 00:20:41,280 Speaker 1: telephone line it's traveling along is. Yeah, so in order 345 00:20:41,320 --> 00:20:44,480 Speaker 1: to get around that, engineers would place amplifiers along a 346 00:20:44,520 --> 00:20:46,760 Speaker 1: telephone line to boost the signal. So you get a 347 00:20:46,760 --> 00:20:49,399 Speaker 1: weak signal coming in, it goes through the amplifier, the 348 00:20:49,440 --> 00:20:53,240 Speaker 1: signal is boosted, it's stronger going out. But unfortunately, 349 00:20:53,520 --> 00:20:55,720 Speaker 1: along with the signal that you want to get boosted, 350 00:20:55,760 --> 00:20:58,240 Speaker 1: all of the noise that's on the line also gets boosted. 351 00:20:58,320 --> 00:21:00,800 Speaker 1: So eventually, I mean, just 352 00:21:00,840 --> 00:21:03,280 Speaker 1: the noise takes over. Yeah, yeah, you lose the signal 353 00:21:03,359 --> 00:21:05,000 Speaker 1: in the noise. So that would be, you know, if 354 00:21:05,000 --> 00:21:08,959 Speaker 1: you've ever heard one of those telephone conversations 355 00:21:09,000 --> 00:21:12,360 Speaker 1: that goes on in an old movie where it's just 356 00:21:12,440 --> 00:21:15,280 Speaker 1: like all you hear is crackle. Yeah, just imagine that 357 00:21:15,320 --> 00:21:17,760 Speaker 1: if you're far enough away that all you would get 358 00:21:17,800 --> 00:21:19,640 Speaker 1: was the static, you would not get any voice at all. 359 00:21:20,040 --> 00:21:24,120 Speaker 1: So.
The interesting thing was that by switching from 360 00:21:24,200 --> 00:21:28,320 Speaker 1: analog signals to digital signals, they didn't have to worry 361 00:21:28,400 --> 00:21:31,840 Speaker 1: about this signal boosting problem. Instead of a continuous signal 362 00:21:31,880 --> 00:21:34,280 Speaker 1: like a sine wave, which is, you know, an acoustic wave, 363 00:21:34,440 --> 00:21:37,359 Speaker 1: which is what you would get with an analog telephone line, 364 00:21:38,240 --> 00:21:40,840 Speaker 1: digital signals are sent in a series of bits, and 365 00:21:40,880 --> 00:21:43,160 Speaker 1: a bit is either a zero or a one. That's 366 00:21:43,240 --> 00:21:46,720 Speaker 1: all based off of Claude Shannon's application of Boolean algebra 367 00:21:46,840 --> 00:21:50,720 Speaker 1: to electronics, and it worked. So you could do this 368 00:21:50,800 --> 00:21:53,120 Speaker 1: with telephones, which was great, but it meant you could 369 00:21:53,119 --> 00:21:55,199 Speaker 1: also do it with just about any other kind of 370 00:21:55,240 --> 00:22:00,919 Speaker 1: information transfer, from radio to telegraph to telephones, everything. And again, 371 00:22:01,000 --> 00:22:03,040 Speaker 1: this was one of those things that could not immediately 372 00:22:03,080 --> 00:22:05,840 Speaker 1: be implemented. The engineers had to build the technology to support it. 373 00:22:06,440 --> 00:22:09,840 Speaker 1: But once they did, they realized, we can build out 374 00:22:09,960 --> 00:22:14,480 Speaker 1: a nationwide telephone, even a global telephone system, that doesn't 375 00:22:14,480 --> 00:22:18,600 Speaker 1: require amplifiers every x number of miles, because you're never 376 00:22:18,640 --> 00:22:22,680 Speaker 1: going to lose that signal clarity. Like, hypothetically, 377 00:22:22,800 --> 00:22:26,480 Speaker 1: you can do this with literally zero loss in quality.
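The noise advantage described here can be sketched in a few lines of Python. This is only an illustration with made-up numbers: an analog value keeps whatever noise it picks up along the way, while a bit only has to stay on the right side of a decision threshold to be recovered exactly.

```python
# Illustrative only: why a thresholded bit survives noise that would
# permanently corrupt an analog value. All numbers here are made up.
def transmit_analog(value, noise):
    # Analog: the noise is added to the signal and simply stays there.
    return value + noise

def transmit_digital(bit, noise):
    # Digital: send 0.0 or 1.0, then decide by threshold at the receiver.
    received = float(bit) + noise
    return 1 if received >= 0.5 else 0

print(transmit_analog(0.5, 0.25))   # analog value is now just wrong: 0.75
print(transmit_digital(1, -0.25))   # bit recovered exactly: 1
print(transmit_digital(0, 0.25))    # bit recovered exactly: 0
```

As long as the noise stays smaller than the decision margin, the digital receiver recovers the transmitted bit perfectly, which is the property the hosts are describing.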
378 00:22:26,560 --> 00:22:29,440 Speaker 1: So long as you don't mind taking the necessary 379 00:22:29,440 --> 00:22:31,960 Speaker 1: amount of time for each bit to be transferred. Really, 380 00:22:32,000 --> 00:22:35,040 Speaker 1: the transfer speed is the only cap that you're working 381 00:22:35,040 --> 00:22:38,520 Speaker 1: with at this juncture, exactly. And Claude Shannon kind 382 00:22:38,560 --> 00:22:40,840 Speaker 1: of came up with that too. He said, you know, 383 00:22:41,680 --> 00:22:44,960 Speaker 1: if we have an infinite amount of time, you'll 384 00:22:44,960 --> 00:22:50,960 Speaker 1: have zero signal loss, but that any medium of transmission 385 00:22:51,080 --> 00:22:54,440 Speaker 1: is ultimately going to have a cap on how much 386 00:22:54,520 --> 00:22:57,720 Speaker 1: data it can carry within a given 387 00:22:57,760 --> 00:23:01,399 Speaker 1: amount of time. So it was interesting, because that was 388 00:23:01,440 --> 00:23:03,840 Speaker 1: one of those things that ended up becoming a challenge 389 00:23:03,880 --> 00:23:08,399 Speaker 1: to engineers. He said, look, for whatever medium you choose, 390 00:23:09,040 --> 00:23:11,720 Speaker 1: and it's specific to each medium, you're going to 391 00:23:11,800 --> 00:23:14,359 Speaker 1: have this limit that you're going to hit and you 392 00:23:14,400 --> 00:23:16,920 Speaker 1: can't go beyond it. And the engineers said, all right, 393 00:23:16,920 --> 00:23:19,560 Speaker 1: we agree, there's no way we can go beyond that limit. 394 00:23:19,640 --> 00:23:21,719 Speaker 1: So our goal is to get as close 395 00:23:21,760 --> 00:23:24,760 Speaker 1: to that limit as we possibly can. And this 396 00:23:24,880 --> 00:23:29,080 Speaker 1: also led into some really interesting side concepts about digital 397 00:23:29,119 --> 00:23:34,280 Speaker 1: compression and error correction. Yeah, exactly.
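The cap being described is usually written as the Shannon-Hartley formula: for a channel with bandwidth B in hertz and signal-to-noise ratio S/N, the capacity is C = B log2(1 + S/N) bits per second. A quick sketch in Python, with illustrative numbers roughly in the range of a classic analog phone line:

```python
# The channel capacity limit: no error-free data rate can exceed
# C = B * log2(1 + S/N) bits per second. Numbers below are illustrative.
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # Maximum error-free data rate in bits per second for this channel.
    return bandwidth_hz * math.log2(1 + snr_linear)

# About 3 kHz of bandwidth and 30 dB SNR (a linear factor of 1000):
c = shannon_capacity(3000, 1000)
print(round(c))  # roughly 29,900 bits per second
```

This is exactly the number engineers chase: a 56k modem on a real phone line is pushing close to this ceiling, and nothing can push past it.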
You 398 00:23:34,320 --> 00:23:38,680 Speaker 1: could end up compressing data into smaller data packages, which 399 00:23:38,720 --> 00:23:42,080 Speaker 1: helps you get around that bandwidth cap. But in order 400 00:23:42,119 --> 00:23:43,959 Speaker 1: to do that, you also have to have that 401 00:23:44,119 --> 00:23:48,000 Speaker 1: error correction software, those algorithms that are able to 402 00:23:48,160 --> 00:23:52,040 Speaker 1: detect and fix any errors that come across while 403 00:23:52,040 --> 00:23:56,320 Speaker 1: you're transmitting this information. These were all laid out in his ideas, 404 00:23:56,520 --> 00:23:59,960 Speaker 1: and that error correction concept also ties back 405 00:24:00,160 --> 00:24:03,840 Speaker 1: into the idea that, you know, if you scratch 406 00:24:03,880 --> 00:24:08,040 Speaker 1: a CD, it can still be read. Yeah, yeah, 407 00:24:08,080 --> 00:24:10,960 Speaker 1: because you have these extra bits that are built into 408 00:24:11,080 --> 00:24:14,920 Speaker 1: the data itself, these bits that otherwise would seem superfluous. 409 00:24:14,920 --> 00:24:17,960 Speaker 1: They're not necessary for you to have the full message, 410 00:24:18,000 --> 00:24:21,840 Speaker 1: but those extra bits actually allow some redundancy. So if 411 00:24:21,840 --> 00:24:24,760 Speaker 1: there is some damage to the physical medium, you can 412 00:24:24,800 --> 00:24:27,359 Speaker 1: still end up using it. And it's not like you 413 00:24:27,400 --> 00:24:30,640 Speaker 1: get a smudge on your disc and now 414 00:24:30,720 --> 00:24:33,240 Speaker 1: you can't use it. Right.
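The redundancy idea can be shown with the simplest possible error-correcting code: repeat every bit three times and let the receiver take a majority vote. Real CDs use far more efficient Reed-Solomon codes, so this toy version only illustrates the principle that extra bits buy tolerance for damage.

```python
# A three-times repetition code: the crudest error-correcting code.
def encode(bits):
    # Each bit becomes three copies: [1, 0] -> [1, 1, 1, 0, 0, 0]
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # Majority vote over each group of three repeated bits.
    out = []
    for i in range(0, len(coded), 3):
        group = coded[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)
    return out

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] = 1 - sent[4]           # the "scratch": flip one transmitted bit
print(decode(sent) == message)  # True -- the damage is corrected
```

The cost is obvious, three times the bits for the same message, which is why practical codes like Reed-Solomon spend their redundancy far more cleverly.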
The concept of a disc 415 00:24:33,320 --> 00:24:35,560 Speaker 1: also being new, because that was something that he laid 416 00:24:35,560 --> 00:24:37,679 Speaker 1: out in here, saying that this is a method for 417 00:24:37,880 --> 00:24:41,840 Speaker 1: possible storage, not just transmission, but also storage. Yeah, so 418 00:24:41,840 --> 00:24:46,239 Speaker 1: big, big ideas. At any rate, moving on 419 00:24:46,320 --> 00:24:48,240 Speaker 1: with his life, I mean, he's already gotten 420 00:24:48,280 --> 00:24:50,359 Speaker 1: to the point where he's laid out everything that's going 421 00:24:50,400 --> 00:24:54,080 Speaker 1: to lead to things like JPEGs, MP threes, ZIP files, 422 00:24:54,280 --> 00:24:58,720 Speaker 1: data transmission across cable, across telephone lines. All of this 423 00:24:58,840 --> 00:25:01,959 Speaker 1: stuff is possible because of the ideas he came up with. 424 00:25:03,320 --> 00:25:06,240 Speaker 1: His life continues on, and in nineteen forty nine he 425 00:25:06,359 --> 00:25:10,800 Speaker 1: marries Mary Elizabeth Moore, Betty. She was a 426 00:25:10,960 --> 00:25:14,080 Speaker 1: numerical analyst at Bell Labs, and they would go on 427 00:25:14,160 --> 00:25:17,919 Speaker 1: to have two children together. And he also, during his 428 00:25:18,240 --> 00:25:21,920 Speaker 1: time off from changing the world, decided to build 429 00:25:21,960 --> 00:25:24,280 Speaker 1: a simple computer to play chess, and he wrote a 430 00:25:24,520 --> 00:25:28,679 Speaker 1: paper about programming computers and computer chess algorithms. A lot 431 00:25:28,720 --> 00:25:33,280 Speaker 1: of chess-playing computers are still based upon 432 00:25:33,800 --> 00:25:36,680 Speaker 1: the foundations that he laid out while he was working 433 00:25:36,680 --> 00:25:39,320 Speaker 1: on this.
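Shannon's chess paper proposed searching a game tree and scoring positions with an evaluation function, assuming each side plays its best, which is the minimax idea. A minimal sketch over a hypothetical hand-built two-ply tree (the scores are made up for illustration):

```python
# Minimax over a tiny hand-built game tree. A node is either a number
# (an evaluation score) or a list of child nodes (possible moves).
def minimax(node, maximizing):
    if isinstance(node, (int, float)):
        return node  # leaf: the evaluation function has already scored it
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Two moves for us, each answered by two replies from the opponent:
tree = [[3, 5],   # our move A: opponent picks min(3, 5) = 3
        [2, 9]]   # our move B: opponent picks min(2, 9) = 2
print(minimax(tree, True))  # 3 -- move A is the safer choice
```

Real chess engines add pruning and much deeper search on top of this, but the backbone, alternate max and min down the tree, is what Shannon's paper laid out.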
You find that Claude Shannon in 434 00:25:39,359 --> 00:25:42,240 Speaker 1: his spare time often did things that most of 435 00:25:42,280 --> 00:25:43,840 Speaker 1: us would be like, well, you could have a full 436 00:25:43,840 --> 00:25:45,480 Speaker 1: time job doing that. He's like, no, I just want 437 00:25:45,520 --> 00:25:48,760 Speaker 1: to do that, you know, I'd like to keep my 438 00:25:48,800 --> 00:25:52,560 Speaker 1: hand in. Around that time, engineers at Bell Labs, at 439 00:25:52,600 --> 00:25:55,040 Speaker 1: that time being nineteen forty nine, began to actually create the 440 00:25:55,080 --> 00:25:58,760 Speaker 1: technology that implemented Shannon's ideas, and they built something called 441 00:25:58,840 --> 00:26:03,160 Speaker 1: a regenerative repeater. And the idea was that a bit 442 00:26:03,400 --> 00:26:06,600 Speaker 1: could be regenerated perfectly and repeatedly as long as the 443 00:26:06,640 --> 00:26:09,399 Speaker 1: bits weren't quote unquote too small. So as long as 444 00:26:09,440 --> 00:26:15,880 Speaker 1: the messages weren't too small, they could consistently regenerate a message. 445 00:26:15,920 --> 00:26:18,200 Speaker 1: And that would mean that you would again have no 446 00:26:18,520 --> 00:26:21,480 Speaker 1: signal loss. You wouldn't lose any data in the process, 447 00:26:21,560 --> 00:26:24,119 Speaker 1: because just as quickly as it was 448 00:26:24,160 --> 00:26:28,520 Speaker 1: coming into the regenerative repeater, it would send 449 00:26:28,760 --> 00:26:32,440 Speaker 1: a copy of the same data message back out again. 450 00:26:32,520 --> 00:26:35,080 Speaker 1: Also, around this time, as the engineers at Bell 451 00:26:35,200 --> 00:26:40,719 Speaker 1: Labs were creating that physical technology to incorporate Shannon's ideas, 452 00:26:40,800 --> 00:26:43,520 Speaker 1: he started to introduce the idea of bandwidth limits.
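The difference between an amplifier chain and a chain of regenerative repeaters can be sketched like this (noise values are made up): the amplifier carries every hop's noise forward, while each repeater re-decides the bit and sends out a clean copy.

```python
# Illustrative sketch: amplification accumulates noise across hops,
# regeneration wipes it out at every hop. Noise figures are made up.
import random

random.seed(1)

def amplify_chain(signal, hops, noise=0.2):
    # Analog chain: every hop adds noise that all later hops carry along.
    for _ in range(hops):
        signal += random.uniform(-noise, noise)
    return signal

def regenerate_chain(bit, hops, noise=0.2):
    # Digital chain: each repeater thresholds, then resends a fresh 0 or 1.
    for _ in range(hops):
        received = float(bit) + random.uniform(-noise, noise)
        bit = 1 if received >= 0.5 else 0
    return bit

print(regenerate_chain(1, hops=1000))  # still exactly 1 after 1000 hops
print(amplify_chain(1.0, hops=1000))   # the analog value has drifted
```

Because the per-hop noise here never crosses the 0.5 decision threshold, the regenerated bit comes through perfectly no matter how many hops you add, which is the point the hosts are making about building out a nationwide system.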
Yeah, 453 00:26:43,560 --> 00:26:45,280 Speaker 1: this is what I was talking about, when he said, 454 00:26:45,440 --> 00:26:48,120 Speaker 1: it doesn't matter what medium you're using, eventually you're going 455 00:26:48,160 --> 00:26:53,399 Speaker 1: to hit that capacity. And eventually they started calling this 456 00:26:53,520 --> 00:26:57,200 Speaker 1: the Shannon capacity or Shannon limit. So it was again 457 00:26:57,240 --> 00:27:00,399 Speaker 1: a very important idea that ended up playing a 458 00:27:00,480 --> 00:27:03,159 Speaker 1: huge role in the telecommunications industry, as well as just 459 00:27:03,280 --> 00:27:06,800 Speaker 1: electronics and computing in general. So this is what 460 00:27:07,000 --> 00:27:09,439 Speaker 1: gives engineers that goal. This is where they want to 461 00:27:09,520 --> 00:27:12,120 Speaker 1: hit, as close to that number as they possibly can, 462 00:27:12,440 --> 00:27:15,320 Speaker 1: to maximize the amount of data they can shove through 463 00:27:15,680 --> 00:27:18,840 Speaker 1: any particular medium at top speed. So, you know, we 464 00:27:18,920 --> 00:27:23,200 Speaker 1: often talk about data transmission speeds, but speed is really 465 00:27:23,400 --> 00:27:27,120 Speaker 1: kind of a deceptive term, because it's not just how 466 00:27:27,200 --> 00:27:30,040 Speaker 1: fast something gets from point A to point B. Usually 467 00:27:30,119 --> 00:27:32,720 Speaker 1: we're talking about speeds that are approaching the speed of light. 468 00:27:33,320 --> 00:27:36,560 Speaker 1: That's really fast. What we're really concerned with 469 00:27:36,720 --> 00:27:39,520 Speaker 1: is throughput, which is the amount of data that can 470 00:27:39,560 --> 00:27:41,879 Speaker 1: travel at that speed to get from point A to 471 00:27:41,960 --> 00:27:44,560 Speaker 1: point B.
Because if you're dividing that data up into 472 00:27:44,640 --> 00:27:48,120 Speaker 1: lots of bits, like a long string, yes, each 473 00:27:48,160 --> 00:27:50,040 Speaker 1: individual bit is moving at the speed of light, but 474 00:27:50,080 --> 00:27:52,640 Speaker 1: you've still got to get that whole string through. Yeah. Yeah, 475 00:27:52,640 --> 00:27:54,719 Speaker 1: it's the, you know, getting the caboose through at 476 00:27:54,720 --> 00:27:57,080 Speaker 1: the end, really. Yeah, it's the idea of, if 477 00:27:57,560 --> 00:28:00,240 Speaker 1: we hear that there's pizza in the kitchen and 478 00:28:00,400 --> 00:28:02,280 Speaker 1: we're all invited to go and eat it, 479 00:28:02,359 --> 00:28:05,320 Speaker 1: then the problem isn't that we have a bunch of 480 00:28:05,359 --> 00:28:07,439 Speaker 1: slow people on staff. We're all very, very fast. The 481 00:28:07,440 --> 00:28:10,400 Speaker 1: problem is the door's only so wide, and eventually four 482 00:28:10,480 --> 00:28:12,280 Speaker 1: or five of us will just try and cram through 483 00:28:12,320 --> 00:28:15,000 Speaker 1: it at the same time. So that's the difference between 484 00:28:15,000 --> 00:28:18,000 Speaker 1: just speed and throughput. Now, ones and zeroes don't 485 00:28:18,040 --> 00:28:21,000 Speaker 1: usually elbow you in the face, that's true, but we 486 00:28:21,119 --> 00:28:27,119 Speaker 1: have no such restriction, as we have demonstrated upon multiple occasions. Now.
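The pizza-door analogy in numbers: transfer time is governed by throughput, not by how fast each individual bit moves. A hypothetical back-of-the-envelope calculation:

```python
# Transfer time depends on throughput (bits per second through the
# "door"), not on per-bit travel speed. Numbers are illustrative only.
def transfer_seconds(size_bytes, throughput_bits_per_sec):
    return (size_bytes * 8) / throughput_bits_per_sec

# A 700 MB CD image over a 56 kbit/s modem vs. a 100 Mbit/s link:
cd = 700 * 1024 * 1024
print(transfer_seconds(cd, 56_000) / 3600)  # roughly 29 hours
print(transfer_seconds(cd, 100_000_000))    # under a minute
```

Every bit in both cases travels at nearly the speed of light; the thousand-fold difference comes entirely from how many bits per second fit through the channel.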
487 00:28:27,880 --> 00:28:30,120 Speaker 1: At this time, engineers were also trying to find 488 00:28:30,480 --> 00:28:32,960 Speaker 1: ways to take on other elements of this theory, like 489 00:28:33,040 --> 00:28:36,800 Speaker 1: the compression and redundancy ideas, and build working devices and 490 00:28:36,880 --> 00:28:41,480 Speaker 1: algorithms that turned that theory into reality, actually making products 491 00:28:41,560 --> 00:28:44,960 Speaker 1: that could take advantage of the ideas that Shannon had produced. 492 00:28:45,560 --> 00:28:50,240 Speaker 1: Meanwhile, Shannon received a very special present at 493 00:28:50,720 --> 00:28:55,400 Speaker 1: Christmas from his wife: a unicycle. And 494 00:28:55,520 --> 00:28:58,240 Speaker 1: stories say that he frequently rode through the halls of 495 00:28:58,280 --> 00:29:01,960 Speaker 1: Bell Labs at night on this unicycle while juggling. He 496 00:29:02,080 --> 00:29:04,400 Speaker 1: is my hero, because why not. Now, see, if 497 00:29:04,440 --> 00:29:07,560 Speaker 1: my wife gave me a unicycle for Christmas, I would 498 00:29:07,600 --> 00:29:10,880 Speaker 1: imagine she was plotting my demise and perhaps had 499 00:29:10,880 --> 00:29:14,200 Speaker 1: taken out yet another life insurance policy on me, because 500 00:29:14,320 --> 00:29:19,400 Speaker 1: she knows my lack of balance. But I 501 00:29:19,800 --> 00:29:23,720 Speaker 1: have nothing but respect for someone who is transforming 502 00:29:23,800 --> 00:29:28,680 Speaker 1: information theory while riding a unicycle and juggling. Juggling. Yeah, 503 00:29:29,120 --> 00:29:33,000 Speaker 1: so meanwhile, he was looking into machine 504 00:29:33,040 --> 00:29:36,840 Speaker 1: intelligence and memory. Yeah, he was really branching out, you know, 505 00:29:36,880 --> 00:29:40,320 Speaker 1: he was.
He was very much interested in exploring all 506 00:29:40,320 --> 00:29:43,440 Speaker 1: these different ideas. Time for us to take another break, 507 00:29:43,600 --> 00:29:54,080 Speaker 1: but we will be right back. Now, by nineteen fifty six, 508 00:29:54,120 --> 00:29:56,960 Speaker 1: he decides to leave Bell Labs, though he continues on 509 00:29:57,120 --> 00:29:59,720 Speaker 1: as a consultant, and he goes back to M I 510 00:29:59,800 --> 00:30:03,440 Speaker 1: T to teach. He also wrote a paper 511 00:30:03,560 --> 00:30:07,800 Speaker 1: called The Bandwagon, and that's when he said 512 00:30:07,880 --> 00:30:10,920 Speaker 1: he didn't really like how the words information theory were 513 00:30:10,960 --> 00:30:14,080 Speaker 1: being thrown around. So essentially what he was saying was 514 00:30:14,120 --> 00:30:17,120 Speaker 1: that they were losing their value. Information theory as a 515 00:30:17,160 --> 00:30:20,360 Speaker 1: concept was losing its value, because companies were using it 516 00:30:20,400 --> 00:30:23,560 Speaker 1: to describe things that didn't really fall within the umbrella 517 00:30:23,640 --> 00:30:26,840 Speaker 1: of information. Yeah, it was a really popular, almost pop 518 00:30:26,920 --> 00:30:29,720 Speaker 1: culture term in the scientific community at the time. 519 00:30:29,800 --> 00:30:32,880 Speaker 1: And I mean, people were publishing papers that had information 520 00:30:32,960 --> 00:30:35,840 Speaker 1: theory in the title just because they thought it sounded cool, 521 00:30:35,920 --> 00:30:38,120 Speaker 1: when in fact, right, it had nothing to do with that.
522 00:30:38,200 --> 00:30:41,719 Speaker 1: So it was kind of like how virtual reality became 523 00:30:41,720 --> 00:30:46,240 Speaker 1: this buzzword that began to lose meaning, particularly when the 524 00:30:46,280 --> 00:30:49,120 Speaker 1: public started to see what the reality of the field 525 00:30:49,280 --> 00:30:52,760 Speaker 1: was as compared to the Hollywood depiction of what virtual 526 00:30:52,840 --> 00:30:55,720 Speaker 1: reality was back in the early nineties. Sure, sure, like 527 00:30:55,800 --> 00:30:59,080 Speaker 1: artificial intelligence. Or, I read an essay recently from the 528 00:30:59,120 --> 00:31:01,240 Speaker 1: guy who coined the term Manic Pixie Dream Girl 529 00:31:01,280 --> 00:31:03,200 Speaker 1: saying that he just wished he had never done that thing. 530 00:31:03,680 --> 00:31:07,880 Speaker 1: I would like to apologize to the world. Yeah, so 531 00:31:07,920 --> 00:31:09,920 Speaker 1: this was one of those interesting things where the paper 532 00:31:10,280 --> 00:31:13,440 Speaker 1: wasn't so much about advancing the concept, but just saying, 533 00:31:14,000 --> 00:31:18,320 Speaker 1: let's use our words carefully and correctly. He said that 534 00:31:18,480 --> 00:31:22,040 Speaker 1: perhaps the term had quote ballooned to an importance beyond 535 00:31:22,120 --> 00:31:24,960 Speaker 1: its actual accomplishments end quote. I think that's a little 536 00:31:24,960 --> 00:31:27,720 Speaker 1: bit modest on his part. Honestly, I think so too, 537 00:31:27,760 --> 00:31:31,960 Speaker 1: considering that, again, without that theory, computers and electronics would 538 00:31:32,000 --> 00:31:35,479 Speaker 1: not work the way they do today. Yeah. But at 539 00:31:35,480 --> 00:31:37,920 Speaker 1: any rate, this kind of marked the beginning of Shannon's 540 00:31:37,920 --> 00:31:43,000 Speaker 1: disappearance from the research and technology scene.
He really 541 00:31:43,000 --> 00:31:46,000 Speaker 1: didn't want to be a celebrity, I think, and he 542 00:31:46,080 --> 00:31:48,760 Speaker 1: had this huge push from the media and the government 543 00:31:48,920 --> 00:31:52,400 Speaker 1: and science in general to be made into one, and 544 00:31:52,600 --> 00:31:55,360 Speaker 1: it kind of pulled him away from both 545 00:31:55,400 --> 00:31:59,240 Speaker 1: research and public education. Right, and it wasn't 546 00:31:59,280 --> 00:32:02,000 Speaker 1: that he was bad at it, from what I understand. Whenever he 547 00:32:02,040 --> 00:32:05,080 Speaker 1: gave talks they were really great, and whenever he wrote papers, 548 00:32:05,120 --> 00:32:08,320 Speaker 1: they were really great. But he was constantly being pressured 549 00:32:08,360 --> 00:32:10,880 Speaker 1: to do that, and it was starting to become more 550 00:32:11,120 --> 00:32:14,160 Speaker 1: of something that would cause him anxiety, as opposed to 551 00:32:14,240 --> 00:32:17,960 Speaker 1: something that he would enjoy doing. Well, in nineteen seventy three, 552 00:32:18,000 --> 00:32:20,360 Speaker 1: the Information Theory Society, which is part of the I 553 00:32:20,600 --> 00:32:25,520 Speaker 1: Triple E, created an annual Shannon Lecture that 554 00:32:25,640 --> 00:32:29,720 Speaker 1: became the Shannon Award. And in nineteen seventy eight, 555 00:32:29,760 --> 00:32:32,760 Speaker 1: Claude Shannon officially retired from M I T, although he had 556 00:32:32,840 --> 00:32:35,920 Speaker 1: not really been actively working there for some years before, 557 00:32:36,120 --> 00:32:39,520 Speaker 1: certainly. And in nineteen eighty seven, Claude Shannon gave 558 00:32:39,600 --> 00:32:44,200 Speaker 1: his last interview to Omni Magazine.
Now, by the late eighties, 559 00:32:44,240 --> 00:32:47,800 Speaker 1: Claude Shannon began to suffer from Alzheimer's and withdrew from 560 00:32:47,800 --> 00:32:51,160 Speaker 1: the public eye entirely. His wife would go and attend 561 00:32:51,280 --> 00:32:55,520 Speaker 1: events in his place, and in February two thousand one, 562 00:32:55,600 --> 00:32:58,240 Speaker 1: at the age of eighty four, he would pass away. Yes, 563 00:32:58,640 --> 00:33:03,640 Speaker 1: there are some very inspiring and moving tributes to 564 00:33:03,680 --> 00:33:06,880 Speaker 1: Claude Shannon that were published, really beautiful things. You can 565 00:33:06,960 --> 00:33:10,640 Speaker 1: certainly go online and read a lot of those 566 00:33:10,680 --> 00:33:15,440 Speaker 1: tributes that were written the week and month following his passing. 567 00:33:16,040 --> 00:33:19,200 Speaker 1: And we have a collection of interesting little trivia that 568 00:33:19,280 --> 00:33:22,400 Speaker 1: we really wanted to fit into the overall episode, 569 00:33:22,440 --> 00:33:25,959 Speaker 1: but it didn't really fit into the timeline. 570 00:33:26,040 --> 00:33:29,080 Speaker 1: I mean, as if it wasn't charming enough, I mean, 571 00:33:29,120 --> 00:33:31,360 Speaker 1: if charming is the correct word. Actually, charming is totally 572 00:33:31,400 --> 00:33:33,640 Speaker 1: the correct word. I find it downright 573 00:33:34,320 --> 00:33:38,520 Speaker 1: charming that he wrote, you know, papers that mathematically proved 574 00:33:38,520 --> 00:33:42,720 Speaker 1: that computers can exist. But other than that, 575 00:33:42,760 --> 00:33:47,840 Speaker 1: there's just a lot of little things. Yeah. So 576 00:33:47,920 --> 00:33:49,880 Speaker 1: one of those things is that, you know, we just 577 00:33:49,920 --> 00:33:53,080 Speaker 1: said he was not big on pursuing the limelight.
578 00:33:53,120 --> 00:33:55,720 Speaker 1: He didn't go after that at all, and 579 00:33:55,720 --> 00:34:00,160 Speaker 1: often he would reluctantly take the stage, but as 580 00:34:00,280 --> 00:34:03,920 Speaker 1: time went on, he did that even less frequently. He 581 00:34:03,920 --> 00:34:07,040 Speaker 1: wouldn't go out very much at all to address 582 00:34:07,120 --> 00:34:10,480 Speaker 1: the public. And according to M I T Technology Review, 583 00:34:10,560 --> 00:34:14,399 Speaker 1: he even had a file labeled letters I've procrastinated too 584 00:34:14,440 --> 00:34:18,080 Speaker 1: long on. So if he got something from colleagues or 585 00:34:18,080 --> 00:34:21,280 Speaker 1: government officials or scientific institutions and it had just been sitting 586 00:34:21,360 --> 00:34:24,480 Speaker 1: around for a really long while, he would just put 587 00:34:24,480 --> 00:34:27,680 Speaker 1: it in that file, saying, well, that's too late, 588 00:34:27,800 --> 00:34:29,719 Speaker 1: and that's never gonna happen, so I'm just gonna put 589 00:34:29,719 --> 00:34:33,719 Speaker 1: that in this file. He, like we said, loved 590 00:34:33,800 --> 00:34:36,279 Speaker 1: to build stuff, to engineer stuff. You know, that whole 591 00:34:36,280 --> 00:34:39,319 Speaker 1: telegraph line story is one of my favorites. Now, as 592 00:34:39,320 --> 00:34:42,440 Speaker 1: a parent, he built a chairlift that would take his 593 00:34:42,600 --> 00:34:46,239 Speaker 1: kids from his house to a nearby lake, so they 594 00:34:46,280 --> 00:34:48,080 Speaker 1: didn't have to walk the whole way to the lake. 595 00:34:48,640 --> 00:34:52,040 Speaker 1: He also, from what I understand, designed a hidden panel 596 00:34:52,120 --> 00:34:55,120 Speaker 1: in his office that didn't lead anywhere at all. He 597 00:34:55,239 --> 00:34:57,640 Speaker 1: just felt like building one. He just needed it.
598 00:34:57,640 --> 00:34:59,960 Speaker 1: It made me think of a Mitchell and Webb sketch: 599 00:35:00,040 --> 00:35:04,919 Speaker 1: this wall must rotate, be both here and not here. 600 00:35:06,239 --> 00:35:09,520 Speaker 1: Look, mate, that's a load bearing wall. But anyway, 601 00:35:09,560 --> 00:35:11,880 Speaker 1: he just decided he wanted to make one. He also 602 00:35:11,960 --> 00:35:16,400 Speaker 1: built a life sized electric mouse named Theseus, after the 603 00:35:16,480 --> 00:35:19,759 Speaker 1: Greek mythology figure, that's the one who was stuck in 604 00:35:19,800 --> 00:35:21,680 Speaker 1: the labyrinth and had to find his way out, with 605 00:35:21,760 --> 00:35:26,040 Speaker 1: the Minotaur, or Minotar depending upon your preferred pronunciation, after him. 606 00:35:26,360 --> 00:35:29,120 Speaker 1: So this mouse, what it would do is it would 607 00:35:29,120 --> 00:35:32,080 Speaker 1: explore a maze and quote unquote remember where it comes from. 608 00:35:32,440 --> 00:35:35,720 Speaker 1: It was going after some little metal cheese bits, 609 00:35:35,800 --> 00:35:38,960 Speaker 1: I think. So the way this mouse would go 610 00:35:39,040 --> 00:35:41,080 Speaker 1: through the maze is it would go down a pathway, 611 00:35:41,080 --> 00:35:46,040 Speaker 1: and whenever the pathway would branch, it would have to pick a route.
Yeah, 612 00:35:46,080 --> 00:35:48,040 Speaker 1: so it would take one, and then 613 00:35:48,320 --> 00:35:52,319 Speaker 1: it could backtrack if it went down an incorrect route, 614 00:35:52,320 --> 00:35:54,120 Speaker 1: and then it could take the path it had not 615 00:35:54,239 --> 00:35:55,880 Speaker 1: taken. As opposed to, you know, if this were just 616 00:35:55,920 --> 00:35:59,760 Speaker 1: an electronic mouse that had some collision detection, it 617 00:36:00,080 --> 00:36:02,600 Speaker 1: could potentially just go back and forth down the same 618 00:36:02,680 --> 00:36:07,160 Speaker 1: little pathway forever. Yeah, but this one knew, okay, well, 619 00:36:07,239 --> 00:36:09,960 Speaker 1: I already took the path that's on the right, so 620 00:36:10,000 --> 00:36:11,720 Speaker 1: I have to take the path that's on the left. 621 00:36:11,960 --> 00:36:14,520 Speaker 1: So it's pretty cool that he built this thing, you know, 622 00:36:14,760 --> 00:36:18,600 Speaker 1: just for the fun of it. He also built 623 00:36:19,000 --> 00:36:23,920 Speaker 1: probably my favorite robotic piece of his: a juggling robot, 624 00:36:24,640 --> 00:36:27,640 Speaker 1: a bounce juggling robot to be precise, a bounce juggling robot 625 00:36:27,640 --> 00:36:31,279 Speaker 1: that juggled like W. C. Fields, to be even more precise. Yeah. 626 00:36:31,400 --> 00:36:34,320 Speaker 1: It was like having a, like, imagine a drumhead, right, 627 00:36:34,640 --> 00:36:37,880 Speaker 1: and the drumhead allows things that are dropped on it, 628 00:36:37,960 --> 00:36:40,160 Speaker 1: like a ball bearing, to be bounced on it. And 629 00:36:40,160 --> 00:36:44,120 Speaker 1: then two little angled platforms that are serving as 630 00:36:44,280 --> 00:36:49,320 Speaker 1: hands that are bouncing these little balls.
Yeah, 631 00:36:49,400 --> 00:36:50,960 Speaker 1: and they just kept it going in a 632 00:36:51,120 --> 00:36:54,080 Speaker 1: bounce juggling pattern perfectly. And he basically made it out 633 00:36:54,120 --> 00:36:56,640 Speaker 1: of, like, Erector Set pieces. Yeah, you know, just like 634 00:36:56,719 --> 00:36:58,200 Speaker 1: you do. And then he wrote a paper on the 635 00:36:58,280 --> 00:37:01,160 Speaker 1: dynamics of keeping multiple objects in the air simultaneously. It's 636 00:37:01,200 --> 00:37:04,040 Speaker 1: pretty famous within the juggling community. I tried to read 637 00:37:04,040 --> 00:37:06,319 Speaker 1: it when I wrote How Juggling Works for How 638 00:37:06,360 --> 00:37:08,120 Speaker 1: Stuff Works dot com. In fact, if you go to 639 00:37:08,239 --> 00:37:11,160 Speaker 1: How Stuff Works and you look 640 00:37:11,200 --> 00:37:14,520 Speaker 1: up How Juggling Works, there's a video of me juggling 641 00:37:14,960 --> 00:37:17,560 Speaker 1: in that article. I still say, because 642 00:37:17,560 --> 00:37:19,160 Speaker 1: I juggle a little bit, that we 643 00:37:19,200 --> 00:37:21,960 Speaker 1: really need to do a video of us both juggling. All right, 644 00:37:22,520 --> 00:37:26,600 Speaker 1: I juggled torches in mine. You're ready to pick those up? Okay, well, 645 00:37:26,640 --> 00:37:30,200 Speaker 1: we'll start small. He also made a robot 646 00:37:30,239 --> 00:37:33,560 Speaker 1: that could solve a Rubik's cube, which is pretty amazing. 647 00:37:33,680 --> 00:37:37,160 Speaker 1: I can't solve one myself. I know 648 00:37:37,200 --> 00:37:40,759 Speaker 1: there are algorithms for how to solve it the most efficiently, 649 00:37:41,080 --> 00:37:43,000 Speaker 1: and I've seen people who are really good at it, who 650 00:37:43,040 --> 00:37:46,279 Speaker 1: just, like, it's like magic, you know.
The 651 00:37:46,280 --> 00:37:48,640 Speaker 1: way I solve a Rubik's Cube is by peeling the 652 00:37:48,680 --> 00:37:52,960 Speaker 1: stickers off and then replacing them properly. I cheat. But yeah, no, 653 00:37:53,080 --> 00:37:55,560 Speaker 1: he created a robot that could follow these algorithms 654 00:37:55,560 --> 00:37:58,279 Speaker 1: and also just recognize what the pattern was on any 655 00:37:58,320 --> 00:38:00,239 Speaker 1: given side, so it could, you know, execute the 656 00:38:00,320 --> 00:38:03,319 Speaker 1: rules it needed to solve it. Um. And he made 657 00:38:03,320 --> 00:38:07,960 Speaker 1: a calculator that worked with Roman numerals. It was called THROBAC, 658 00:38:08,480 --> 00:38:13,399 Speaker 1: which stood for Thrifty Roman-Numeral Backward-Looking Computer. Um. 659 00:38:13,640 --> 00:38:17,680 Speaker 1: Also rocket-powered Frisbees and a motorized pogo stick. Yes, 660 00:38:17,920 --> 00:38:20,279 Speaker 1: the motorized pogo stick. I was thinking, like, again, that 661 00:38:20,320 --> 00:38:24,319 Speaker 1: sounds terrible. If the unicycle hadn't killed me already, that 662 00:38:24,440 --> 00:38:29,360 Speaker 1: certainly would. He built the Ultimate Machine. My favorite machine 663 00:38:29,480 --> 00:38:32,399 Speaker 1: of all time is the Ultimate Machine. All right, tell 664 00:38:32,440 --> 00:38:35,120 Speaker 1: us about it, Jonathan. All right. Now, imagine you have 665 00:38:35,600 --> 00:38:38,799 Speaker 1: before you a box, and on that box you can 666 00:38:38,800 --> 00:38:41,400 Speaker 1: see the outline of a trap door. And the only 667 00:38:41,480 --> 00:38:45,560 Speaker 1: other really interesting feature on this box is a simple 668 00:38:45,680 --> 00:38:49,240 Speaker 1: switch that's switched to off, and you push the switch 669 00:38:49,280 --> 00:38:52,839 Speaker 1: to on.
The trap door opens, and a hand emerges 670 00:38:53,000 --> 00:38:56,279 Speaker 1: from beneath the trap door and flips the switch back 671 00:38:56,320 --> 00:38:58,479 Speaker 1: to the off position, withdraws back inside, and the trap 672 00:38:58,520 --> 00:39:02,120 Speaker 1: door closes. That's it. That's it. Hit the switch and 673 00:39:02,160 --> 00:39:04,399 Speaker 1: the arm comes back out. You hit the switch, the arm 674 00:39:04,440 --> 00:39:08,399 Speaker 1: comes back out. Uh, I want to share this video too. 675 00:39:08,440 --> 00:39:11,880 Speaker 1: There's a video of a brilliant variation of the Ultimate 676 00:39:11,920 --> 00:39:17,480 Speaker 1: Machine that is hysterically funny. It doesn't just do that, like, 677 00:39:17,520 --> 00:39:20,200 Speaker 1: it starts to do it, so, um, it ends up 678 00:39:20,239 --> 00:39:23,080 Speaker 1: at first looking like it's a variation on the Ultimate Machine, like, oh, 679 00:39:23,160 --> 00:39:25,520 Speaker 1: that's cute. But then it starts doing other things too, 680 00:39:25,560 --> 00:39:28,080 Speaker 1: because this particular box had wheels on it and could 681 00:39:28,160 --> 00:39:30,160 Speaker 1: move around, so it starts avoiding the 682 00:39:30,160 --> 00:39:33,120 Speaker 1: person who's trying to hit the switch, or it would 683 00:39:33,440 --> 00:39:36,840 Speaker 1: play back prerecorded messages saying, like, hey, hands off, buddy, that 684 00:39:36,880 --> 00:39:40,080 Speaker 1: kind of stuff, and it was really, really entertaining. So we'll 685 00:39:40,120 --> 00:39:41,880 Speaker 1: share that one as well. But you have to remember 686 00:39:41,960 --> 00:39:45,680 Speaker 1: that that particular very entertaining machine is based on this 687 00:39:45,719 --> 00:39:47,759 Speaker 1: thing that Claude Shannon built for no reason other than 688 00:39:47,800 --> 00:39:50,400 Speaker 1: that it tickled him, just because he could. Um.
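[Editor's note: the maze mouse's backtracking behavior described a bit earlier — remembering which branch it already tried and taking the other one at each fork — is essentially what programmers call a depth-first search. Here is a minimal sketch of that idea in Python; this is purely illustrative, since Shannon's actual mouse did its "remembering" with relay hardware, not software.]

```python
# Depth-first search with backtracking: at each fork, try a path
# you have NOT taken yet; if it dead-ends, unwind to the previous
# fork and try the other branch. Illustrative sketch only.

def solve_maze(maze, start, goal):
    """maze: set of open (row, col) cells. Returns a path or None."""
    visited = set()

    def explore(cell, path):
        if cell == goal:
            return path
        visited.add(cell)                      # remember: this branch is taken
        r, c = cell
        for nxt in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if nxt in maze and nxt not in visited:
                found = explore(nxt, path + [nxt])
                if found:
                    return found
        return None                            # dead end: backtrack

    return explore(start, [start])

# A tiny maze: a 2x3 grid with one blocked cell, so there's a fork.
open_cells = {(0, 0), (0, 1), (0, 2), (1, 0), (1, 2)}
print(solve_maze(open_cells, (0, 0), (1, 2)))
# -> [(0, 0), (0, 1), (0, 2), (1, 2)]
```

Because visited cells are never re-entered, the search can't shuttle back and forth down the same corridor forever — exactly the failure mode described for a mouse with collision detection but no memory.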
He also 689 00:39:50,560 --> 00:39:54,879 Speaker 1: had a collection of exotic unicycles, including some that were, 690 00:39:54,960 --> 00:39:57,279 Speaker 1: because he was wondering, how small could you make 691 00:39:57,280 --> 00:40:01,359 Speaker 1: a unicycle before someone would be unable to ride it? Uh, 692 00:40:01,480 --> 00:40:05,120 Speaker 1: for me, that's any size. Me too, 693 00:40:05,200 --> 00:40:07,279 Speaker 1: that would be any size. But assuming that you are 694 00:40:07,360 --> 00:40:09,920 Speaker 1: capable of riding a unicycle, how small could you go 695 00:40:10,040 --> 00:40:12,799 Speaker 1: before you could no longer maintain your balance? In fact, 696 00:40:12,800 --> 00:40:17,600 Speaker 1: he had a couple that I've heard are essentially unridable. Uh. 697 00:40:17,680 --> 00:40:20,600 Speaker 1: He also lectured on applying information theory 698 00:40:20,640 --> 00:40:23,920 Speaker 1: to playing the stock market, though he never really published 699 00:40:23,920 --> 00:40:25,680 Speaker 1: any work on this. He did do a lecture, but 700 00:40:25,760 --> 00:40:28,480 Speaker 1: he didn't write a paper. He also did really well 701 00:40:28,480 --> 00:40:31,920 Speaker 1: in the stock market himself, although he wasn't necessarily employing 702 00:40:31,960 --> 00:40:36,480 Speaker 1: information theory to do so. He was investing in companies 703 00:40:36,520 --> 00:40:39,839 Speaker 1: that friends of his were starting. Yeah, he made some very savvy 704 00:40:39,880 --> 00:40:43,560 Speaker 1: stock purchases based on amazing work that his friends were doing.
705 00:40:43,760 --> 00:40:46,279 Speaker 1: These are the people who were inventing, like, 706 00:40:46,320 --> 00:40:49,279 Speaker 1: the basic components of computers and electronics, going on to 707 00:40:49,320 --> 00:40:52,319 Speaker 1: form their own companies, and he would invest in those, 708 00:40:52,400 --> 00:40:55,200 Speaker 1: and then they ended up being these enormous companies 709 00:40:55,200 --> 00:40:58,200 Speaker 1: we know today. So he did quite well. Uh. And 710 00:40:58,239 --> 00:41:02,120 Speaker 1: there's no Nobel Prize for mathematics, which is why Claude 711 00:41:02,120 --> 00:41:04,799 Speaker 1: Shannon never won one, right. But he certainly did win 712 00:41:05,400 --> 00:41:08,680 Speaker 1: a number of awards, probably way too numerous to mention 713 00:41:08,800 --> 00:41:11,120 Speaker 1: here, but one that we wanted to mention 714 00:41:11,880 --> 00:41:15,239 Speaker 1: is the very first Kyoto Prize, which was created in 715 00:41:15,320 --> 00:41:18,600 Speaker 1: Japan to honor contributions in fields like mathematics. Essentially, it 716 00:41:18,640 --> 00:41:22,080 Speaker 1: was supposed to be the Nobel Prize for mathematics, right, right. 717 00:41:22,160 --> 00:41:23,840 Speaker 1: And this was all the way back in the nineteen eighties 718 00:41:23,920 --> 00:41:26,399 Speaker 1: when it came into existence. Yeah, the very first one 719 00:41:26,640 --> 00:41:29,080 Speaker 1: went to Claude Shannon, and from what I understand, it 720 00:41:29,080 --> 00:41:32,560 Speaker 1: actually came with an even larger cash prize than the 721 00:41:32,560 --> 00:41:35,120 Speaker 1: Nobel Prize does. So if you feel 722 00:41:35,160 --> 00:41:38,799 Speaker 1: like he was snubbed because Nobel Prizes don't 723 00:41:39,040 --> 00:41:42,600 Speaker 1: recognize mathematics, fear not: the Kyoto Prize had him covered.
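[Editor's note: the juggling paper mentioned earlier contains what is now known as Shannon's juggling theorem, a simple bookkeeping identity relating throw times to the number of balls and hands. In the standard notation:]

```latex
(F + D)\,H = (V + D)\,N
```

Here $F$ is the time a ball spends in flight, $D$ is the dwell time a ball rests in a hand, $V$ is the time a hand sits vacant, $N$ is the number of balls, and $H$ is the number of hands. Both sides count the same total time per juggling cycle, once from the hands' point of view and once from the balls', which is why the pattern's timing is so tightly constrained.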
724 00:41:43,520 --> 00:41:46,600 Speaker 1: I hope you guys enjoyed this classic episode, Who Was 725 00:41:46,680 --> 00:41:49,880 Speaker 1: Claude Shannon?, published back in August of two thousand fourteen. 726 00:41:50,760 --> 00:41:53,440 Speaker 1: If you have suggestions for topics I should cover in 727 00:41:53,480 --> 00:41:56,760 Speaker 1: future episodes of TechStuff, whether it's a technology, a trend, 728 00:41:56,960 --> 00:42:00,000 Speaker 1: maybe it's another important person in the field of 729 00:42:00,000 --> 00:42:02,920 Speaker 1: technology and you feel like this person hasn't really, you know, 730 00:42:03,200 --> 00:42:06,759 Speaker 1: received the full treatment that they should, let me know, 731 00:42:06,960 --> 00:42:08,480 Speaker 1: reach out to me. The best way to do that 732 00:42:08,600 --> 00:42:10,759 Speaker 1: is over on Twitter. The handle for the show is 733 00:42:10,840 --> 00:42:13,680 Speaker 1: TechStuff HSW, and I'll talk to you 734 00:42:13,719 --> 00:42:23,080 Speaker 1: again really soon. TechStuff is an iHeartRadio production. 735 00:42:23,320 --> 00:42:26,160 Speaker 1: For more podcasts from iHeartRadio, visit the 736 00:42:26,280 --> 00:42:29,480 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 737 00:42:29,560 --> 00:42:30,480 Speaker 1: your favorite shows.