Welcome to TechStuff, a production of iHeartRadio's How Stuff Works.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. Today we're going to look at a classic episode that aired back in July of two thousand twelve, July sixth to be precise, and it is titled TechStuff Looks at Supercomputers. This was a fun discussion to have with my former co-host Chris Pollette, because I, you know, grew up in an era of supercomputers and only had sort of a vague idea of what that term meant for many years. I'm sure there was a time when I was a kid where I thought of it as a computer that wore a cape. But as it turns out, it gets a little more complicated than that. Let's listen in.

So, Chris, if I were to ask you, just off the top of your head, how would you define a supercomputer? What would you say?

Well, if I hadn't already made the joke, I would have said it was a computer in the cape and tights. But no, honestly, I would say a supercomputer
is a computer that can do a lot more calculations in a shorter period of time than the machines sitting on our desktops.

Yeah. I think of it as sort of the bleeding edge of what a computer is capable of doing. Something that still fills a room, even though typical computers these days don't need to fill a room; it's not that it's that big, it's that it still has that much computing power.

Right, right. And the term comes from the nineteen sixties. In order to really kind of understand the span of this, I thought I would talk a little bit about the last computer I could find that was a powerful computer that existed before people started talking about supercomputers, which was the IBM 7030 Stretch.

Yes, that was the one that was made with elastic. Gain a couple of pounds, your computer can still, you know, fit.

It was the Mister Fantastic of the computer world. No, because it was not a supercomputer. It took up two thousand square feet back in the day, this being the early sixties. Two thousand square feet.
It cost thirteen million dollars, which, if you were to translate to today's cash, would be ninety-one million dollars. It's a lot of money.

It's a lot of cash. So that was the fastest computer at the time, until a fellow named Seymour Roger Cray showed up.

Yes, Mr. Cray. Yeah, Cray ends up being a big name in supercomputers, particularly in the sixties, seventies, and up to the mid-eighties. That was the name in supercomputers. And he was working for a company called Engineering Research Associates, or ERA, which actually grew out of a naval operation, that being the U.S. Navy, not belly button.

I was gonna... you were looking at me like you expected the joke.

No, no, not that. It was a Navy project. How about that? As in the military force, not the color. It was a Navy project that was all about code breaking. All right, so there was this project about code breaking that eventually kind of spun off and became an actual company all on its own called Engineering Research Associates, and it branched out beyond code breaking, although it took all the code breaking work it could get.
Yeah, we talked about the Enigma some episodes back, and we were talking about the bombe. And yeah, that was really the early application for supercomputers: needing to crunch a lot of data very quickly. There weren't the kind of applications that we have now; we'll get into that, I'm sure, in a few minutes. But yeah, I mean, why would you need a supercomputer? That's probably about the only thing I could think of where people were needing to crunch that kind of information as quickly as possible.

And defense. Yeah, typically, especially with the early supercomputers, they were really designed for very specialized computing. Not necessarily specialized from the ground up for one particular type of computing, but they were not meant to be general computers. They were meant to do...

No admiral computers, because they were the Navy.

That's true. No, they were meant to do a specific task very, very well, and that's all they were meant to do.
Now, Cray had an interesting philosophy. He said, and this is a quote from him, "Anyone can build a fast CPU. The trick is to build a fast system." And that was the secret to creating the first supercomputer. He realized that if you created a processor that was really, really fast, it did not matter if it couldn't get the data it needed to execute operations upon fast enough. So he saw the need to create a system that would move data through very, very quickly; not just process data, but move it. That means it needs a lot of memory, it needs a very fast pathway from memory to processor. There are a lot of pieces that have to be put in place, and he saw this very early on. And so, using that philosophy, he designed a computer back in nineteen sixty-two that was called the CDC 6600. Now, CDC stands for Control Data Corporation.

Yeah. ERA was taken over by Remington Rand.
And that's the name I remember, because, you know, I still remember a lot of those old machine names from stuff that I found in my dad's collection. Of course, he was, you know, a mechanical engineer before he retired, so he was interested in all kinds of machines. And I didn't know what I was looking at at the time, of course, but there were all these science and computing magazines laying around, and that name I recognized. Also Unisys, because Remington Rand became Unisys, and probably a lot more of our listeners are familiar with that name. But Cray partnered with William Norris to start Control Data Corporation back in nineteen fifty-seven, and really at that point, the UNIVAC from Remington Rand and IBM were the computing companies. And you know, IBM has been the heavyweight for so long, but CDC was the first, you know, upstart to really make a dent in their stranglehold on the industry.
And Cray wanted to join CDC fairly early on, but apparently he was needed for a project that would not let him leave exactly when he wanted to. So once he did leave, that's when he designed the CDC 6600, which was officially announced in nineteen sixty-four. So designed in sixty-two, announced two years later. And it was the first commercially successful supercomputer, with a price tag of between seven and eight million dollars, sometimes going up as high as ten million, depending upon the configuration that the customer wanted. Now, in today's cash, that would equal about sixty million dollars, so thirty-one million dollars cheaper in today's money than the Stretch computer, and it was actually much more powerful. It had four hundred thousand transistors and one hundred miles of wiring, and it was the size of about four filing cabinets, so it was also significantly smaller than the Stretch. It didn't take up two thousand square feet. The clock speed was around a hundred nanoseconds, and it had sixty-five thousand sixty-bit words of memory. So this is kind of an odd time in computing.
We hadn't really settled on the thirty-two or sixty-four bit kind of model; this was before that. It also used six high-speed drums as sort of a temporary storage area, it had central storage that used magnetic tape, and it used the FORTRAN 66 compiler. The equivalent in today's machines means that it would have about a ten megahertz processor.

Yeah, well, it could work up to forty megahertz in speed.

Well, it could do three million floating point operations per second.

Yeah, so three million, that would be three megaflops, right. We're gonna get into lots of different flop terms later as well, and they get incredibly huge. Of course, you have to keep it cool, because otherwise it breaks out into a flop sweat.

That's true. Well, not the flop sweat part, but you do have to keep it cool. As we know with electronics, when you're running electricity through them, one of the byproducts is heat, and heat, as it turns out, is not a great thing for electronic components.
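As a quick aside on the "flop terms" that come up throughout this episode: FLOPS just counts floating point operations per second, and the prefixes (mega, giga, and so on) scale it by powers of a thousand. Here's a minimal sketch of that arithmetic; the 6600's three megaflops is from the episode, while the other figures are just illustrative values.

```python
# FLOPS = floating point operations per second.
# Each prefix is a factor of 1,000 over the previous one.
PREFIXES = [
    ("megaflops", 1e6),
    ("gigaflops", 1e9),
    ("teraflops", 1e12),
]

def describe(flops):
    """Express a raw FLOPS figure using the largest prefix that fits."""
    name, scale = PREFIXES[0]
    for candidate_name, candidate_scale in PREFIXES:
        if flops >= candidate_scale:
            name, scale = candidate_name, candidate_scale
    return f"{flops / scale:g} {name}"

print(describe(3e6))     # CDC 6600: "3 megaflops"
print(describe(1.5e9))   # illustrative: "1.5 gigaflops"
```

So "three million floating point operations per second" and "three megaflops" are the same statement, just with the unit scaled.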
It can make stuff expand, contacts can lose connections, and then that stuff starts to malfunction; an entire system could shut down. So the CDC had a cooling system that was provided by a special chemical: Freon. Yeah, they used Freon to cool the system. In fact, they would use Freon for a while before finally having to switch to a different coolant, because Freon just wasn't efficient enough eventually. At the time, though, it was still doing the job.

So Cray was also an innovator in another way. The IBM Stretch was sort of a hybrid machine; it had transistors and vacuum tubes in it, and that's, I think, one of the reasons why Cray's machines were smaller. The 1604, which preceded the 6600, was one of the very first to use transistors only, so the 6600 was also a transistor machine, and it would take up a lot less space than the vacuum tubes. And I would imagine, based on my knowledge, my personal knowledge of vacuum tubes, it might have been a little cooler simply because of that.
Yeah, I would imagine that they wouldn't have had to have as dramatic an A/C system to keep the room bearable, because vacuum tubes do put off a lot of heat. Another interesting IBM and CDC connection here is that Thomas Watson Jr., who was IBM's CEO at the time, wrote a famous memo to IBM employees. He said: "Last week Control Data announced the 6600 system. I understand that in the laboratory developing the system there are only thirty-four people, including the janitor. Of these, fourteen are engineers and four are programmers. Contrasting this modest effort with our vast developmental activities, I fail to understand why we have lost our industry leadership position by letting someone else offer the world's most powerful computer." Cray's response, reportedly, was, well, there's your problem. Essentially, Cray was saying that, you know, perhaps IBM's approach was a little burdened by size; that IBM had grown so large that managing a project like this was very difficult to do, because the company was just too big.
So that's an interesting idea, that an organization needed to be kind of small and nimble in order to pull off something like creating the world's fastest computer. He followed up the CDC 6600 with the 7600, which had sixty-five thousand five hundred thirty-six sixty-bit words of memory and a clock speed of twenty-seven nanoseconds, and in practice it actually ran about five times faster than the 6600. But then Cray left CDC and formed his own company, Cray Research, and in nineteen seventy-six he introduced the Cray-1. If you've ever heard of the Cray supercomputer, that's what this is; the Cray-1 was the first of those. Its processor ran at eighty megahertz, and back at this time these supercomputers were still using a single CPU, so that was kind of interesting too; these were single-CPU systems. So it had an eighty megahertz processor and was a sixty-four-bit system.
It ran at a hundred thirty-six megaflops, so a hundred thirty-six million floating point operations per second, and it had one thousand six hundred sixty-two printed circuit boards that made up the components of this computer. It cost between five and eight million dollars, depending on how you wanted it set up, and in today's cash that's about twenty-five million dollars. So we see that the processor speed is increasing and the price is coming down. Often the size of the computer is decreasing as well, although that also flip-flops over the years, because while solid state electronics definitely brought the size down, eventually the way we pack in more speed requires more space. But we'll get into that. Okay, so after the Cray-1 came the Cray X-MP.

Yeah, this is interesting, because in addition to knowing that all of the components, the entire machine, were important, and not just the processor, Cray also realized early on that parallel processing could speed things up.
Now it's common for us to have multi-core processors in our desktop machines or laptops, or, in fact, now we're starting to see them in our mobile devices. But, you know, at the time, in the seventies and eighties, this was still something sort of newish, and it's not something that everybody realized. So the X-MP actually was two Cray-1 computers linked together, and using those two machines together in a multiprocessing effort, they could triple the performance of just one Cray-1, which is something interesting to note. And the Cray-2 had four processors in the same machine, and that was the first to exceed one billion flops, as Britannica tells me.

Yeah, it actually could have up to eight CPUs, the Cray-2. These machines often were upgraded over time, so the initial specs you would get when they were first released were one thing, and then by the end of the production run they would be better. Which makes sense; I mean, we see that in computers all the time.
We tend to call them different model numbers now, but the same sort of thing happens. So back in nineteen eighty-two, you had the Cray X-MP with a hundred five megahertz CPUs running around two hundred megaflops each. Then, since it had up to four CPUs, you could get eight hundred megaflops going, and that was pretty impressive. It had the equivalent, by the way, of a hundred and twenty-eight megabytes of RAM. So, yeah, you think about that: a hundred and twenty-eight megabytes of RAM in nineteen eighty-two was considered bleeding edge for a supercomputer. And the storage units for the Cray X-MP were the size of a file cabinet, and they could hold up to twelve gigs of storage.

I have a flash drive in my bag with me that has eight gigs.

Yeah, and you can find twenty-gig or more flash drives, which, you know, you think about that, that's something that is small enough for you to carry on a key chain.
Well, back in nineteen eighty-two you had a file-cabinet-sized device that could hold twelve gigs, and that was considered massive, like a massive amount of information. So yeah, time really does change things, doesn't it? So, yeah, the Cray-2. That's when they switched from Freon to Fluorinert as their coolant.

I'm sorry, but that sounds like a made-up alien name from an animated movie.

Technically, all names are made up.

I know. That... you just blew my mind.

What if there were no hypothetical questions?

So, the Fluorinert. The reason why they switched was because, at that point, they had packed the components so tightly together that Freon was not efficient enough to cool them. So they switched from Freon to Fluorinert. And that's a little Fluorinert I've had around somewhere. And then they also had to figure out a new way to access the memory on the Cray-2, because at this point they had reached that point that Cray had mentioned earlier, about creating a CPU that can process information faster than it can pull information in.
So they would actually dedicate processors to getting data from memory and funneling it into the central processing units. And this was really important. It's what kind of led the way into threading and loading memory: CPUs that have the capability to load information from memory, preloading things, that kind of came out of this work. In fact, a lot of the advances that we see in personal computers are really possible because of the pioneering work that was done in supercomputers. It was stuff that found its way from the engineering of supercomputers into personal computers, often at a completely different scale, but with a similar approach. Now, after the Cray-2, that's when Japan started to produce some supercomputers that were actually faster than anything that was being produced in the United States. Up until this point, the United States had dominated the supercomputer industry. So, again, the Cray craze, if you will, lasted from the sixties all the way into the eighties.
Well, in nineteen ninety-six, Japan introduced the SR2201, which had two thousand forty-eight processors. So remember, the Cray-2, that was up to eight processors. The SR2201: two thousand forty-eight processors.

Yep. I count two thousand forty more processors with that computer than with the Cray-2, though my math could be off; I'm an English major.

And it could have up to six hundred gigaflops of processing. That's kind of crazy.

Yeah. I also feel like we would be remiss if we didn't mention the efforts of Danny Hillis. W. Daniel Hillis was a grad student at MIT, the Massachusetts Institute of Technology, when he realized that distributed computing was the way of the future, if you will. He started Thinking Machines Corporation in nineteen eighty-three, and the CM-1, which was the first of his machines, came out in nineteen eighty-five. It had sixty-five thousand five hundred thirty-six one-bit processors, grouped sixteen to a chip.

Interesting. That's a really interesting approach: tiny, tiny processors.
Yeah, that didn't come across in my research, which is why my mind is really going as I'm thinking about that sort of architecture. That's really an interesting approach.

Well, it's interesting to see how different our notes are. See, Jonathan and I do our research separately on purpose, so that we come up with different things in some cases, so it's funny that I would have come across that. Also, well, I think of Danny Hillis because I've seen his name a lot in things like the Long Now Foundation, and he hangs out with people like Stewart Brand and Kevin Kelly, fascinating people. But anyway, yeah, that was one of his contributions, and you see that again in today's machines; I mean, we have this with us every day. But, you know, this is when we started to realize that you don't necessarily have to go by Moore's Law and wait until next year's chip comes out with twice as many processors on it. You can do this by dividing up the work. Yeah.
And 341 00:21:20,200 --> 00:21:23,359 Speaker 1: in fact, that's another good point about the 342 00:21:23,480 --> 00:21:27,719 Speaker 1: SR2201, the computer from Japan, because in 343 00:21:27,840 --> 00:21:31,280 Speaker 1: order to use those two thousand, forty-eight processors, there 344 00:21:31,400 --> 00:21:34,440 Speaker 1: was a new development in computer science, which was called 345 00:21:34,600 --> 00:21:38,720 Speaker 1: multiple instruction, multiple data, or MIMD. Yes. Now, 346 00:21:38,760 --> 00:21:41,760 Speaker 1: this is the idea of being able to solve problems 347 00:21:41,800 --> 00:21:46,119 Speaker 1: by pulling in information from memory and feeding it 348 00:21:46,160 --> 00:21:50,239 Speaker 1: to different processors that are all using different operations on 349 00:21:50,280 --> 00:21:55,720 Speaker 1: that data to come to a single solution. Not necessarily 350 00:21:55,720 --> 00:21:57,800 Speaker 1: a single solution, but I'm using that as 351 00:21:57,800 --> 00:22:02,919 Speaker 1: an example for this explanation. So this 352 00:22:02,960 --> 00:22:06,120 Speaker 1: MIMD approach is what allowed us to develop 353 00:22:06,240 --> 00:22:10,200 Speaker 1: multi-core processors, because in this case we're still talking 354 00:22:10,200 --> 00:22:14,359 Speaker 1: about single processors that are all grouped together.
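[Editor's note: the MIMD pattern described here, separate instruction streams each running its own operation on its own data, with the partial results combined into one answer, can be sketched in a few lines of Python. This is a minimal illustration under assumptions of my own; the worker functions and names are invented, and a real supercomputer coordinates thousands of physical processors, not two threads.]

```python
import threading

# MIMD sketch: each "processor" (here, a thread) runs a
# different instruction stream on a different piece of data.
results = {}

def sum_worker(data):
    results["sum"] = sum(data)   # one instruction stream

def max_worker(data):
    results["max"] = max(data)   # a completely different one

def run_mimd():
    workers = [
        threading.Thread(target=sum_worker, args=([1, 2, 3, 4],)),
        threading.Thread(target=max_worker, args=([9, 5, 7],)),
    ]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    # Combine the independent partial results into one answer.
    return dict(results)

print(run_mimd())  # {'sum': 10, 'max': 9}
```

[The contrast is with SIMD, where every processor applies the same operation in lockstep; MIMD lets each processor follow its own program, which is what made machines like the SR2201 flexible.]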
Eventually we 355 00:22:14,400 --> 00:22:16,400 Speaker 1: will get to the point where we have multi-core 356 00:22:16,480 --> 00:22:19,880 Speaker 1: processors, where a single processor has multiple cores and each 357 00:22:20,000 --> 00:22:23,280 Speaker 1: core can work on part of a problem, or separate 358 00:22:23,320 --> 00:22:28,000 Speaker 1: problems, to solve things faster, to get to a 359 00:22:28,080 --> 00:22:31,280 Speaker 1: conclusion faster than they would if it was just one 360 00:22:31,400 --> 00:22:35,119 Speaker 1: single processor, even if it was a really, really fast 361 00:22:35,200 --> 00:22:39,520 Speaker 1: processor working on a series of problems. So 362 00:22:39,600 --> 00:22:43,800 Speaker 1: I always use this analogy. Imagine that you have one 363 00:22:44,040 --> 00:22:49,040 Speaker 1: super smart math genius taking a math test, and the 364 00:22:49,080 --> 00:22:52,120 Speaker 1: math genius is going through and solving all of these problems, 365 00:22:52,600 --> 00:22:56,600 Speaker 1: and he or she is able to do this flawlessly, 366 00:22:58,000 --> 00:22:59,639 Speaker 1: able to solve all the problems, but it takes a 367 00:22:59,640 --> 00:23:02,520 Speaker 1: certain amount of time to get through the test. Then you give 368 00:23:02,560 --> 00:23:06,800 Speaker 1: that same test to four above-average math students. They're 369 00:23:06,800 --> 00:23:09,640 Speaker 1: not geniuses, but they can hold their own. 370 00:23:10,440 --> 00:23:12,760 Speaker 1: And you divide it up, say, all right, you take 371 00:23:12,800 --> 00:23:15,840 Speaker 1: this one-fourth of the test, you take this quarter, 372 00:23:15,960 --> 00:23:18,199 Speaker 1: you take this quarter, and you take that quarter, and 373 00:23:18,240 --> 00:23:21,240 Speaker 1: the four students together start to work.
Those four students 374 00:23:21,280 --> 00:23:23,240 Speaker 1: are very likely going to be able to finish the 375 00:23:23,400 --> 00:23:25,920 Speaker 1: entirety of that test much faster, each of them working 376 00:23:25,920 --> 00:23:28,920 Speaker 1: on a quarter of it, rather than the genius who 377 00:23:28,960 --> 00:23:31,000 Speaker 1: is working on the full thing at the same time. 378 00:23:31,000 --> 00:23:33,240 Speaker 1: Even though the genius is smarter and can work faster 379 00:23:33,320 --> 00:23:37,280 Speaker 1: on each individual problem, collectively those four students are going 380 00:23:37,320 --> 00:23:41,600 Speaker 1: to solve that test faster. That's the philosophy behind both 381 00:23:42,119 --> 00:23:46,680 Speaker 1: grouping cores together into a parallel processing unit 382 00:23:47,240 --> 00:23:50,320 Speaker 1: or taking a multi-core approach to a CPU. Yep, 383 00:23:50,440 --> 00:23:52,920 Speaker 1: and you can thank Danny Hillis for figuring 384 00:23:53,000 --> 00:23:58,000 Speaker 1: out the idea of massively parallel computing. But you 385 00:23:58,040 --> 00:24:01,760 Speaker 1: know, that's a problem too, because instead of 386 00:24:01,800 --> 00:24:05,159 Speaker 1: having two machines running side by side and linked together, 387 00:24:05,840 --> 00:24:07,760 Speaker 1: now you have to figure out how you're going to 388 00:24:07,880 --> 00:24:11,800 Speaker 1: parcel out all those instructions among all those different processors. So 389 00:24:11,840 --> 00:24:15,040 Speaker 1: you have to have the software, or the operating system, 390 00:24:15,080 --> 00:24:20,679 Speaker 1: that will give instructions to each of the processors, 391 00:24:20,840 --> 00:24:23,760 Speaker 1: essentially directing traffic.
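[Editor's note: the math-test analogy maps directly onto the basic data-parallel pattern: split the problem into chunks, give each worker a chunk, and staple the partial answers back together. A minimal Python sketch under assumptions of my own; the squaring stand-in and the helper names are invented, and the schedulers on real machines do far more sophisticated traffic directing.]

```python
from concurrent.futures import ThreadPoolExecutor

def solve_chunk(problems):
    # One "student" works through their share of the test.
    # Squaring stands in for whatever the real work is.
    return [p * p for p in problems]

def solve_test(problems, workers=4):
    # Divide the test into roughly equal quarters.
    size = (len(problems) + workers - 1) // workers
    chunks = [problems[i:i + size] for i in range(0, len(problems), size)]
    # All four students work on their quarters at once.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(solve_chunk, chunks)
    # Merge the answer sheets back together, in order.
    return [answer for part in partials for answer in part]

print(solve_test([1, 2, 3, 4, 5, 6, 7, 8]))  # [1, 4, 9, 16, 25, 36, 49, 64]
```

[The merge step matters: `pool.map` hands the chunks back in the order they were submitted, so the combined answer sheet comes out in test order even though the workers finished in whatever order they happened to.]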
Yes, this is 392 00:24:23,840 --> 00:24:25,760 Speaker 1: not a simple thing 393 00:24:25,800 --> 00:24:28,639 Speaker 1: to figure out. It reminds me of Intel's tick-tock 394 00:24:28,800 --> 00:24:33,119 Speaker 1: approach to developing processors. You think of the tick being 395 00:24:33,320 --> 00:24:38,800 Speaker 1: the physical machinery that's going to do the processing, and 396 00:24:38,840 --> 00:24:41,919 Speaker 1: you think of the tock as the software that's optimized 397 00:24:42,000 --> 00:24:45,040 Speaker 1: to work on that physical hardware to make it really 398 00:24:45,080 --> 00:24:47,480 Speaker 1: live up to its potential. And then you have another 399 00:24:47,520 --> 00:24:50,720 Speaker 1: tick, where you've got an advancement in the physical hardware, 400 00:24:51,000 --> 00:24:54,280 Speaker 1: but perhaps the last generation of software isn't really optimized 401 00:24:54,280 --> 00:24:57,040 Speaker 1: to work on that, so you have to make new software. 402 00:24:57,320 --> 00:24:59,560 Speaker 1: This is a continuing process. In fact, that's one of the 403 00:24:59,600 --> 00:25:04,760 Speaker 1: things that people say is a barrier to artificial intelligence, 404 00:25:04,760 --> 00:25:07,919 Speaker 1: to the point of having a computer that has self-awareness. 405 00:25:08,240 --> 00:25:13,960 Speaker 1: It's not necessarily that we can't reach the physical 406 00:25:14,359 --> 00:25:16,320 Speaker 1: requirements we would need in order to have a computer 407 00:25:16,440 --> 00:25:19,320 Speaker 1: be able to have some form of self-awareness. It's 408 00:25:19,359 --> 00:25:23,480 Speaker 1: the idea that we could throw as much horsepower at 409 00:25:23,480 --> 00:25:25,800 Speaker 1: the problem as we wanted to; without the software, it 410 00:25:25,920 --> 00:25:30,119 Speaker 1: just won't happen.
We'll be rejoining this classic episode in 411 00:25:30,320 --> 00:25:33,040 Speaker 1: just a moment, but first let's take a quick break 412 00:25:33,160 --> 00:25:43,639 Speaker 1: to thank our sponsor. There's one company name we haven't 413 00:25:43,640 --> 00:25:46,840 Speaker 1: really mentioned yet, and it's big. I mean, we talked 414 00:25:46,840 --> 00:25:48,960 Speaker 1: about it a little bit just then, but not in 415 00:25:49,119 --> 00:25:53,560 Speaker 1: terms of supercomputers. It's a big name in computer architecture, 416 00:25:53,560 --> 00:25:55,400 Speaker 1: but it wasn't a really big name in the whole 417 00:25:55,440 --> 00:26:00,320 Speaker 1: supercomputer story. And that's Intel. Now, Intel was not just 418 00:26:00,520 --> 00:26:03,280 Speaker 1: sitting back during this whole time. Now, granted, Intel's main 419 00:26:03,400 --> 00:26:09,920 Speaker 1: focus is on enterprise and consumer processors, which are not 420 00:26:10,600 --> 00:26:14,720 Speaker 1: completely analogous to what you find in supercomputers 421 00:26:14,760 --> 00:26:19,080 Speaker 1: at this time. Right, that would change, but not immediately. 422 00:26:19,440 --> 00:26:23,679 Speaker 1: But Intel did develop something called the Paragon, which was 423 00:26:23,880 --> 00:26:27,960 Speaker 1: supposed to be, you know, another fantastic supercomputer, and it 424 00:26:28,240 --> 00:26:32,520 Speaker 1: could support up to four thousand processors using this 425 00:26:32,600 --> 00:26:36,919 Speaker 1: MIMD architecture. But it did not succeed in 426 00:26:36,960 --> 00:26:39,679 Speaker 1: the market. It just sort of, well, it flopped in 427 00:26:39,680 --> 00:26:42,439 Speaker 1: a different way. The other kind of flop. Yeah, the 428 00:26:42,480 --> 00:26:45,840 Speaker 1: bad kind.
So that didn't really go anywhere, but it 429 00:26:45,880 --> 00:26:49,960 Speaker 1: did again sort of push this trend of parallel processing 430 00:26:49,960 --> 00:26:53,400 Speaker 1: and MIMD. Then came 431 00:26:53,520 --> 00:26:57,080 Speaker 1: a couple of other computers called ASCI Red 432 00:26:57,119 --> 00:27:02,080 Speaker 1: and ASCI White, and Intel was behind ASCI Red. Yeah. Well, 433 00:27:02,119 --> 00:27:06,040 Speaker 1: actually, this goes back to the Comprehensive Test Ban Treaty 434 00:27:06,400 --> 00:27:11,880 Speaker 1: that the United States signed. They needed a certification 435 00:27:11,920 --> 00:27:15,440 Speaker 1: program for the nuclear weapons that they had built up, 436 00:27:16,400 --> 00:27:20,120 Speaker 1: and so what they started was the Accelerated Strategic 437 00:27:20,119 --> 00:27:24,520 Speaker 1: Computing Initiative. ASCI, with only one I, as opposed to ASCII 438 00:27:24,640 --> 00:27:28,320 Speaker 1: characters, yes, with two I's, just to clarify. Yes, I'm 439 00:27:28,359 --> 00:27:30,960 Speaker 1: glad you did, thank you. And ASCI Red, yes, 440 00:27:31,040 --> 00:27:34,399 Speaker 1: was built at Sandia National Laboratories in Albuquerque, New Mexico. 441 00:27:35,000 --> 00:27:38,119 Speaker 1: Intel helped them out with that, and that was 442 00:27:38,160 --> 00:27:41,480 Speaker 1: the first machine to get a teraflop. Yeah, and 443 00:27:41,560 --> 00:27:43,919 Speaker 1: it was the first one to break the teraflop barrier. 444 00:27:43,960 --> 00:27:46,840 Speaker 1: It did that with six thousand two-hundred-megahertz 445 00:27:46,920 --> 00:27:51,200 Speaker 1: Pentium Pro processors; nine thousand, seventy-two of them eventually, well, 446 00:27:51,320 --> 00:27:54,240 Speaker 1: six thousand at first. It then eventually was upgraded.
The 447 00:27:54,320 --> 00:27:57,359 Speaker 1: very first one had six thousand, and the very last 448 00:27:57,400 --> 00:28:01,399 Speaker 1: one had nine thousand Pentium II Xeon processors. And 449 00:28:01,480 --> 00:28:03,960 Speaker 1: it actually hit three point one teraflops at the 450 00:28:04,119 --> 00:28:07,520 Speaker 1: end of its production life. So yeah, like I said, 451 00:28:07,520 --> 00:28:10,000 Speaker 1: you know, when we give these numbers, there are different 452 00:28:10,040 --> 00:28:14,000 Speaker 1: ones, because there's a certain amount that was available when 453 00:28:14,040 --> 00:28:17,200 Speaker 1: the computer first premiered. Then there was, like, the average 454 00:28:17,200 --> 00:28:19,920 Speaker 1: amount during the computer's lifetime, and then the amount that 455 00:28:19,960 --> 00:28:21,600 Speaker 1: was available at the very end of its run time. 456 00:28:21,680 --> 00:28:25,480 Speaker 1: So these numbers do change a little bit depending upon 457 00:28:26,119 --> 00:28:28,800 Speaker 1: which source you're reading and which version of the computer 458 00:28:28,840 --> 00:28:32,280 Speaker 1: they're looking at, because, again, these computers come 459 00:28:32,280 --> 00:28:34,280 Speaker 1: in a range of models, so not all of them 460 00:28:34,320 --> 00:28:40,160 Speaker 1: are exactly the same. Now, we talk about playing 461 00:28:40,160 --> 00:28:42,600 Speaker 1: games like chess; you know, that's one of the 462 00:28:42,600 --> 00:28:51,200 Speaker 1: big consumer visibility issues with supercomputers. You don't 463 00:28:51,240 --> 00:28:54,120 Speaker 1: normally see what supercomputers do.
And that was a way for them, 464 00:28:54,560 --> 00:28:59,240 Speaker 1: IBM in particular, to achieve notice: taking on 465 00:28:59,560 --> 00:29:04,560 Speaker 1: people like Garry Kasparov, chess masters worldwide, with a supercomputer. 466 00:29:04,600 --> 00:29:09,360 Speaker 1: Can a computer, quote unquote, outthink a human? Well, 467 00:29:09,440 --> 00:29:12,160 Speaker 1: the point of ASCI was again one of those behind 468 00:29:12,200 --> 00:29:14,480 Speaker 1: the scenes things. It was a very military thing. It 469 00:29:14,560 --> 00:29:19,680 Speaker 1: was more like WOPR in WarGames, actually. Actually, 470 00:29:19,800 --> 00:29:25,200 Speaker 1: exactly like that. The point was to simulate nuclear tests. 471 00:29:25,520 --> 00:29:29,240 Speaker 1: And that was why they needed a lot of computing power, 472 00:29:30,000 --> 00:29:32,800 Speaker 1: a machine that could run a lot 473 00:29:32,840 --> 00:29:36,600 Speaker 1: of calculations very quickly, because, you know, 474 00:29:36,640 --> 00:29:38,400 Speaker 1: this is not something you want to do live. Hey, 475 00:29:38,480 --> 00:29:42,440 Speaker 1: let's test out fifty nuclear warheads. Yeah. 476 00:29:42,960 --> 00:29:45,480 Speaker 1: They wanted to do this with a computer simulation, 477 00:29:45,600 --> 00:29:48,360 Speaker 1: and so that's why they started the initiative. It 478 00:29:48,440 --> 00:29:52,840 Speaker 1: was not a game but a challenge: hey, 479 00:29:53,160 --> 00:29:55,960 Speaker 1: let's keep coming up with newer, faster machines, because we 480 00:29:55,960 --> 00:30:00,760 Speaker 1: need newer, faster machines to run nuclear simulations. And simulations 481 00:30:00,800 --> 00:30:04,880 Speaker 1: in general were a big part of what these supercomputers 482 00:30:04,920 --> 00:30:09,400 Speaker 1: were put to use for.
I mean, like climatology, for example. 483 00:30:09,520 --> 00:30:13,800 Speaker 1: Weather prediction, that was a big requirement as well; 484 00:30:13,920 --> 00:30:16,480 Speaker 1: supercomputers have been put towards that to try and help 485 00:30:17,320 --> 00:30:21,360 Speaker 1: map and predict climate change and weather patterns, 486 00:30:21,360 --> 00:30:24,120 Speaker 1: not just climate but weather, day-to-day weather, and 487 00:30:24,160 --> 00:30:28,040 Speaker 1: also other simulations as well. Not to mention crunching data 488 00:30:28,240 --> 00:30:34,520 Speaker 1: from facilities that generate lots and lots of information. So, 489 00:30:34,560 --> 00:30:40,280 Speaker 1: things like the SETI Institute, the search for extraterrestrial intelligence. Yes, 490 00:30:40,280 --> 00:30:43,040 Speaker 1: they would use very powerful computers to try and 491 00:30:43,080 --> 00:30:45,400 Speaker 1: crunch all the data they would get from radio telescopes. 492 00:30:46,000 --> 00:30:48,560 Speaker 1: You also had things like the Large Hadron Collider and 493 00:30:48,560 --> 00:30:51,680 Speaker 1: other super colliders that generate lots and lots of data, 494 00:30:51,880 --> 00:30:54,040 Speaker 1: and they need these really fast computers in order to 495 00:30:54,160 --> 00:30:58,120 Speaker 1: process the data and make it meaningful. So, moving on. 496 00:30:58,640 --> 00:31:01,400 Speaker 1: Right around this time, when the ASCI Red 497 00:31:01,440 --> 00:31:05,240 Speaker 1: comes out, that's when there was a shift in supercomputing. 498 00:31:06,160 --> 00:31:12,120 Speaker 1: Before, there were all these customized computers that 499 00:31:12,240 --> 00:31:17,719 Speaker 1: had their own processors or had thousands of processors running together.
500 00:31:18,600 --> 00:31:21,920 Speaker 1: But at this point, it became possible to actually build 501 00:31:21,960 --> 00:31:26,240 Speaker 1: a supercomputer with off-the-shelf parts. You could actually 502 00:31:26,840 --> 00:31:31,080 Speaker 1: get enough computers together and link them together to perform 503 00:31:31,240 --> 00:31:34,840 Speaker 1: as a supercomputer. And this was also when there was 504 00:31:35,040 --> 00:31:39,640 Speaker 1: a shift to using the Linux operating system. So 505 00:31:39,800 --> 00:31:43,040 Speaker 1: Linux kind of replaces Unix as the OS of choice 506 00:31:43,560 --> 00:31:47,360 Speaker 1: for people who are designing supercomputers, which is nice, because 507 00:31:47,360 --> 00:31:51,000 Speaker 1: now you can tell the company... never mind. In 508 00:31:51,040 --> 00:31:54,760 Speaker 1: two thousand two, Japan comes back, topping the ASCI 509 00:31:54,800 --> 00:31:59,440 Speaker 1: White with a thirty-five-teraflop computer. 510 00:31:59,560 --> 00:32:02,360 Speaker 1: It was the NEC Earth Simulator, and it cost a 511 00:32:03,760 --> 00:32:07,600 Speaker 1: hair under a billion dollars, nine hundred million. It's a lot 512 00:32:07,600 --> 00:32:10,239 Speaker 1: of hairs, actually, a hundred million, a lot of hairs. If 513 00:32:10,240 --> 00:32:12,200 Speaker 1: anyone wants to give me a hair in that sense, 514 00:32:12,400 --> 00:32:16,560 Speaker 1: I will take it. And in two thousand four, IBM 515 00:32:16,640 --> 00:32:20,080 Speaker 1: comes out with the Blue Gene/L computer, which 516 00:32:20,240 --> 00:32:24,960 Speaker 1: had sixteen thousand computer nodes, and each node had two CPUs. 517 00:32:25,440 --> 00:32:27,360 Speaker 1: I'm gonna be thinking Bowie the rest of the day now. 518 00:32:27,760 --> 00:32:32,520 Speaker 1: So yeah, thirty-two thousand CPUs.
Ultimately, if my math 519 00:32:32,600 --> 00:32:35,959 Speaker 1: is correct. And that could run at seventy teraflops, 520 00:32:36,000 --> 00:32:38,560 Speaker 1: so twice as fast as the Earth Simulator, and a 521 00:32:38,600 --> 00:32:41,000 Speaker 1: two thousand seven version of this could actually manage up 522 00:32:41,040 --> 00:32:44,360 Speaker 1: to six hundred teraflops. And it had a hundred 523 00:32:44,560 --> 00:32:48,400 Speaker 1: thousand computer nodes, so two hundred thousand processors. With that, we're 524 00:32:48,960 --> 00:32:53,440 Speaker 1: starting to get into some pretty ridiculous computers. You know, 525 00:32:53,520 --> 00:32:55,600 Speaker 1: if you're looking at it as, hey, I own a 526 00:32:55,640 --> 00:32:58,280 Speaker 1: computer that's got a single processor, this one has 527 00:32:58,600 --> 00:33:02,160 Speaker 1: two hundred thousand of them. Yeah. Yeah. It also sort 528 00:33:02,200 --> 00:33:07,160 Speaker 1: of makes Apple's claim in the late nineties 529 00:33:07,200 --> 00:33:12,000 Speaker 1: sort of silly, because, well, the federal government classified 530 00:33:12,000 --> 00:33:15,560 Speaker 1: a supercomputer, I can't remember exactly when it was, 531 00:33:15,560 --> 00:33:17,840 Speaker 1: it was in the late nineties, as 532 00:33:17,880 --> 00:33:21,520 Speaker 1: a machine that would run a gigaflop. And 533 00:33:21,560 --> 00:33:23,840 Speaker 1: this was back when Macs were still running on 534 00:33:23,960 --> 00:33:26,800 Speaker 1: PowerPC processors.
There was a Mac that they 535 00:33:26,840 --> 00:33:30,760 Speaker 1: advertised as being a supercomputer because it could reach a 536 00:33:30,800 --> 00:33:33,920 Speaker 1: gigaflop, and I just thought at the time 537 00:33:33,960 --> 00:33:37,320 Speaker 1: it was kind of weird to think about. But 538 00:33:37,440 --> 00:33:39,160 Speaker 1: now it's just kind of silly when you take it 539 00:33:39,200 --> 00:33:44,120 Speaker 1: into context with these actual supercomputers at the time. Now, yeah, 540 00:33:44,280 --> 00:33:48,000 Speaker 1: a gigaflop is good, but no. Right. So, a 541 00:33:48,040 --> 00:33:51,160 Speaker 1: megaflop is a million floating-point operations per second, a 542 00:33:51,200 --> 00:33:54,560 Speaker 1: gigaflop is a billion floating-point operations per second, a 543 00:33:54,600 --> 00:33:58,800 Speaker 1: teraflop is a trillion floating-point operations per second. Well, 544 00:33:59,000 --> 00:34:03,080 Speaker 1: and then there's a petaflop, which is a quadrillion floating-point 545 00:34:03,080 --> 00:34:07,160 Speaker 1: operations per second. Yeah, quadrillion. And the first 546 00:34:07,360 --> 00:34:12,000 Speaker 1: supercomputer to hit that and break that barrier was another 547 00:34:12,040 --> 00:34:17,560 Speaker 1: IBM machine, the Roadrunner. It had twenty 548 00:34:17,600 --> 00:34:20,239 Speaker 1: thousand CPUs, and it was the first computer to break 549 00:34:20,280 --> 00:34:25,320 Speaker 1: that petaflop barrier. So, one quadrillion floating-point operations per second. 550 00:34:25,520 --> 00:34:29,799 Speaker 1: It's a serious machine. Chris and I will return to 551 00:34:29,880 --> 00:34:33,680 Speaker 1: discussing supercomputers in just a moment, after this quick break. 552 00:34:41,239 --> 00:34:45,360 Speaker 1: In two thousand ten, there was an interesting development, because China entered 553 00:34:45,440 --> 00:34:48,279 Speaker 1: the supercomputer fray.
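[Editor's note: the ladder of units rattled off here is just successive factors of a thousand, which is easy to sanity-check in a couple of lines. A small sketch; the dictionary and helper name are invented for illustration.]

```python
# Each FLOPS prefix is 1,000 times the one before it.
FLOPS = {
    "megaflop": 10**6,   # a million floating-point operations per second
    "gigaflop": 10**9,   # a billion
    "teraflop": 10**12,  # a trillion
    "petaflop": 10**15,  # a quadrillion
}

def times_faster(a, b):
    """How many times faster is a machine running at unit a than one at unit b?"""
    return FLOPS[a] // FLOPS[b]

# Roadrunner's petaflop vs. that late-nineties one-gigaflop "supercomputer" Mac:
print(times_faster("petaflop", "gigaflop"))  # 1000000
```

[So the petaflop barrier Roadrunner broke sits a full million times above the gigaflop threshold the government once used to classify a supercomputer.]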
Now, at this point it was really 554 00:34:48,320 --> 00:34:50,759 Speaker 1: a battle between the United States and Japan, and 555 00:34:50,800 --> 00:34:53,279 Speaker 1: Germany also has quite a few supercomputers as well. But 556 00:34:54,280 --> 00:34:56,480 Speaker 1: the US and Japan were the ones that were stealing 557 00:34:56,520 --> 00:34:59,560 Speaker 1: the record back and forth from each other. And then China 558 00:34:59,640 --> 00:35:02,879 Speaker 1: came out with a computer which I'm sure I'm gonna 559 00:35:03,160 --> 00:35:06,240 Speaker 1: mispronounce, because I don't know how to pronounce Chinese, 560 00:35:06,320 --> 00:35:09,400 Speaker 1: but Tianhe is how it would be spelled in English, 561 00:35:09,600 --> 00:35:12,360 Speaker 1: and someone's probably gonna say it's "sheen-hey" or 562 00:35:12,400 --> 00:35:16,359 Speaker 1: something like that. Please let us know, yeah, because I don't. 563 00:35:16,920 --> 00:35:19,560 Speaker 1: But it was a computer from China that could run 564 00:35:19,600 --> 00:35:22,759 Speaker 1: at two point five petaflops, and it had 565 00:35:22,920 --> 00:35:27,719 Speaker 1: fourteen thousand, three hundred thirty-six Intel Xeon X five 566 00:35:27,840 --> 00:35:33,200 Speaker 1: six seven zero CPUs and seven thousand, one hundred sixty-eight 567 00:35:33,360 --> 00:35:37,560 Speaker 1: NVIDIA Tesla GPUs, and so that was, you know, 568 00:35:38,040 --> 00:35:41,440 Speaker 1: a really impressive machine that stole all the 569 00:35:41,480 --> 00:35:46,960 Speaker 1: titles away in two thousand ten. But also, another important moment for China 570 00:35:47,120 --> 00:35:50,879 Speaker 1: was that China developed the Sunway, 571 00:35:50,960 --> 00:35:54,600 Speaker 1: which was slow by supercomputer standards, because it could only 572 00:35:54,680 --> 00:35:58,439 Speaker 1: run a petaflop, and they had already gotten 573 00:35:58,480 --> 00:36:01,080 Speaker 1: up to two point five peta
flops. A petaflop is still 574 00:36:01,080 --> 00:36:04,080 Speaker 1: incredibly fast, people; I just mean slow 575 00:36:04,080 --> 00:36:07,040 Speaker 1: in relative terms. But the cool thing about the Sunway, 576 00:36:07,080 --> 00:36:09,920 Speaker 1: at least from China's perspective, is that it was the 577 00:36:09,920 --> 00:36:15,719 Speaker 1: first supercomputer China had designed with all-Chinese processors, so 578 00:36:15,719 --> 00:36:18,520 Speaker 1: they weren't depending upon some other company's processors or some 579 00:36:18,600 --> 00:36:22,200 Speaker 1: other country's processors. They wanted to be able to be 580 00:36:22,800 --> 00:36:25,839 Speaker 1: self-reliant when it came to developing computers, and so 581 00:36:25,880 --> 00:36:32,400 Speaker 1: China really pushed its computer engineering industry, and 582 00:36:32,680 --> 00:36:36,960 Speaker 1: Chinese engineers 583 00:36:36,960 --> 00:36:40,880 Speaker 1: were able to design this supercomputer. Then you 584 00:36:40,960 --> 00:36:45,680 Speaker 1: had Fujitsu's K supercomputer, which until recently held the record. 585 00:36:46,480 --> 00:36:49,200 Speaker 1: It was capable of running up to ten petaflops with 586 00:36:49,320 --> 00:36:53,840 Speaker 1: eighty-eight thousand, one hundred twenty-eight SPARC64 processors, and 587 00:36:53,880 --> 00:36:57,120 Speaker 1: each CPU had sixteen gigabytes of local RAM, and it 588 00:36:57,239 --> 00:37:01,840 Speaker 1: had one thousand, three hundred seventy-seven terabytes of memory, 589 00:37:02,680 --> 00:37:04,919 Speaker 1: and eventually it got up to seven hundred five 590 00:37:05,040 --> 00:37:10,520 Speaker 1: thousand processor cores. Yeah, it sits in Japan's RIKEN Advanced 591 00:37:10,560 --> 00:37:16,600 Speaker 1: Institute for Computational Science, the AICS. And that's funny.
592 00:37:16,600 --> 00:37:19,000 Speaker 1: It's ASCI, only spelled different. I mean, the letters 593 00:37:19,000 --> 00:37:22,840 Speaker 1: are in a different order. Anyway, sorry, I just noticed that 594 00:37:22,840 --> 00:37:25,520 Speaker 1: as I was looking down at my notes. That's 595 00:37:25,560 --> 00:37:27,960 Speaker 1: actually sort of why we decided to do this now, 596 00:37:28,000 --> 00:37:30,520 Speaker 1: because it was just the week that we're recording this 597 00:37:30,600 --> 00:37:33,239 Speaker 1: that we found out about the test. Now, they do 598 00:37:33,320 --> 00:37:36,799 Speaker 1: these tests twice a year, every six months. They have 599 00:37:36,920 --> 00:37:42,120 Speaker 1: the top five hundred supercomputer sites, so computers from 600 00:37:42,160 --> 00:37:45,160 Speaker 1: all over the world. They put them on wheels 601 00:37:45,200 --> 00:37:47,640 Speaker 1: at the top of this big hill and push them 602 00:37:47,640 --> 00:37:51,880 Speaker 1: down the hill, race them, like a big computer soapbox derby, 603 00:37:52,040 --> 00:37:55,560 Speaker 1: you know. They give them problems to solve 604 00:37:55,840 --> 00:37:59,320 Speaker 1: and see who's the fastest, the top five hundred 605 00:37:59,320 --> 00:38:02,600 Speaker 1: supercomputers in the world. Which, in a way, is kind of silly, 606 00:38:02,800 --> 00:38:05,360 Speaker 1: but at the same time very, very cool. And you 607 00:38:05,400 --> 00:38:07,560 Speaker 1: can actually see the results of this, if you want to, 608 00:38:07,600 --> 00:38:11,560 Speaker 1: if you go to top five hundred dot org.
The 609 00:38:11,600 --> 00:38:15,760 Speaker 1: organizations that put it on publish 610 00:38:15,840 --> 00:38:17,560 Speaker 1: this every year, and that happens to be the University 611 00:38:17,560 --> 00:38:21,200 Speaker 1: of Mannheim, Lawrence Berkeley National Laboratory, and the University of 612 00:38:21,280 --> 00:38:25,840 Speaker 1: Tennessee; they actually do this, and they are trying to figure 613 00:38:25,840 --> 00:38:29,120 Speaker 1: out the fastest. And the fastest was just announced. 614 00:38:29,120 --> 00:38:31,440 Speaker 1: The new fastest was just announced this week, and we 615 00:38:31,440 --> 00:38:33,520 Speaker 1: thought this would be a great time to talk about it. 616 00:38:33,520 --> 00:38:37,560 Speaker 1: It's a machine actually named for a tree. Yes, it 617 00:38:37,680 --> 00:38:41,920 Speaker 1: is the IBM Sequoia. And when we say we're 618 00:38:42,320 --> 00:38:47,719 Speaker 1: recording this week, the date is June. And so the 619 00:38:47,760 --> 00:38:51,120 Speaker 1: Sequoia has taken the title of fastest supercomputer, and 620 00:38:51,160 --> 00:38:54,200 Speaker 1: that's from IBM. So it means the USA has 621 00:38:54,239 --> 00:38:58,759 Speaker 1: the title once more, at least until the next Supercomputer Olympics. 622 00:38:59,520 --> 00:39:03,520 Speaker 1: And yeah, there's a giant gold medal that is 623 00:39:03,640 --> 00:39:06,200 Speaker 1: stamped on the outside of it. So you're probably 624 00:39:06,239 --> 00:39:09,920 Speaker 1: all asking, hey, what are some stats on this, 625 00:39:10,239 --> 00:39:13,400 Speaker 1: the Sequoia computer? How fast can it go? And 626 00:39:13,440 --> 00:39:15,759 Speaker 1: what's making it tick? Well, I do want to 627 00:39:15,760 --> 00:39:19,840 Speaker 1: point out that it is owned by the Department of Energy, 628 00:39:19,880 --> 00:39:25,960 Speaker 1: so this isn't really a military machine.
It is at 629 00:39:25,960 --> 00:39:32,160 Speaker 1: the Lawrence Livermore National Laboratory. And yes, the specs 630 00:39:32,239 --> 00:39:35,000 Speaker 1: on this are pretty impressive. I mean, it uses seven 631 00:39:35,000 --> 00:39:41,520 Speaker 1: thousand kilowatts. Yeah, it's actually fairly efficient for a supercomputer. Yeah. 632 00:39:41,520 --> 00:39:46,640 Speaker 1: It has one million, five hundred seventy-two thousand, eight 633 00:39:46,760 --> 00:39:51,680 Speaker 1: hundred sixty-four processors and one point six petabytes 634 00:39:51,800 --> 00:39:56,160 Speaker 1: of memory. It takes up three thousand, four hundred twenty- 635 00:39:56,239 --> 00:39:59,279 Speaker 1: two square feet of space. So we've finally gotten back 636 00:39:59,320 --> 00:40:02,520 Speaker 1: to those enormous computers. Remember, the Stretch was 637 00:40:02,600 --> 00:40:06,040 Speaker 1: two thousand square feet. Now this one's three thousand, four hundred-plus 638 00:40:06,040 --> 00:40:11,560 Speaker 1: square feet, and it can run at sixteen point three 639 00:40:11,640 --> 00:40:15,600 Speaker 1: two petaflops, so six point three two petaflops faster. 640 00:40:15,920 --> 00:40:19,080 Speaker 1: Well, not even quite that much, because the K 641 00:40:19,760 --> 00:40:22,840 Speaker 1: eventually got up to ten point five, but it is 642 00:40:23,120 --> 00:40:28,200 Speaker 1: significantly faster than the K. So IBM now holds 643 00:40:28,200 --> 00:40:31,759 Speaker 1: the distinction of having the fastest, or having designed the 644 00:40:31,800 --> 00:40:35,640 Speaker 1: fastest, supercomputer in the world. Now, I thought it'd be 645 00:40:35,680 --> 00:40:40,240 Speaker 1: kind of fun to compare that to IBM's Watson computer, 646 00:40:40,360 --> 00:40:45,160 Speaker 1: because that made headlines last year, when Watson was designed 647 00:40:45,600 --> 00:40:49,399 Speaker 1: in part to compete against humans in a very human game.
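[Editor's note: using only the round figures quoted in this episode, roughly sixteen point three two petaflops against roughly seven thousand kilowatts, you can do a quick back-of-envelope check on the "fairly efficient" claim. The numbers are as spoken on the show, so treat the result as approximate rather than an official spec.]

```python
# Sequoia, roughly, per the figures quoted in this episode.
flops = 16.32e15        # 16.32 petaflops
power_watts = 7_000e3   # ~7,000 kilowatts

# Performance per watt, expressed in gigaflops.
gflops_per_watt = flops / power_watts / 1e9
print(round(gflops_per_watt, 2))  # 2.33
```

[A couple of gigaflops per watt was strong for the era; power draw, not raw speed, is one of the main constraints on how big these machines can get.]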
648 00:40:49,440 --> 00:40:53,160 Speaker 1: Because we've already talked about computers playing chess against humans, 649 00:40:53,160 --> 00:40:55,759 Speaker 1: we've also talked about computers playing other games against humans. 650 00:40:55,760 --> 00:40:58,320 Speaker 1: In fact, we did a full episode about this particular computer. 651 00:40:59,520 --> 00:41:01,640 Speaker 1: So, IBM's Watson was designed to play in a 652 00:41:01,719 --> 00:41:04,879 Speaker 1: game show. Let's Make a Deal? So they called out 653 00:41:04,880 --> 00:41:11,040 Speaker 1: Watson, and if you didn't know what was behind... well, 654 00:41:11,040 --> 00:41:14,160 Speaker 1: did it have a dress on? No, it wasn't. 655 00:41:14,200 --> 00:41:16,920 Speaker 1: It wasn't Let's Make a Deal. It was Jeopardy. 656 00:41:17,000 --> 00:41:19,480 Speaker 1: And in Jeopardy, of course, you are given an answer. 657 00:41:19,520 --> 00:41:21,560 Speaker 1: You have to come up with the appropriate question. And 658 00:41:22,040 --> 00:41:24,359 Speaker 1: it's really tricky for a computer to do this, 659 00:41:24,400 --> 00:41:28,480 Speaker 1: because it's not just a matching game where you're matching 660 00:41:28,680 --> 00:41:30,920 Speaker 1: an answer to a question. You also have to take 661 00:41:30,960 --> 00:41:35,480 Speaker 1: in context. Sometimes there's wordplay, sometimes there's a riddle. 662 00:41:35,520 --> 00:41:38,839 Speaker 1: It's a lot more complicated than just question-answer. Yeah. 663 00:41:38,880 --> 00:41:43,440 Speaker 1: They specifically wanted it to play a human game. 664 00:41:43,640 --> 00:41:47,759 Speaker 1: They didn't alter the clues. They're actually called clues in this show, 665 00:41:47,800 --> 00:41:50,000 Speaker 1: if you've never seen it: they give you the answer, 666 00:41:50,000 --> 00:41:52,960 Speaker 1: and you are supposed to supply the question.
667 00:41:53,000 --> 00:41:56,960 Speaker 1: They use wordplay and things in these clues. And 668 00:41:57,000 --> 00:42:00,400 Speaker 1: the IBM engineers specifically needed it to 669 00:42:00,440 --> 00:42:03,280 Speaker 1: play a human game to test its natural language 670 00:42:03,320 --> 00:42:07,399 Speaker 1: processing ability. Can it figure out from context what 671 00:42:07,440 --> 00:42:10,279 Speaker 1: it is you're talking about? And it did very well. Yeah. 672 00:42:10,440 --> 00:42:13,279 Speaker 1: So what was powering the Watson, if you want to 673 00:42:13,320 --> 00:42:16,040 Speaker 1: compare it to, say, the Sequoia? Well, it 674 00:42:16,360 --> 00:42:20,319 Speaker 1: was using ninety IBM Power seven fifty servers in 675 00:42:20,520 --> 00:42:25,800 Speaker 1: ten server racks, and it had sixteen terabytes of memory 676 00:42:26,280 --> 00:42:32,360 Speaker 1: and two thousand eight hundred eighty processors, 677 00:42:32,400 --> 00:42:35,080 Speaker 1: or processor cores I should say, not just processors. And so two 678 00:42:35,080 --> 00:42:37,520 Speaker 1: thousand eight hundred eighty sounds like a lot. But then you compare 679 00:42:37,560 --> 00:42:42,200 Speaker 1: that to the one million, five hundred seventy-two thousand, eight hundred sixty-four processors that 680 00:42:42,239 --> 00:42:45,279 Speaker 1: the Sequoia has, and you realize that Watson, as far 681 00:42:45,320 --> 00:42:51,120 Speaker 1: as supercomputers go, doesn't merit a mention. Which, again, 682 00:42:51,880 --> 00:42:54,399 Speaker 1: Watson was designed for a very specific purpose, this whole 683 00:42:54,480 --> 00:42:57,040 Speaker 1: natural language thing: being able to recognize that and being able 684 00:42:57,080 --> 00:43:02,120 Speaker 1: to come up with information. That's a very specialized computer.
685 00:43:02,440 --> 00:43:06,480 Speaker 1: So it doesn't necessarily have to have this incredible, by 686 00:43:06,480 --> 00:43:11,600 Speaker 1: comparison, processing speed and number-crunching ability, which might be 687 00:43:11,680 --> 00:43:17,200 Speaker 1: used for other very intensive tasks, things like very, 688 00:43:17,320 --> 00:43:20,919 Speaker 1: very realistic simulations, that kind of thing, and predictions. So 689 00:43:21,520 --> 00:43:23,560 Speaker 1: I just wanted to compare that so that people could 690 00:43:23,640 --> 00:43:25,640 Speaker 1: understand, because Watson's one of those names that we've heard 691 00:43:25,640 --> 00:43:27,960 Speaker 1: a lot about, and we think of that as like 692 00:43:27,960 --> 00:43:31,480 Speaker 1: a supercomputer. But really, if we define supercomputer as a 693 00:43:31,520 --> 00:43:33,960 Speaker 1: computer that is on that bleeding edge of what 694 00:43:34,000 --> 00:43:36,600 Speaker 1: a computer is capable of doing, it 695 00:43:36,680 --> 00:43:40,400 Speaker 1: doesn't measure up. But when you talk about comparing the 696 00:43:40,440 --> 00:43:43,840 Speaker 1: top five hundred, or putting a computer in a chess 697 00:43:43,880 --> 00:43:47,759 Speaker 1: match or in a game of Jeopardy, you know, 698 00:43:47,840 --> 00:43:49,360 Speaker 1: I made the joke that it was a 699 00:43:49,360 --> 00:43:51,600 Speaker 1: little silly. And yeah, you could say that 700 00:43:51,680 --> 00:43:53,960 Speaker 1: instead of using a computer for 701 00:43:54,000 --> 00:43:56,880 Speaker 1: scientific purposes, you're taking 702 00:43:56,880 --> 00:44:00,640 Speaker 1: time off to do something else. But really, it's nice 703 00:44:00,680 --> 00:44:03,400 Speaker 1: that, for one thing, people understand what 704 00:44:03,440 --> 00:44:07,960 Speaker 1: a supercomputer is and can do.
And also it's 705 00:44:08,040 --> 00:44:10,920 Speaker 1: a way to test out these machines and make 706 00:44:10,960 --> 00:44:14,439 Speaker 1: them better. You know, even like I was talking 707 00:44:14,440 --> 00:44:19,040 Speaker 1: about the power used by the Sequoia machine, 708 00:44:19,400 --> 00:44:24,600 Speaker 1: it's considerably more efficient than the K computer. The 709 00:44:24,719 --> 00:44:33,399 Speaker 1: seven thousand kilowatts beats K's twelve thousand kilowatts. So 710 00:44:33,560 --> 00:44:36,840 Speaker 1: every time that they come out with a new supercomputer, 711 00:44:37,520 --> 00:44:43,040 Speaker 1: it's more efficient. They find better ways to route instructions, 712 00:44:43,080 --> 00:44:45,319 Speaker 1: you know, and they can make things smaller 713 00:44:45,440 --> 00:44:48,479 Speaker 1: than before. So you really do see the implications in 714 00:44:48,480 --> 00:44:52,280 Speaker 1: our everyday computers, because now we have multi 715 00:44:52,280 --> 00:44:57,319 Speaker 1: core processors in these everyday devices that we use. 716 00:44:57,840 --> 00:45:01,040 Speaker 1: You don't necessarily need that to write a letter or 717 00:45:01,280 --> 00:45:05,239 Speaker 1: surf the internet, but it does make things faster and 718 00:45:05,280 --> 00:45:08,640 Speaker 1: more efficient. Computers are more reliable. You see 719 00:45:09,280 --> 00:45:13,680 Speaker 1: advances in the operating systems that we use every day because of 720 00:45:14,440 --> 00:45:18,400 Speaker 1: the things that they found out in the process 721 00:45:18,400 --> 00:45:21,399 Speaker 1: of making these supercomputers. They find better ways to route 722 00:45:21,440 --> 00:45:26,080 Speaker 1: instructions in a simpler computer.
And so it's really 723 00:45:26,080 --> 00:45:29,719 Speaker 1: worth it to do these tests and find 724 00:45:29,760 --> 00:45:32,080 Speaker 1: out just what a computer can do. So, you know, 725 00:45:32,160 --> 00:45:34,680 Speaker 1: having a challenge just for the fun of it, you know, 726 00:45:34,719 --> 00:45:37,839 Speaker 1: I don't see that necessarily as a bad thing, 727 00:45:37,880 --> 00:45:40,320 Speaker 1: especially when we can make advances 728 00:45:40,320 --> 00:45:42,880 Speaker 1: and build on those for the next generation of machines. 729 00:45:43,719 --> 00:45:46,040 Speaker 1: And just to kind of sum this up, I thought 730 00:45:46,080 --> 00:45:48,400 Speaker 1: I would share just kind of a fun fact. If you 731 00:45:48,400 --> 00:45:51,480 Speaker 1: look at the top ten fastest supercomputers in the world, 732 00:45:52,400 --> 00:45:55,880 Speaker 1: three of them are in the United States, two of 733 00:45:55,920 --> 00:45:59,080 Speaker 1: them are in Germany, two of them are in China, 734 00:45:59,640 --> 00:46:02,920 Speaker 1: and the other three are in Japan, Italy, and France. 735 00:46:03,600 --> 00:46:06,560 Speaker 1: And that's it for this classic episode. I hope you 736 00:46:06,600 --> 00:46:10,760 Speaker 1: guys enjoyed it. Again, it was recorded in two thousand twelve. 737 00:46:11,480 --> 00:46:15,440 Speaker 1: We've had bigger and better supercomputers come out since then, 738 00:46:15,520 --> 00:46:18,640 Speaker 1: and we've also seen the rise of graphics processing units 739 00:46:19,080 --> 00:46:25,040 Speaker 1: that have largely supplanted supercomputers in many, but not all, applications. 740 00:46:25,320 --> 00:46:28,160 Speaker 1: I've done other episodes about that. You can search our 741 00:46:28,320 --> 00:46:30,520 Speaker 1: archive if you want to see those.
The way you 742 00:46:30,560 --> 00:46:32,840 Speaker 1: do that is you pop on over to our website, 743 00:46:32,920 --> 00:46:36,279 Speaker 1: tech stuff podcast dot com. We have the archive there. 744 00:46:36,360 --> 00:46:40,520 Speaker 1: It is searchable, so you can look for specific episodes 745 00:46:40,600 --> 00:46:44,120 Speaker 1: that cover, you know, specific topics. Maybe you don't see 746 00:46:44,120 --> 00:46:46,640 Speaker 1: the topic you want. Maybe you do a search and 747 00:46:46,719 --> 00:46:49,520 Speaker 1: nothing comes up. Well, then you can write me an 748 00:46:49,560 --> 00:46:53,319 Speaker 1: email and suggest that topic to me. The address is tech 749 00:46:53,400 --> 00:46:56,160 Speaker 1: stuff at how stuff works dot com, or you can 750 00:46:56,160 --> 00:46:58,680 Speaker 1: pop on over to Facebook or Twitter. We have the 751 00:46:58,719 --> 00:47:01,279 Speaker 1: handle tech stuff h s w at both of those. 752 00:47:01,480 --> 00:47:03,759 Speaker 1: You can send us a suggestion that way as well, 753 00:47:04,280 --> 00:47:08,399 Speaker 1: and I hope to talk to you again really soon. 754 00:47:12,120 --> 00:47:14,279 Speaker 1: Tech Stuff is a production of I Heart Radio's How 755 00:47:14,360 --> 00:47:17,760 Speaker 1: Stuff Works. For more podcasts from I Heart Radio, visit 756 00:47:17,800 --> 00:47:20,880 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 757 00:47:20,920 --> 00:47:22,280 Speaker 1: listen to your favorite shows.