1 00:00:01,480 --> 00:00:07,720 Speaker 1: Welcome to Stuff you should know, a production of iHeartRadio. 2 00:00:11,160 --> 00:00:13,920 Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's 3 00:00:14,120 --> 00:00:16,759 Speaker 2: Chuck and Jerry's here too, and we've got our pocket 4 00:00:16,800 --> 00:00:20,919 Speaker 2: protectors and tape on the bridge of our glasses day 5 00:00:21,360 --> 00:00:22,639 Speaker 2: and this is stuff you should know. 6 00:00:23,480 --> 00:00:26,680 Speaker 1: Nice work. Did you just hear something? 7 00:00:27,240 --> 00:00:30,080 Speaker 2: No, not weird. I heard you say nice work. Yeah, 8 00:00:30,120 --> 00:00:31,400 Speaker 2: but then stop abruptly. 9 00:00:31,560 --> 00:00:34,159 Speaker 1: Well I stopped abruptly because I thought I heard a 10 00:00:34,159 --> 00:00:35,960 Speaker 1: little digital glitch. 11 00:00:36,840 --> 00:00:38,440 Speaker 2: Oh no, I didn't hear anything. 12 00:00:39,159 --> 00:00:42,839 Speaker 1: I might be losing my mind, then, I you know, I'm. 13 00:00:42,720 --> 00:00:44,879 Speaker 2: Curious whether we end up editing this out or not. 14 00:00:45,040 --> 00:00:47,760 Speaker 2: Any other podcast on the planet would edit that out 15 00:00:47,760 --> 00:00:50,200 Speaker 2: without even thinking about it. But there's like a fifty 16 00:00:50,240 --> 00:00:52,000 Speaker 2: percent chance that'll stay in with us. 17 00:00:52,440 --> 00:00:54,120 Speaker 1: I mean, this is why we didn't get that Golden 18 00:00:54,160 --> 00:00:57,360 Speaker 1: Globe nomination. That's right, it's kind of classic stuff you 19 00:00:57,360 --> 00:00:58,800 Speaker 1: should know unprofessionality. 20 00:00:58,880 --> 00:01:02,840 Speaker 2: It's exactly right, the Italian accents, mispronunciations, there's a 21 00:01:02,840 --> 00:01:07,200 Speaker 2: whole laundry list. Yeah, that's okay, Chuck, I think we're 22 00:01:07,280 --> 00:01:08,959 Speaker 2: golden regardless.
23 00:01:09,520 --> 00:01:10,119 Speaker 1: Agreed. 24 00:01:11,080 --> 00:01:14,640 Speaker 2: So we're talking about data centers, which I had a 25 00:01:14,760 --> 00:01:18,200 Speaker 2: very very rough idea about. But actually no, I knew 26 00:01:18,240 --> 00:01:21,040 Speaker 2: that they existed essentially and that they were becoming a 27 00:01:21,040 --> 00:01:25,000 Speaker 2: problem with the rise of AI. Yes, that was about it. 28 00:01:25,080 --> 00:01:29,280 Speaker 2: How about you? Are you, are you a data center aficionado? 29 00:01:32,240 --> 00:01:35,440 Speaker 1: No? You know me, I'm not super technology minded, so 30 00:01:35,520 --> 00:01:37,920 Speaker 1: I don't know a lot about this stuff. I remember 31 00:01:37,959 --> 00:01:41,240 Speaker 1: walking by our server room back in the day when 32 00:01:41,240 --> 00:01:44,480 Speaker 1: we were at Ponce City Market and seeing our colleague 33 00:01:44,480 --> 00:01:47,160 Speaker 1: Izzy in there hard at work, yeah. And when that 34 00:01:47,200 --> 00:01:50,440 Speaker 1: door was unlocked and open, hearing the whirr of the 35 00:01:50,560 --> 00:01:53,640 Speaker 1: of the servers and the cooling machines, and you know, 36 00:01:53,760 --> 00:01:56,680 Speaker 1: that's on a smaller scale, that's a data center. 37 00:01:57,120 --> 00:01:59,680 Speaker 2: Absolutely, one hundred percent, that's a data center. It was 38 00:01:59,720 --> 00:02:01,800 Speaker 2: also so a great place to curl up and take 39 00:02:01,840 --> 00:02:02,320 Speaker 2: a nap in. 40 00:02:02,320 --> 00:02:05,480 Speaker 1: The middle of the day, like the warmth of the server. 41 00:02:05,680 --> 00:02:10,359 Speaker 2: Yeah, and the whirr, right, the whirr, yeah. So 42 00:02:10,800 --> 00:02:13,200 Speaker 2: uh yeah, that definitely counts as a data server.
If 43 00:02:13,240 --> 00:02:16,639 Speaker 2: you have like one of those little home networking setups 44 00:02:16,639 --> 00:02:19,320 Speaker 2: in like a closet in your house, data center. Sure, 45 00:02:19,800 --> 00:02:24,079 Speaker 2: technically the PC is a data center. Anywhere you can 46 00:02:24,160 --> 00:02:28,240 Speaker 2: store and access data, that's technically a data center. And 47 00:02:28,280 --> 00:02:30,080 Speaker 2: you're like, well, that's stupid. Why did you even say 48 00:02:30,080 --> 00:02:33,280 Speaker 2: that? Josh's quibbling, that's quotidian. Shut up and get on 49 00:02:33,360 --> 00:02:36,480 Speaker 2: with data centers. Whoa, whoa, whoa. First of all, don't 50 00:02:36,560 --> 00:02:41,240 Speaker 2: use the S word, and then secondly, word? Yes. Wait, 51 00:02:41,320 --> 00:02:43,400 Speaker 2: what did I say? That's the keyword, quotidian? 52 00:02:43,880 --> 00:02:47,200 Speaker 1: Oh no, that's kW right. 53 00:02:49,240 --> 00:02:51,720 Speaker 2: So the reason that I bring that up, though, Chuck, 54 00:02:51,800 --> 00:02:54,920 Speaker 2: is because that technically is part of the progression of 55 00:02:55,040 --> 00:02:57,720 Speaker 2: data centers. Yeah, it probably goes without saying, but it's 56 00:02:57,760 --> 00:03:01,200 Speaker 2: evolved along with computing, and as computers have kind of gotten 57 00:03:01,240 --> 00:03:05,120 Speaker 2: bigger and bigger, the need to store and access more 58 00:03:05,200 --> 00:03:07,800 Speaker 2: data has gotten bigger. So much so, Chuck, that just 59 00:03:07,840 --> 00:03:10,480 Speaker 2: wrap your head around this one. In twenty twenty four, 60 00:03:11,840 --> 00:03:14,600 Speaker 2: just over a year ago. Yeah, we used one hundred 61 00:03:14,639 --> 00:03:19,160 Speaker 2: and fifty zettabytes of data. That's what we consumed.
And 62 00:03:19,200 --> 00:03:22,680 Speaker 2: consuming is anything from making a video and uploading it 63 00:03:22,720 --> 00:03:27,000 Speaker 2: to TikTok or putting a post up on Instagram. It's 64 00:03:27,919 --> 00:03:31,519 Speaker 2: browsing a website, it's buying a song from iTunes, it's 65 00:03:31,720 --> 00:03:35,800 Speaker 2: doing web analytics, it's buying something with your American Express card. 66 00:03:36,160 --> 00:03:39,440 Speaker 2: All of that is data consumption. And we consumed one 67 00:03:39,520 --> 00:03:43,600 Speaker 2: hundred and fifty zettabytes of data in twenty twenty four. 68 00:03:44,320 --> 00:03:46,480 Speaker 1: Yeah, I don't even know how many Big Macs that is. 69 00:03:47,160 --> 00:03:49,880 Speaker 1: I know you name dropped a lot of brands. It 70 00:03:49,920 --> 00:03:52,400 Speaker 1: should be like the movies where every time you even 71 00:03:52,480 --> 00:03:55,160 Speaker 1: just say, like, buy something on your Amex, the bank 72 00:03:55,160 --> 00:03:57,000 Speaker 1: account grows by like ten dollars. 73 00:03:57,240 --> 00:04:02,480 Speaker 2: I agree wholeheartedly. I agree that Amex should do that. Amex, 74 00:04:03,560 --> 00:04:05,200 Speaker 2: there's twenty bucks. I'll split it with you. 75 00:04:05,280 --> 00:04:08,360 Speaker 1: Wow, you just bought lunch in nineteen ninety seven. 76 00:04:08,200 --> 00:04:11,760 Speaker 2: That's right. So just real quick, a zettabyte, Chuck, 77 00:04:11,840 --> 00:04:15,440 Speaker 2: is a trillion gigabytes. So we consumed one hundred and 78 00:04:15,480 --> 00:04:19,239 Speaker 2: fifty trillion gigabytes. That was twenty twenty four. 79 00:04:19,440 --> 00:04:21,480 Speaker 1: That's worldwide, right, yes, yeah. 80 00:04:21,240 --> 00:04:26,160 Speaker 2: That's worldwide. In twenty ten, we consumed two zettabytes. Jeez. Yeah.
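[The unit math in this stretch checks out; a quick sketch, using only the figures stated in the episode plus the standard SI byte prefixes:]

```python
# Standard SI prefixes: 1 zettabyte (ZB) = 10**21 bytes, 1 gigabyte (GB) = 10**9 bytes,
# so one zettabyte is 10**12 gigabytes, i.e. the "trillion gigabytes" from the episode.
GB_PER_ZB = 10**21 // 10**9

consumed_2024_zb = 150  # zettabytes consumed worldwide in 2024, per the episode
consumed_2010_zb = 2    # zettabytes consumed worldwide in 2010, per the episode

print(GB_PER_ZB)                            # 1000000000000, a trillion
print(consumed_2024_zb * GB_PER_ZB)         # 150 trillion gigabytes
print(consumed_2024_zb / consumed_2010_zb)  # 75.0, a 75x jump in 14 years
```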
81 00:04:26,160 --> 00:04:29,280 Speaker 2: So it's growing exponentially, which means that data centers are 82 00:04:29,279 --> 00:04:34,000 Speaker 2: growing exponentially. And now they're about to just blow up, 83 00:04:34,600 --> 00:04:39,840 Speaker 2: like truffle up essentially from, you know, this kind of 84 00:04:40,320 --> 00:04:42,640 Speaker 2: calm like plateau that they'd reached. It's about to just 85 00:04:42,680 --> 00:04:44,400 Speaker 2: go into hyperdrive. 86 00:04:44,640 --> 00:04:46,840 Speaker 1: Yeah, and it actively is, and we're going, we're gonna get 87 00:04:46,880 --> 00:04:49,719 Speaker 1: to some startling statistics later on in the episode. But 88 00:04:50,839 --> 00:04:53,680 Speaker 1: Kyle helped us out with this, our writer over in 89 00:04:53,720 --> 00:04:54,160 Speaker 1: the UK. 90 00:04:54,640 --> 00:04:55,960 Speaker 2: He did a fantastic job. 91 00:04:56,080 --> 00:04:57,680 Speaker 1: He did a really good job. And there's going to 92 00:04:57,760 --> 00:05:00,320 Speaker 1: be some UK specific things in here because Kyle's always keen, 93 00:05:00,640 --> 00:05:02,240 Speaker 1: as they say, to throw that stuff in there. 94 00:05:02,720 --> 00:05:05,279 Speaker 2: Yeah, for sure, Kyle likes to pepper those in. Yeah, of 95 00:05:05,320 --> 00:05:09,240 Speaker 1: course, and he's not barred from doing so. We allow it. 96 00:05:12,080 --> 00:05:17,400 Speaker 1: So since Kyle, you know, is a frequenter of the Wayback Machine, 97 00:05:17,440 --> 00:05:21,159 Speaker 1: as are all the wonderful writers that we use, they 98 00:05:21,200 --> 00:05:24,480 Speaker 1: all had the keys to the car essentially, he jumped 99 00:05:24,520 --> 00:05:26,080 Speaker 1: in the Wayback Machine to sort of give us a 100 00:05:26,080 --> 00:05:28,080 Speaker 1: little bit of a timeline on data centers and a 101 00:05:28,080 --> 00:05:29,800 Speaker 1: bit on you know, mainframes and PCs.
102 00:05:30,120 --> 00:05:32,200 Speaker 2: He also left all of his used tea bags in 103 00:05:32,240 --> 00:05:33,640 Speaker 2: there too. I don't know if you noticed that. 104 00:05:33,720 --> 00:05:35,480 Speaker 1: Oh it's fine, you know, you can throw those back 105 00:05:35,480 --> 00:05:37,360 Speaker 1: in some hot water and they do just a little 106 00:05:37,360 --> 00:05:38,039 Speaker 1: bit weaker tea. 107 00:05:38,440 --> 00:05:40,240 Speaker 2: Well, if you put like five of them together, it's 108 00:05:40,240 --> 00:05:40,760 Speaker 2: like one. 109 00:05:41,080 --> 00:05:43,239 Speaker 1: Yeah, and Kyle, I mean, that thing was full. 110 00:05:43,720 --> 00:05:46,080 Speaker 2: Maybe. Really, he drinks a lot of tea, he 111 00:05:46,120 --> 00:05:46,599 Speaker 2: really does. 112 00:05:47,240 --> 00:05:49,040 Speaker 1: So if you want to talk about the earliest data 113 00:05:49,040 --> 00:05:51,760 Speaker 1: centers that you could kind of call maybe a data center, 114 00:05:52,800 --> 00:05:56,920 Speaker 1: they were, you know, computers, they were electronic computers. Most 115 00:05:56,960 --> 00:05:58,520 Speaker 1: of this stuff that we're going to talk about early 116 00:05:58,560 --> 00:06:02,279 Speaker 1: on was military. And as you'll see, even the 117 00:06:02,320 --> 00:06:05,440 Speaker 1: first, when we talk about the UK one, that 118 00:06:05,640 --> 00:06:08,440 Speaker 1: wasn't military, they even loaned it to the military, 119 00:06:08,480 --> 00:06:11,960 Speaker 1: which was kind of interesting. But these things were built 120 00:06:12,080 --> 00:06:14,440 Speaker 1: with, you know, state of the art technology at the time, 121 00:06:14,480 --> 00:06:18,120 Speaker 1: which meant vacuum tubes and, you know, manual switches and 122 00:06:18,160 --> 00:06:21,200 Speaker 1: plugs and things like that.
And the first thing that 123 00:06:21,200 --> 00:06:24,640 Speaker 1: we can really talk about as the first programmable electronic 124 00:06:24,920 --> 00:06:28,760 Speaker 1: digital computer was the Colossus. And as we'll see, Elon 125 00:06:28,839 --> 00:06:30,960 Speaker 1: Musk has now stolen that name for his own purposes, 126 00:06:31,120 --> 00:06:35,400 Speaker 1: probably because of this, I would imagine. But it 127 00:06:35,440 --> 00:06:37,640 Speaker 1: was at Bletchley Park, of course, during World War Two, 128 00:06:38,240 --> 00:06:41,360 Speaker 1: and they were trying to, you know, crack into Hitler's 129 00:06:42,040 --> 00:06:45,479 Speaker 1: messages at the time, and these things were huge. And 130 00:06:45,600 --> 00:06:47,720 Speaker 1: kind of to me, the thing that stood out about Colossus, 131 00:06:47,760 --> 00:06:51,640 Speaker 1: which is a neat little factoid, is that where Colossus 132 00:06:51,760 --> 00:06:54,719 Speaker 1: was at Bletchley Park, at Block H, it is now 133 00:06:54,800 --> 00:06:56,320 Speaker 1: the National Museum of Computing. 134 00:06:56,560 --> 00:06:58,479 Speaker 2: I want to go to that so bad. When we 135 00:06:58,560 --> 00:07:03,240 Speaker 2: do that UK tour next year. Oh, we got 136 00:07:03,240 --> 00:07:04,000 Speaker 2: to go to that together. 137 00:07:04,080 --> 00:07:05,920 Speaker 1: Okay, okay, are we doing that next year? 138 00:07:06,440 --> 00:07:08,680 Speaker 2: That's what we were talking about. All right, we kind 139 00:07:08,680 --> 00:07:11,360 Speaker 2: of already have promised it. We have to. Now we're 140 00:07:11,360 --> 00:07:12,200 Speaker 2: locked in the punch. 141 00:07:12,360 --> 00:07:13,360 Speaker 1: Why's my voice so high? 142 00:07:13,360 --> 00:07:17,920 Speaker 2: Then? I don't know. You're practicing for the alph Yeah, 143 00:07:17,960 --> 00:07:18,560 Speaker 2: that's right now. 144 00:07:18,600 --> 00:07:19,960 Speaker 1: No, that that'd be a lot of fun.
I'd love 145 00:07:20,000 --> 00:07:20,360 Speaker 1: to go to that. 146 00:07:21,040 --> 00:07:23,880 Speaker 2: So that was Colossus. Another one about the same time 147 00:07:24,000 --> 00:07:28,720 Speaker 2: was the ENIAC, Electronic Numerical Integrator and Computer. So that's 148 00:07:28,760 --> 00:07:33,760 Speaker 2: a quality acronym. Yeah, and it was the first general 149 00:07:33,840 --> 00:07:38,520 Speaker 2: purpose electronic computer. And here's the thing. This is technically 150 00:07:38,560 --> 00:07:42,800 Speaker 2: not data storage yet, it's data processing, right. But these 151 00:07:42,840 --> 00:07:46,520 Speaker 2: things, Colossus, ENIAC, you walked up to them and you said, 152 00:07:47,520 --> 00:07:50,120 Speaker 2: what's the trajectory of this missile if I fire it 153 00:07:50,200 --> 00:07:52,920 Speaker 2: from here? And ENIAC would go peep pop, poop, poop, 154 00:07:52,960 --> 00:07:56,600 Speaker 2: and then say, like, uh, whatever a trajectory is described 155 00:07:56,600 --> 00:07:59,520 Speaker 2: in. Sure. Or Colossus, you'd be like, what is Hilter 156 00:07:59,640 --> 00:08:03,880 Speaker 2: saying here to Goebbels? And the Colossus would say, Hilter 157 00:08:04,160 --> 00:08:08,320 Speaker 2: is saying that he's a big fan of Goebbels' work, 158 00:08:08,640 --> 00:08:11,080 Speaker 2: but he's suspicious that the rest of the world doesn't 159 00:08:11,160 --> 00:08:14,080 Speaker 2: like either of them. And that was it. After that, 160 00:08:14,200 --> 00:08:16,200 Speaker 2: you'd be like, hey, what was the last answer? And 161 00:08:16,200 --> 00:08:17,400 Speaker 2: they'd be like, what's an answer? 162 00:08:18,000 --> 00:08:20,360 Speaker 1: Yeah, you gotta just tell me what's going on with 163 00:08:20,400 --> 00:08:21,520 Speaker 1: this Hilter business.
164 00:08:21,680 --> 00:08:25,480 Speaker 2: You don't remember, from our Mysteries of the Art World 165 00:08:26,320 --> 00:08:29,560 Speaker 2: HowStuffWorks article? Oh, the title of 166 00:08:29,600 --> 00:08:32,200 Speaker 2: that section was did Hilter do these paintings? 167 00:08:32,280 --> 00:08:35,360 Speaker 1: Oh my god, that's a deep cut. I did not remember that. 168 00:08:35,720 --> 00:08:38,840 Speaker 2: Yes, and I think it still says that on that 169 00:08:39,240 --> 00:08:42,319 Speaker 2: oh babilistical Yeah, yeah, I can only hope it's got 170 00:08:42,360 --> 00:08:44,880 Speaker 2: to be Hilter forever. All right. 171 00:08:44,920 --> 00:08:47,719 Speaker 1: So we go into mainframes at this point, and this 172 00:08:47,840 --> 00:08:52,600 Speaker 1: is like the nineteen fifties, basically when companies could actually 173 00:08:52,600 --> 00:08:55,679 Speaker 1: have their own computer. It wasn't just the military. These 174 00:08:55,679 --> 00:08:58,640 Speaker 1: were the old punch card computers, and they were called mainframes. 175 00:08:58,920 --> 00:09:02,000 Speaker 1: It wasn't made up for this. Mainframes were originally described, 176 00:09:02,840 --> 00:09:05,880 Speaker 1: were describing, like, what you would house telecommunication equipment and 177 00:09:05,920 --> 00:09:09,240 Speaker 1: maybe some other sciencey stuff in, but it was referencing literally 178 00:09:09,240 --> 00:09:12,320 Speaker 1: the cabinets that held this technology. And it became known 179 00:09:12,360 --> 00:09:15,120 Speaker 1: as just, you know, it kind of took over when 180 00:09:15,120 --> 00:09:18,199 Speaker 1: the computing world started using it as computer only.
181 00:09:18,720 --> 00:09:21,839 Speaker 2: Yeah, but again, this is like, you're a company and 182 00:09:21,920 --> 00:09:24,679 Speaker 2: this is where you store and process all of your data, 183 00:09:24,720 --> 00:09:27,200 Speaker 2: and it's in this one room, but it's not going 184 00:09:27,240 --> 00:09:29,720 Speaker 2: anywhere else. It's not for anybody else, and you have 185 00:09:29,800 --> 00:09:33,400 Speaker 2: to physically be in the room to get your answer, 186 00:09:33,640 --> 00:09:37,360 Speaker 2: process whatever data you're looking for. When the PC came along, 187 00:09:37,600 --> 00:09:40,800 Speaker 2: and then the Macintosh came along, they took 188 00:09:40,880 --> 00:09:44,000 Speaker 2: that thing and just made it very small so you 189 00:09:44,000 --> 00:09:46,280 Speaker 2: could put it on all of your employees' desks. And 190 00:09:46,320 --> 00:09:48,839 Speaker 2: now they had, like I was saying before, their own 191 00:09:48,880 --> 00:09:52,400 Speaker 2: little data center right there. So if you said, like, hey, 192 00:09:52,559 --> 00:09:56,120 Speaker 2: what's the, I need to know the Q four reports, 193 00:09:56,160 --> 00:09:59,040 Speaker 2: they'd say, go to Debbie's desk. Debbie's the one 194 00:09:59,080 --> 00:10:01,040 Speaker 2: who's got that on her computer. And you would go 195 00:10:01,040 --> 00:10:03,920 Speaker 2: over there and say, Debbie, what's the Q four report? 196 00:10:04,160 --> 00:10:06,760 Speaker 2: And Debbie would give it to you. Right, there was 197 00:10:06,800 --> 00:10:10,680 Speaker 2: no connectivity, but you could still like do a lot 198 00:10:10,800 --> 00:10:14,040 Speaker 2: more stuff than you could when you had a mainframe. Yeah, 199 00:10:14,120 --> 00:10:17,320 Speaker 2: for sure, which makes mainframes feel like really outdated, but 200 00:10:17,360 --> 00:10:20,040 Speaker 2: it turns out they're like totally still in use today.
201 00:10:20,320 --> 00:10:22,599 Speaker 1: Oh yeah, absolutely. I do want to jump back in 202 00:10:22,640 --> 00:10:24,640 Speaker 1: time a little bit, because I did, I promised talk 203 00:10:24,720 --> 00:10:29,480 Speaker 1: of lending the military, oh, basically your equipment, and that's 204 00:10:29,520 --> 00:10:31,280 Speaker 1: what happened. In nineteen fifty one, there was a tea 205 00:10:31,480 --> 00:10:33,960 Speaker 1: shop chain in the UK, I don't know if it's 206 00:10:34,000 --> 00:10:37,880 Speaker 1: still around, Lyons, and they were the very first 207 00:10:37,880 --> 00:10:39,760 Speaker 1: company in the world that used a mainframe. It was 208 00:10:39,760 --> 00:10:42,560 Speaker 1: called the LEO, the L-E-O, and it was, you know, 209 00:10:42,760 --> 00:10:45,120 Speaker 1: like what you would think. They handled, like, payroll and 210 00:10:45,240 --> 00:10:49,800 Speaker 1: stock management and stuff like that, but there wasn't a 211 00:10:49,800 --> 00:10:52,360 Speaker 1: lot for it to do at a tea shop chain 212 00:10:52,840 --> 00:10:56,160 Speaker 1: except for those couple of things. And so they calculated 213 00:10:56,400 --> 00:10:59,000 Speaker 1: missile trajectories, like you were talking about, for the Ministry 214 00:10:59,040 --> 00:10:59,840 Speaker 1: of Defense. 215 00:11:00,600 --> 00:11:05,480 Speaker 2: And that actually kind of helped establish, like, a, I 216 00:11:05,480 --> 00:11:09,720 Speaker 2: guess, a pay schedule, how people charged for data centers 217 00:11:09,720 --> 00:11:12,840 Speaker 2: to come. Yeah, it was, they, like, you would charge 218 00:11:12,840 --> 00:11:15,200 Speaker 2: them for the time that they used it, or you 219 00:11:15,240 --> 00:11:17,840 Speaker 2: could lease it for a month. And that really started 220 00:11:17,840 --> 00:11:19,960 Speaker 2: to come around when IBM got in the game.
They 221 00:11:20,040 --> 00:11:23,720 Speaker 2: became like the mainframe leader in the fifties, the 222 00:11:23,760 --> 00:11:25,960 Speaker 2: early fifties. I think they had a unit that you 223 00:11:25,960 --> 00:11:29,440 Speaker 2: could lease for sixteen thousand dollars per month. That's in 224 00:11:29,520 --> 00:11:32,880 Speaker 2: nineteen fifty two money. And then as the things, as 225 00:11:32,920 --> 00:11:37,200 Speaker 2: like the processors, got better and smaller and faster, that 226 00:11:37,320 --> 00:11:41,199 Speaker 2: price came down dramatically. And then finally in the sixties 227 00:11:41,600 --> 00:11:46,560 Speaker 2: they released the IBM System three sixty, which not 228 00:11:46,640 --> 00:11:51,240 Speaker 2: only got Apollo eleven to the moon and back, it 229 00:11:51,320 --> 00:11:54,559 Speaker 2: appears in an episode of Mad Men apparently. 230 00:11:55,160 --> 00:11:58,080 Speaker 1: Oh really? Yeah, you would have, though, right? 231 00:11:58,400 --> 00:12:00,760 Speaker 2: No, I never did. I just saw reference to it 232 00:12:00,800 --> 00:12:04,960 Speaker 2: on the internet and you knew it was the show. Yeah, 233 00:12:05,000 --> 00:12:07,000 Speaker 2: but they, like, you should look up pictures of it. 234 00:12:07,000 --> 00:12:10,280 Speaker 2: It's like those giant burnt orange cabinets with the reel 235 00:12:10,320 --> 00:12:13,079 Speaker 2: magnetic tape. It's just beautiful. They're cool to look at. 236 00:12:13,160 --> 00:12:15,959 Speaker 1: Yeah, yeah, I remember.
We've referenced the movie War Games 237 00:12:15,960 --> 00:12:17,839 Speaker 1: from our childhood in the eighties a lot, and 238 00:12:18,000 --> 00:12:21,520 Speaker 1: the WOPR from War Games, that was, you know, 239 00:12:21,600 --> 00:12:25,000 Speaker 1: at that age, to see the WOPR in action and 240 00:12:25,040 --> 00:12:30,440 Speaker 1: to see Matthew, almost said Matthew Modine, Matthew Broderick hanging 241 00:12:30,520 --> 00:12:34,480 Speaker 1: up his handheld telephone receiver onto a modem to talk 242 00:12:34,520 --> 00:12:35,400 Speaker 1: to the school computer. 243 00:12:35,520 --> 00:12:38,720 Speaker 2: It was mind blowing. Yeah, that phone in the 244 00:12:38,720 --> 00:12:41,360 Speaker 2: modem made just a really big impression on me. 245 00:12:41,679 --> 00:12:42,840 Speaker 1: Yeah, and a cool sound. 246 00:12:44,280 --> 00:12:45,320 Speaker 2: Peepoop boop. 247 00:12:46,400 --> 00:12:47,360 Speaker 1: Should we take a break? 248 00:12:48,120 --> 00:12:50,560 Speaker 2: Wait, let me talk about mainframes today though, because I 249 00:12:50,679 --> 00:12:52,440 Speaker 2: just want to give a little, I don't know if 250 00:12:52,480 --> 00:12:55,440 Speaker 2: a shout out's the right term, but they are still 251 00:12:55,480 --> 00:12:59,760 Speaker 2: around because they're so reliable, because they're so secure. You 252 00:12:59,800 --> 00:13:02,520 Speaker 2: can make it so that there's information on those things 253 00:13:02,559 --> 00:13:04,920 Speaker 2: that you again have to be physically present in the 254 00:13:05,000 --> 00:13:07,839 Speaker 2: room to access. Sure, you can put all sorts of 255 00:13:07,880 --> 00:13:11,200 Speaker 2: different layers of security.
So if you're like Visa, or 256 00:13:11,240 --> 00:13:16,280 Speaker 2: you're a healthcare company, or you're the Census Bureau, you're 257 00:13:16,280 --> 00:13:19,599 Speaker 2: probably still using a mainframe, because you're protecting information 258 00:13:20,280 --> 00:13:22,640 Speaker 2: as tightly as you can. But those things are also 259 00:13:22,920 --> 00:13:27,480 Speaker 2: super fast and can hold huge volumes of computation at once. 260 00:13:27,800 --> 00:13:30,920 Speaker 1: Yeah, or a non Golden Globe nominated podcast. 261 00:13:31,640 --> 00:13:34,320 Speaker 2: Yeah, we've got our own mainframe. Yeah, we've got 262 00:13:34,320 --> 00:13:36,960 Speaker 2: our IBM three sixty. That's right. 263 00:13:37,840 --> 00:13:39,400 Speaker 1: What year was that from again? The three sixty? 264 00:13:39,280 --> 00:13:40,560 Speaker 2: Sixty four. 265 00:13:40,880 --> 00:13:43,679 Speaker 1: Yeah, yeah, that's the one. I was just making sure we 266 00:13:43,720 --> 00:13:45,679 Speaker 1: didn't have the sixty five, because that, no, no, no, 267 00:13:45,720 --> 00:13:48,520 Speaker 1: the sixty five's buggy. Yeah, yeah, all right, we can take 268 00:13:48,520 --> 00:13:51,280 Speaker 1: that break now, and we're gonna jump out of the 269 00:13:51,280 --> 00:13:53,520 Speaker 1: Wayback Machine and venture into the modern world. 270 00:13:53,440 --> 00:13:54,000 Speaker 2: Right after this. 271 00:14:24,440 --> 00:14:26,960 Speaker 1: All right, so we are out of the Wayback Machine. 272 00:14:27,200 --> 00:14:30,280 Speaker 1: We're making that, uh, we're combining all those old tea 273 00:14:30,320 --> 00:14:34,680 Speaker 1: bags and making some still somewhat weaker tea. Yeah, it's 274 00:14:34,680 --> 00:14:37,160 Speaker 1: not too bad. It's a combination of Earl Grey and 275 00:14:37,480 --> 00:14:40,840 Speaker 1: chamomile, all kinds of fun stuff, but not too bad.
276 00:14:41,560 --> 00:14:43,240 Speaker 1: And now we're going to talk a little bit about 277 00:14:43,280 --> 00:14:45,600 Speaker 1: when things started to ramp up, because it kind of 278 00:14:45,640 --> 00:14:49,560 Speaker 1: happened in fits and starts. And one of the biggest, uh, 279 00:14:49,680 --> 00:14:51,840 Speaker 1: I guess, would it be a fit or a start, was 280 00:14:51,880 --> 00:14:54,920 Speaker 1: the Internet. Because once the Internet came along, every business 281 00:14:54,960 --> 00:14:57,520 Speaker 1: in the world started using it, and so all of 282 00:14:57,520 --> 00:15:01,240 Speaker 1: a sudden you had to have a lot more data 283 00:15:01,400 --> 00:15:05,080 Speaker 1: storage and bigger data centers and bigger server rooms in 284 00:15:05,120 --> 00:15:08,080 Speaker 1: your companies, which was, you know, a pretty good thing 285 00:15:08,120 --> 00:15:10,520 Speaker 1: at the time. After the dot com bust, there were 286 00:15:10,720 --> 00:15:13,640 Speaker 1: a lot of casualties of that growth, but then things 287 00:15:13,720 --> 00:15:15,560 Speaker 1: kind of, you know, the ship kind of righted itself. 288 00:15:15,920 --> 00:15:19,200 Speaker 2: Yeah, and because it was like accessible to basically every 289 00:15:19,400 --> 00:15:22,960 Speaker 2: business now, like you didn't have to buy a mainframe, 290 00:15:23,080 --> 00:15:26,960 Speaker 2: you could lease space on someone else's mainframe, like 291 00:15:26,960 --> 00:15:29,280 Speaker 2: you're the Ministry of Defense or something, all of a sudden. 292 00:15:29,560 --> 00:15:32,720 Speaker 2: So that led to this huge proliferation that gave the 293 00:15:34,080 --> 00:15:38,480 Speaker 2: foundation for web commerce. E commerce, that's what they used 294 00:15:38,520 --> 00:15:40,920 Speaker 2: to call it. That's an old timey term now, but 295 00:15:41,120 --> 00:15:44,320 Speaker 2: it created the ability for e commerce to start and flourish.
296 00:15:44,360 --> 00:15:48,920 Speaker 2: So this data centers scaling up to meet the needs 297 00:15:48,960 --> 00:15:51,640 Speaker 2: of the Internet, and then to kind of give people 298 00:15:51,680 --> 00:15:53,800 Speaker 2: all sorts of new space and room to come up 299 00:15:53,840 --> 00:15:57,200 Speaker 2: with new stuff. That's where the digital economy came from, 300 00:15:57,280 --> 00:15:57,760 Speaker 2: right there. 301 00:15:58,120 --> 00:15:59,960 Speaker 1: Yeah, all of a sudden, you could hop on Webvan 302 00:16:01,000 --> 00:16:02,960 Speaker 1: and order a sack of groceries. 303 00:16:04,200 --> 00:16:08,120 Speaker 2: I have a friend who was all in about that. 304 00:16:08,760 --> 00:16:11,840 Speaker 1: Webvan, yeah, yeah, I think I had a friend 305 00:16:11,840 --> 00:16:13,840 Speaker 1: who was pretty heavily invested. 306 00:16:15,200 --> 00:16:18,160 Speaker 2: But clearly it was ahead of its time. I mean, sure, 307 00:16:18,840 --> 00:16:22,520 Speaker 2: let's see, it's a Postmates and. 308 00:16:22,440 --> 00:16:24,160 Speaker 1: Well, there's a lot of, several of them now that 309 00:16:24,320 --> 00:16:25,040 Speaker 1: have succeeded. 310 00:16:25,200 --> 00:16:26,760 Speaker 2: Well, name them, we'll get ten bucks each. 311 00:16:27,000 --> 00:16:30,600 Speaker 1: Well, well, you just got ten for Postmates, and 312 00:16:30,880 --> 00:16:31,840 Speaker 1: you got to split that with me. 313 00:16:31,880 --> 00:16:33,280 Speaker 2: You know, I will, all right. 314 00:16:34,440 --> 00:16:37,600 Speaker 1: Cloud computing was the next big jump. When cloud computing 315 00:16:37,640 --> 00:16:40,520 Speaker 1: came around the early two thousands, or do they call 316 00:16:40,560 --> 00:16:44,920 Speaker 1: that the early oughts? I do, okay, I thought I'd 317 00:16:44,920 --> 00:16:47,080 Speaker 1: heard that come from your mouth.
But that is when 318 00:16:47,400 --> 00:16:50,280 Speaker 1: you know, that was the real game changer, because things 319 00:16:50,320 --> 00:16:53,600 Speaker 1: were still, I mean, when cloud computing came along, people 320 00:16:53,640 --> 00:16:56,800 Speaker 1: thought of it, if they didn't look too hard into it, 321 00:16:56,800 --> 00:16:58,680 Speaker 1: they thought it was just, you know, floating up in 322 00:16:58,680 --> 00:17:01,880 Speaker 1: the ether somewhere. Yeah, it's, it's still being stored on stuff, 323 00:17:01,880 --> 00:17:05,280 Speaker 1: it's just not being stored locally. So all of a sudden, 324 00:17:05,640 --> 00:17:08,920 Speaker 1: things were just going somewhere else for someone else to 325 00:17:09,000 --> 00:17:12,359 Speaker 1: worry about all that storage. And more importantly, they could, 326 00:17:12,400 --> 00:17:15,000 Speaker 1: they could link everything together and store a lot of 327 00:17:15,000 --> 00:17:16,399 Speaker 1: stuff from a bunch of different people. 328 00:17:17,000 --> 00:17:21,240 Speaker 2: Right. So now you have data centers not just available 329 00:17:21,280 --> 00:17:24,400 Speaker 2: to somebody like a huge bank or something like that, 330 00:17:24,680 --> 00:17:26,920 Speaker 2: or the government, and a tea shop, then a bank, yeah, 331 00:17:27,000 --> 00:17:31,280 Speaker 2: or a tea shop, and then to e commerce businesses. 332 00:17:31,440 --> 00:17:34,879 Speaker 2: Now it's available to you and me. So it's really 333 00:17:34,920 --> 00:17:38,040 Speaker 2: hard to remember back, because the world has changed so much. 334 00:17:38,080 --> 00:17:41,359 Speaker 2: But Chuck, like two thousand and eight, two thousand and nine, 335 00:17:42,119 --> 00:17:46,040 Speaker 2: they were giving us like VPN, little things that like 336 00:17:46,400 --> 00:17:48,400 Speaker 2: you could go home and work, and like, you would, 337 00:17:48,520 --> 00:17:51,280 Speaker 2: it would never work.
I'd never understood how to make 338 00:17:51,320 --> 00:17:53,880 Speaker 2: it work. Yeah, but that was like the very beginning 339 00:17:54,160 --> 00:17:55,960 Speaker 2: of how you could take your work home with you 340 00:17:56,119 --> 00:17:59,679 Speaker 2: and work from home and do things remotely like we 341 00:17:59,720 --> 00:18:02,720 Speaker 2: can now. Like it's nothing. But this led to the 342 00:18:02,800 --> 00:18:06,840 Speaker 2: rise of businesses like Dropbox. Right. So Dropbox goes to 343 00:18:06,960 --> 00:18:09,880 Speaker 2: Amazon Web Services and says, hey, we want to buy 344 00:18:09,880 --> 00:18:12,520 Speaker 2: a bunch of your cloud, right, which means that they're 345 00:18:12,560 --> 00:18:15,080 Speaker 2: going to use a bunch of like different servers and 346 00:18:15,119 --> 00:18:17,760 Speaker 2: different data centers all over the place. And then Dropbox 347 00:18:17,760 --> 00:18:20,360 Speaker 2: turns around to you and says, hey, if you give 348 00:18:20,400 --> 00:18:23,880 Speaker 2: me nineteen ninety five a month, you can have one 349 00:18:24,000 --> 00:18:27,800 Speaker 2: terabyte of data. Right. You can consume one terabyte of data, right, 350 00:18:28,119 --> 00:18:31,760 Speaker 2: and then hopefully you don't use all of that, so 351 00:18:31,880 --> 00:18:35,000 Speaker 2: they don't have to pay Amazon Web Services for stuff 352 00:18:35,040 --> 00:18:37,840 Speaker 2: they didn't use. But you're paying that nineteen ninety five 353 00:18:37,880 --> 00:18:40,800 Speaker 2: a month whether you use that whole terabyte or not. 354 00:18:41,080 --> 00:18:44,440 Speaker 2: It's a pretty smart business model. Would not exist at 355 00:18:44,480 --> 00:18:47,120 Speaker 2: all if the cloud didn't exist.
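[The flat-fee-on-top-of-metered-cloud model described here can be sketched with a few lines of arithmetic. The $19.95-for-one-terabyte retail plan is the episode's own example; the wholesale per-gigabyte rate below is a made-up illustrative number, not real AWS or Dropbox pricing:]

```python
# Sketch of the model: the reseller collects a flat monthly fee, but pays its
# cloud provider only for storage actually used. Margin depends on usage.
MONTHLY_FEE = 19.95        # what the subscriber pays the reseller, per the episode
PLAN_CAP_GB = 1000         # the 1 terabyte plan, roughly 1000 GB
WHOLESALE_PER_GB = 0.015   # assumed cost the reseller pays its cloud provider

def monthly_margin(gb_actually_used: float) -> float:
    """Reseller's margin on one subscriber: flat fee in, metered cost out."""
    gb_actually_used = min(gb_actually_used, PLAN_CAP_GB)  # usage is capped by the plan
    return MONTHLY_FEE - gb_actually_used * WHOLESALE_PER_GB

print(round(monthly_margin(100), 2))   # light user: 18.45, mostly margin
print(round(monthly_margin(1000), 2))  # maxed-out user: 4.95
```

[The point the hosts make falls out of the numbers: subscribers who "hopefully don't use all of that" subsidize the ones who do.]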
356 00:18:47,480 --> 00:18:49,680 Speaker 1: Yeah, it's funny, as you were talking about two thousand 357 00:18:49,680 --> 00:18:51,399 Speaker 1: and eight and how quaint that is now, that's 358 00:18:51,440 --> 00:18:53,800 Speaker 1: the year we started the show. I know, I know, 359 00:18:53,920 --> 00:18:56,240 Speaker 1: it's crazy to think about. It really 360 00:18:56,040 --> 00:18:59,280 Speaker 2: is. But imagine like working from home at that time, 361 00:18:59,359 --> 00:19:01,680 Speaker 2: it just didn't, you couldn't, you know. 362 00:19:02,040 --> 00:19:04,280 Speaker 1: It was kind of great, you went home and you homed. 363 00:19:04,600 --> 00:19:07,359 Speaker 2: That's exactly right. That was a big difference, I remember. 364 00:19:08,040 --> 00:19:11,600 Speaker 1: But these data centers have now come together in such 365 00:19:11,600 --> 00:19:13,679 Speaker 1: a big way now that the largest ones are 366 00:19:13,680 --> 00:19:18,600 Speaker 1: called hyperscale and they host more than five thousand servers, 367 00:19:18,640 --> 00:19:23,040 Speaker 1: like servers, not individual people's data. It's incredible 368 00:19:23,080 --> 00:19:25,480 Speaker 1: how big they've gotten. And we'll 369 00:19:25,480 --> 00:19:27,520 Speaker 1: go over some of the kind of square footage and 370 00:19:27,520 --> 00:19:30,320 Speaker 1: then later talk about the elephant in the room, which 371 00:19:30,320 --> 00:19:34,040 Speaker 1: is energy and water usage. But Google's first data center 372 00:19:34,080 --> 00:19:36,560 Speaker 1: was built in two thousand and six, just two 373 00:19:36,680 --> 00:19:38,520 Speaker 1: years before Stuff You Should Know launched. And this is 374 00:19:38,560 --> 00:19:42,680 Speaker 1: in Oregon, and they are still expanding that thing beyond 375 00:19:43,160 --> 00:19:48,199 Speaker 1: one point three million square feet.
Meanwhile, in China they're like, 376 00:19:48,840 --> 00:19:53,720 Speaker 1: hold my tea, I guess, because China Telecom has a 377 00:19:53,840 --> 00:19:58,080 Speaker 1: ten point seven million square foot data center in Inner Mongolia. 378 00:19:58,280 --> 00:19:59,880 Speaker 2: It's two hundred and fifty acres. 379 00:20:01,600 --> 00:20:06,359 Speaker 1: That is a warehouse full of whirring servers heating up 380 00:20:06,359 --> 00:20:07,000 Speaker 1: and being cooled. 381 00:20:07,520 --> 00:20:10,160 Speaker 2: Yeah, which is a big problem for any data center, 382 00:20:10,200 --> 00:20:14,719 Speaker 2: it turns out. But this whole expansion thing jump-started 383 00:20:14,760 --> 00:20:18,320 Speaker 2: in twenty seventeen thanks to cloud computing, because again, cloud 384 00:20:18,359 --> 00:20:20,840 Speaker 2: computing just means all your stuff isn't on one server 385 00:20:20,920 --> 00:20:23,000 Speaker 2: and one data center, it's broken up into pieces and 386 00:20:23,040 --> 00:20:25,840 Speaker 2: spread all over the place. That's the cloud. That's basically it. 387 00:20:26,440 --> 00:20:30,240 Speaker 2: Even though it's way more advanced and intricate than that, 388 00:20:30,240 --> 00:20:32,760 Speaker 2: that's like all you really need to know for the 389 00:20:32,800 --> 00:20:36,800 Speaker 2: purposes of this episode, right. It led to a huge jump, 390 00:20:36,800 --> 00:20:39,520 Speaker 2: a huge need for data centers, and it also expanded 391 00:20:39,560 --> 00:20:43,360 Speaker 2: all the stuff we can do now. And COVID actually 392 00:20:43,359 --> 00:20:47,840 Speaker 2: gave it another bump, made building a data center a very 393 00:20:48,400 --> 00:20:51,159 Speaker 2: economically attractive thing to do if you had the money.
394 00:20:51,400 --> 00:20:56,159 Speaker 2: Because remote working finally, finally established itself, as like, no, 395 00:20:56,400 --> 00:21:00,560 Speaker 2: we're doing this, stop calling us back to the office, yeah, 396 00:21:00,760 --> 00:21:04,320 Speaker 2: which is what they're doing now. I know, I hope 397 00:21:04,359 --> 00:21:07,520 Speaker 2: it doesn't work, because I remember when all of that 398 00:21:07,560 --> 00:21:10,399 Speaker 2: started and everybody was so nervous, like management was all 399 00:21:10,440 --> 00:21:12,439 Speaker 2: so nervous that people were just going to totally like 400 00:21:12,520 --> 00:21:15,840 Speaker 2: mess around and everything. It just didn't happen. I 401 00:21:15,880 --> 00:21:19,439 Speaker 2: don't know anybody who's even gotten a talking-to, 402 00:21:19,760 --> 00:21:23,280 Speaker 2: let alone been fired, for just messing around at home. 403 00:21:23,480 --> 00:21:25,359 Speaker 2: As a matter of fact, like you were saying, it 404 00:21:25,480 --> 00:21:26,640 Speaker 2: just makes you work more. 405 00:21:27,680 --> 00:21:29,520 Speaker 1: Yeah, I mean, do you know how many times in 406 00:21:29,560 --> 00:21:32,320 Speaker 1: our old offices I would see Jonathan Strickland just wandering 407 00:21:32,320 --> 00:21:34,200 Speaker 1: aimlessly through the office chatting with people? 408 00:21:34,520 --> 00:21:37,359 Speaker 2: Yes, I do, because he would chat with me a lot. 409 00:21:37,800 --> 00:21:38,120 Speaker 2: He did. 410 00:21:38,160 --> 00:21:41,000 Speaker 1: He did that in front of God and everybody, as 411 00:21:41,040 --> 00:21:43,080 Speaker 1: they say, right there in the office. So I can't 412 00:21:43,080 --> 00:21:44,440 Speaker 1: imagine what happened with him at home. 413 00:21:44,600 --> 00:21:47,520 Speaker 2: Yeah, like it was a sorority mixer or something. 414 00:21:49,840 --> 00:21:51,800 Speaker 1: We love Strickland. He's still around, everyone.
By the way, 415 00:21:51,840 --> 00:21:54,960 Speaker 1: he retired from Tech Stuff, but he's still with the company. 416 00:21:54,680 --> 00:21:57,440 Speaker 2: Which is great. We love Strickland. All right. 417 00:21:57,480 --> 00:21:59,760 Speaker 1: So now we're on to AI data centers, and that was 418 00:21:59,800 --> 00:22:04,119 Speaker 1: the, I mean, to call it a game changer seems 419 00:22:04,200 --> 00:22:07,680 Speaker 1: quaint compared to the rise of cloud computing and everything, 420 00:22:07,760 --> 00:22:10,600 Speaker 1: because it is off to the races in a way 421 00:22:10,640 --> 00:22:14,560 Speaker 1: that seemingly cannot be stopped. The genie has left the bottle, 422 00:22:14,560 --> 00:22:17,399 Speaker 1: as they say. Starting in twenty twenty two, when Chat 423 00:22:17,480 --> 00:22:21,159 Speaker 1: GPT was released by OpenAI, all of a sudden, 424 00:22:22,320 --> 00:22:26,439 Speaker 1: the need for these data centers became exponentially greater, in 425 00:22:26,560 --> 00:22:29,359 Speaker 1: size and in the speed at which they need these things 426 00:22:29,359 --> 00:22:36,040 Speaker 1: built, because AI requires a ton of computing power to operate. 427 00:22:36,760 --> 00:22:39,280 Speaker 2: So much so that they don't even use the standard, 428 00:22:40,600 --> 00:22:44,040 Speaker 2: what's called the compute machine. So compute is like all 429 00:22:44,080 --> 00:22:47,400 Speaker 2: of the processing power, the networking, all of that stuff, 430 00:22:47,760 --> 00:22:51,040 Speaker 2: and traditionally with a computer that's done on a CPU. Right, 431 00:22:52,080 --> 00:22:54,760 Speaker 2: that's how all of this gets done, right, everything else 432 00:22:54,800 --> 00:22:57,240 Speaker 2: is infrastructure. The CPU is doing all of the work. 433 00:22:58,400 --> 00:23:01,960 Speaker 2: Those still work, like,
most data 434 00:23:01,960 --> 00:23:05,840 Speaker 2: centers are running on CPUs. For AI, they're just not fast enough. 435 00:23:06,800 --> 00:23:11,159 Speaker 2: They use GPUs, graphics processing units, which are associated with 436 00:23:11,720 --> 00:23:14,040 Speaker 2: video games for most people, right, you need a good 437 00:23:14,080 --> 00:23:19,240 Speaker 2: graphics card to run your video game, I guess, but 438 00:23:20,640 --> 00:23:23,600 Speaker 2: the reason that AI data centers 439 00:23:23,680 --> 00:23:26,640 Speaker 2: use GPUs is because they're really good at parallel processing. 440 00:23:26,680 --> 00:23:28,840 Speaker 2: They can run a bunch of different operations at once. 441 00:23:29,320 --> 00:23:31,399 Speaker 2: So you're like, cool, you just throw a GPU in 442 00:23:31,400 --> 00:23:33,720 Speaker 2: a data center and you can run an AI. No, 443 00:23:34,200 --> 00:23:38,080 Speaker 2: you need hundreds of thousands of these things strung together, 444 00:23:38,720 --> 00:23:41,760 Speaker 2: and instead of like a CPU running like a couple 445 00:23:41,800 --> 00:23:45,200 Speaker 2: of servers or something like that at a data center, all 446 00:23:45,240 --> 00:23:48,959 Speaker 2: of them are strung together to form one giant supercomputer 447 00:23:49,600 --> 00:23:51,320 Speaker 2: that the AI operates on. 448 00:23:52,000 --> 00:23:55,600 Speaker 1: Yeah, like ChatGPT itself was trained on twenty thousand 449 00:23:55,720 --> 00:24:00,320 Speaker 1: of these GPUs. A GPU, you know, the sort of 450 00:24:00,320 --> 00:24:02,639 Speaker 1: the biggest name in the game, there's a couple, but 451 00:24:02,680 --> 00:24:06,320 Speaker 1: the biggest one obviously is Nvidia. The Nvidia 452 00:24:06,440 --> 00:24:09,840 Speaker 1: H one hundred, that is the standard right now. If 453 00:24:09,880 --> 00:24:13,000 Speaker 1: you look this thing up, it fits in your hand.
454 00:24:13,480 --> 00:24:16,960 Speaker 1: It's not like some gigantic thing. Uh, twenty thousand of 455 00:24:17,000 --> 00:24:19,960 Speaker 1: them linked together, or one hundred thousand of them linked together. 456 00:24:20,040 --> 00:24:22,760 Speaker 1: Who knows how many, you know, hundreds of thousands are 457 00:24:22,760 --> 00:24:26,640 Speaker 1: eventually going to be linked together to end the world. 458 00:24:26,680 --> 00:24:28,960 Speaker 1: That's where all the power comes from, like you were saying. 459 00:24:29,000 --> 00:24:33,800 Speaker 1: But it's, you know, it's just a little rectangular handheld 460 00:24:33,800 --> 00:24:36,000 Speaker 1: thing that's like, oh, that looks like something that maybe 461 00:24:36,000 --> 00:24:40,639 Speaker 1: came out of a computer. And Nvidia, their 462 00:24:40,640 --> 00:24:43,240 Speaker 1: stock jumped over a couple of years, like nine 463 00:24:43,320 --> 00:24:46,320 Speaker 1: hundred percent over twenty twenty three and twenty twenty four, 464 00:24:46,640 --> 00:24:47,200 Speaker 1: something like that. 465 00:24:47,119 --> 00:24:50,040 Speaker 2: Yeah, nine hundred percent increase. 466 00:24:51,240 --> 00:24:53,720 Speaker 1: Yeah, and we'll talk about why all of this is 467 00:24:54,160 --> 00:24:56,879 Speaker 1: super, like, scary and dangerous, because it really is. 468 00:24:57,119 --> 00:24:59,199 Speaker 2: Well, yeah, if you want a really good explanation of 469 00:24:59,240 --> 00:25:01,879 Speaker 2: this, and, like you said, how many GPUs you 470 00:25:01,920 --> 00:25:05,800 Speaker 2: string together before we end the world: Nate Soares and 471 00:25:06,119 --> 00:25:08,840 Speaker 2: Eliezer Yudkowsky, in that book I keep referencing that I 472 00:25:08,880 --> 00:25:12,159 Speaker 2: think everybody should read, If Anyone Builds It, Everyone Dies, 473 00:25:12,359 --> 00:25:15,879 Speaker 2: about the current state of AI.
They talk about this 474 00:25:16,119 --> 00:25:20,160 Speaker 2: in depth, but in a really understandable way. It's really fascinating. 475 00:25:21,119 --> 00:25:23,399 Speaker 2: But that's essentially one of the things they say, is 476 00:25:23,440 --> 00:25:26,760 Speaker 2: like, we keep stringing together tens and tens of thousands 477 00:25:26,760 --> 00:25:31,600 Speaker 2: more GPUs, that just makes the supercomputer smarter and smarter 478 00:25:31,760 --> 00:25:35,200 Speaker 2: and more capable. And eventually, what's going to happen? We're 479 00:25:35,200 --> 00:25:38,520 Speaker 2: going to reach some point, potentially, where we just put 480 00:25:38,520 --> 00:25:41,040 Speaker 2: that extra last GPU in there, and all of a sudden 481 00:25:41,040 --> 00:25:43,600 Speaker 2: the balance is tipped and the thing becomes superintelligent. 482 00:25:44,119 --> 00:25:46,960 Speaker 1: That's right. Also, time for me, because you're always 483 00:25:47,000 --> 00:25:48,800 Speaker 1: too shy, to plug The End of the World 484 00:25:48,800 --> 00:25:53,040 Speaker 1: with Josh Clark, your fantastic limited series, of which AI 485 00:25:53,359 --> 00:25:55,600 Speaker 1: is one of the central focuses. Or was 486 00:25:55,640 --> 00:25:56,320 Speaker 1: it eight things? 487 00:25:57,560 --> 00:26:00,320 Speaker 2: Ten? Well, there were ten episodes. 488 00:26:00,040 --> 00:26:03,240 Speaker 1: Ten episodes, right, thanks baby. But one of the 489 00:26:03,240 --> 00:26:06,440 Speaker 1: episodes was just, like, you talking about Jimmy Buffett records. 490 00:26:06,119 --> 00:26:06,920 Speaker 2: That's right. 491 00:26:07,960 --> 00:26:11,680 Speaker 1: You had to lighten the mood. Yep. Should we take 492 00:26:11,720 --> 00:26:13,600 Speaker 1: a break, or should we keep going for a minute?
493 00:26:13,640 --> 00:26:15,960 Speaker 1: Let's keep going for a minute, okay, because you talked 494 00:26:16,000 --> 00:26:18,000 Speaker 1: about investment, and, you know, if you have the money 495 00:26:18,000 --> 00:26:20,639 Speaker 1: to open one of these things, and that's what these 496 00:26:20,720 --> 00:26:24,679 Speaker 1: tech companies are doing, to perhaps their great peril, 497 00:26:24,680 --> 00:26:27,800 Speaker 1: at some point we'll see. Microsoft has invested eighty eight 498 00:26:27,800 --> 00:26:31,480 Speaker 1: billion dollars in data centers just in twenty twenty five. 499 00:26:32,160 --> 00:26:34,720 Speaker 1: Amazon has pledged, over the next fifteen years, one hundred 500 00:26:34,760 --> 00:26:39,040 Speaker 1: and fifty billion dollars. And Google and Meta together, about, 501 00:26:39,200 --> 00:26:41,280 Speaker 1: you know, not working together, but they are expected to 502 00:26:41,280 --> 00:26:44,439 Speaker 1: spend about seven hundred and fifty billion dollars just on 503 00:26:44,760 --> 00:26:49,320 Speaker 1: equipment over the next two years. And Stanley Morgan says, 504 00:26:49,920 --> 00:26:53,119 Speaker 1: Morgan Stanley, what did I say? Stanley Morgan? Yeah, I 505 00:26:53,119 --> 00:26:54,200 Speaker 1: think we should leave that in there. 506 00:26:54,320 --> 00:26:54,800 Speaker 2: Okay. 507 00:26:55,440 --> 00:26:58,160 Speaker 1: Stanley says, hey, you know, guys. 508 00:26:58,160 --> 00:26:59,920 Speaker 2: Maybe you might like Stanley Kamma boar. 509 00:27:00,680 --> 00:27:04,560 Speaker 1: Yeah, Stanley Kumo Morgan.
Over five years, between twenty twenty-five 510 00:27:04,640 --> 00:27:09,240 Speaker 1: and twenty thirty, Morgan Stanley says about three trillion dollars 511 00:27:10,040 --> 00:27:12,479 Speaker 1: is going to be spent just on the data centers, 512 00:27:13,080 --> 00:27:15,760 Speaker 1: about, I mean, half of which is the hardware and 513 00:27:15,880 --> 00:27:17,520 Speaker 1: half of which is just building these things. 514 00:27:17,680 --> 00:27:22,720 Speaker 2: Yeah, just in, what, the next four years? Yeah. So 515 00:27:22,880 --> 00:27:24,760 Speaker 2: think about it. If you're Nvidia and you're the 516 00:27:24,840 --> 00:27:27,840 Speaker 2: industry leader for GPUs and everybody's like, we're gonna spend 517 00:27:27,880 --> 00:27:31,920 Speaker 2: one point five trillion dollars on this, on the infrastructure 518 00:27:32,000 --> 00:27:35,320 Speaker 2: and the GPUs, you're looking pretty good down the 519 00:27:35,359 --> 00:27:38,440 Speaker 1: road. Yeah, for sure. And you know they're doing this 520 00:27:38,560 --> 00:27:41,160 Speaker 1: because there's a demand right now, for use at least, 521 00:27:41,280 --> 00:27:45,360 Speaker 1: because things like OpenAI and other AI creators are using 522 00:27:45,400 --> 00:27:48,480 Speaker 1: them like crazy. But these companies are also using them 523 00:27:48,520 --> 00:27:51,120 Speaker 1: for their own AI research, right. 524 00:27:51,359 --> 00:27:56,879 Speaker 2: Yeah. So like xAI has that Colossus machine that you 525 00:27:56,960 --> 00:28:00,639 Speaker 2: were talking about earlier, which is two hundred thousand 526 00:28:00,720 --> 00:28:05,880 Speaker 2: GPUs strung together. I'm not sure if it's fully online yet. In Tennessee, yeah. 527 00:28:06,119 --> 00:28:08,760 Speaker 2: And it's just for that.
They're not doing any, 528 00:28:08,800 --> 00:28:13,040 Speaker 2: they're not calculating missile trajectories for the Ministry of Defense 529 00:28:13,119 --> 00:28:16,000 Speaker 2: or anything like that. Like, it's just for that AI. 530 00:28:16,600 --> 00:28:18,520 Speaker 2: And yeah, I think Meta is doing the same thing. 531 00:28:19,119 --> 00:28:22,000 Speaker 2: OpenAI, I don't think, is building their own, because 532 00:28:22,040 --> 00:28:25,600 Speaker 2: they're so in cahoots with Microsoft. I think they run 533 00:28:25,640 --> 00:28:28,600 Speaker 2: their stuff on Microsoft's data centers. But yeah, yeah, if 534 00:28:28,640 --> 00:28:31,280 Speaker 2: you have an AI essentially right now, which means like 535 00:28:31,400 --> 00:28:35,240 Speaker 2: God and everybody, you probably have your own data center 536 00:28:35,359 --> 00:28:36,119 Speaker 2: dedicated to it. 537 00:28:36,800 --> 00:28:41,120 Speaker 1: Yeah. And this isn't some moral stand I'm 538 00:28:41,160 --> 00:28:43,960 Speaker 1: taking by saying that I have never used AI. And 539 00:28:44,520 --> 00:28:46,960 Speaker 1: trust me, I know that every part of my life 540 00:28:47,040 --> 00:28:49,720 Speaker 1: is now touched by AI, so I am inadvertently using it. 541 00:28:49,560 --> 00:28:51,760 Speaker 2: Touched by an AI, that's right. 542 00:28:53,080 --> 00:28:56,840 Speaker 1: But I've never, I've never used, like, you know, chatbots 543 00:28:56,920 --> 00:29:00,840 Speaker 1: or large language models or anything like that, just 544 00:29:01,320 --> 00:29:04,719 Speaker 1: mainly because I'm just, I'm fine doing things like they 545 00:29:04,760 --> 00:29:07,400 Speaker 1: are for now, and not in a Luddite sort of way. 546 00:29:07,560 --> 00:29:10,440 Speaker 1: I just, everything's going along great for me in my 547 00:29:10,560 --> 00:29:12,080 Speaker 1: job and how I live my life, so I just, 548 00:29:12,160 --> 00:29:13,160 Speaker 1: I don't have a need for it.
549 00:29:13,520 --> 00:29:15,440 Speaker 2: I do the same thing. And I think also both 550 00:29:15,520 --> 00:29:18,240 Speaker 2: of us are like, if somebody else wants to do 551 00:29:18,360 --> 00:29:20,600 Speaker 2: it the other way, that's fine. Like, we're certainly not 552 00:29:20,680 --> 00:29:24,400 Speaker 2: gonna criticize them or be curmudgeonly about it or say, 553 00:29:24,480 --> 00:29:26,640 Speaker 2: you know, that's stupid. 554 00:29:27,440 --> 00:29:31,240 Speaker 1: Right. But as you'll see, you know, and again this 555 00:29:31,400 --> 00:29:35,000 Speaker 1: isn't yucking someone's yum, but everyone should know what they're 556 00:29:35,000 --> 00:29:36,840 Speaker 1: a part of, and that's part of what the episode 557 00:29:36,880 --> 00:29:37,200 Speaker 1: is about. 558 00:29:37,360 --> 00:29:41,200 Speaker 2: That's right. Yeah, no, I totally do. Before we take 559 00:29:41,240 --> 00:29:45,480 Speaker 2: a break, I think it's a small kind of side issue, 560 00:29:45,600 --> 00:29:49,160 Speaker 2: but it's worth pointing out that it sucks that because these 561 00:29:49,280 --> 00:29:53,480 Speaker 2: Nvidia chips are so in demand from these massive companies, 562 00:29:53,800 --> 00:29:57,360 Speaker 2: it has driven the price for just the average Nvidia 563 00:29:57,480 --> 00:30:01,240 Speaker 2: graphics card sky high. So if you're a gamer and 564 00:30:01,360 --> 00:30:04,680 Speaker 2: you're, like, trying to improve your system, you pay 565 00:30:05,360 --> 00:30:08,600 Speaker 2: way more than you used to for the same graphics 566 00:30:08,680 --> 00:30:11,600 Speaker 2: card that you could have bought for like a quarter 567 00:30:11,760 --> 00:30:14,200 Speaker 2: of the price, you know, a couple of years ago.
568 00:30:14,760 --> 00:30:17,280 Speaker 1: Yeah, and I wasn't even looking, like, I didn't even 569 00:30:17,320 --> 00:30:18,800 Speaker 1: know that you could just buy one. This is how little 570 00:30:18,840 --> 00:30:21,000 Speaker 1: I knew about all this before. I was like, could 571 00:30:21,080 --> 00:30:24,760 Speaker 1: you just buy an Nvidia GPU? But I was just 572 00:30:24,840 --> 00:30:26,840 Speaker 1: researching the size, and like, what do these things look like? 573 00:30:27,360 --> 00:30:29,360 Speaker 1: And, you know, one was on eBay for 574 00:30:29,480 --> 00:30:31,600 Speaker 1: twenty thousand dollars, and I was like, oh my god, 575 00:30:31,760 --> 00:30:33,480 Speaker 1: I didn't know that was the deal. 576 00:30:33,680 --> 00:30:34,160 Speaker 2: Is that right? 577 00:30:35,200 --> 00:30:36,920 Speaker 1: Yeah, and I don't know if that's accurate. I don't 578 00:30:37,280 --> 00:30:39,800 Speaker 1: know anything about it, so I could easily be corrected 579 00:30:39,840 --> 00:30:43,040 Speaker 1: on all this, but that's what the Internet told me. 580 00:30:43,240 --> 00:30:45,400 Speaker 2: Okay, well, the Internet never lies. 581 00:30:46,960 --> 00:30:48,600 Speaker 1: Oh, one thing before we break real quick, because we 582 00:30:48,640 --> 00:30:50,760 Speaker 1: did promise a little UK-specific stuff and I didn't 583 00:30:50,760 --> 00:30:55,080 Speaker 1: want to give short shrift to our Brit listeners, or Kyle. The 584 00:30:55,280 --> 00:30:58,200 Speaker 1: UK is right now like the third largest nation for 585 00:30:58,320 --> 00:31:00,880 Speaker 1: data centers.
The US is first, I think Germany is second, 586 00:31:01,680 --> 00:31:04,120 Speaker 1: and they signed what was called the Tech Prosperity Deal 587 00:31:04,200 --> 00:31:07,240 Speaker 1: with the giants, the tech giants of the United States. 588 00:31:07,920 --> 00:31:11,600 Speaker 1: And right now Microsoft has announced a thirty billion dollar 589 00:31:11,680 --> 00:31:15,520 Speaker 1: investment in UK data centers, and I think like one 590 00:31:15,600 --> 00:31:18,400 Speaker 1: hundred new AI data centers are planned in the UK 591 00:31:18,520 --> 00:31:19,600 Speaker 1: at this point moving forward. 592 00:31:19,920 --> 00:31:21,840 Speaker 2: Yeah, and I saw there's at least one in Wales 593 00:31:21,920 --> 00:31:26,720 Speaker 2: that's being smartly done. They took an old radiator factory 594 00:31:26,880 --> 00:31:31,920 Speaker 2: plant campus and they're revitalizing that as an AI data center. 595 00:31:32,040 --> 00:31:35,160 Speaker 2: So, it does sound like, I get why the UK 596 00:31:35,360 --> 00:31:37,360 Speaker 2: is doing it, but there's a lot of people in 597 00:31:37,440 --> 00:31:39,280 Speaker 2: the UK and elsewhere who are like, 598 00:31:39,520 --> 00:31:42,920 Speaker 2: this is not a good investment for local governments or 599 00:31:42,960 --> 00:31:46,240 Speaker 2: even national governments. There's a big problem with all this. 600 00:31:46,480 --> 00:31:50,280 Speaker 2: Like, there is an AI boom going on. Data centers 601 00:31:50,320 --> 00:31:52,800 Speaker 2: are just one part of it. Like, people are throwing 602 00:31:52,920 --> 00:31:57,360 Speaker 2: money at AI like it's nineteen ninety nine, and a 603 00:31:57,400 --> 00:31:59,760 Speaker 2: lot of people are like, there's another, 604 00:32:00,400 --> 00:32:04,360 Speaker 2: it's not a dot-com bubble this time, but it's an AI bubble. Yeah.
One 605 00:32:04,400 --> 00:32:06,840 Speaker 2: of the reasons why some people are pointing to it 606 00:32:06,920 --> 00:32:11,560 Speaker 2: as an AI bubble is that it's just not clear 607 00:32:12,000 --> 00:32:14,400 Speaker 2: how much money is going to be made from AI 608 00:32:14,920 --> 00:32:18,800 Speaker 2: and when that's going to start. Yeah. I think the 609 00:32:18,920 --> 00:32:21,760 Speaker 2: Financial Times called OpenAI a money pit with a 610 00:32:21,840 --> 00:32:25,400 Speaker 2: website on top. Yeah, not great, no, because people are 611 00:32:25,480 --> 00:32:27,760 Speaker 2: just pumping money into this stuff, but they're not 612 00:32:28,440 --> 00:32:32,360 Speaker 2: seeing results from it, not yet. It's not 613 00:32:32,480 --> 00:32:35,400 Speaker 2: necessarily a bad bet that AI is going to completely 614 00:32:35,480 --> 00:32:38,960 Speaker 2: revolutionize the world and, like, revolutionize economies and 615 00:32:39,040 --> 00:32:42,880 Speaker 2: make some people a lot of money, but there's just 616 00:32:43,000 --> 00:32:45,240 Speaker 2: no clear path to it right now, which makes some 617 00:32:45,320 --> 00:32:46,120 Speaker 2: people nervous. 618 00:32:46,800 --> 00:32:50,840 Speaker 1: Yeah, only about five percent, just five percent, of pilot 619 00:32:50,920 --> 00:32:56,480 Speaker 1: AI programs in business right now secure returns on their investments, 620 00:32:56,800 --> 00:32:58,080 Speaker 1: you know, like they make the money. 621 00:32:58,520 --> 00:33:03,360 Speaker 2: But Stanley Morgan is predicting revenues of a trillion dollars 622 00:33:03,400 --> 00:33:05,480 Speaker 2: by twenty twenty eight, that's 623 00:33:05,320 --> 00:33:08,440 Speaker 1: what they're saying. I mean, we'll see. Nvidia,
I mean, 624 00:33:08,560 --> 00:33:11,280 Speaker 1: Kyle's also keen to point out that there's sort of 625 00:33:11,680 --> 00:33:14,400 Speaker 1: a circular economy within all this going on. That's a 626 00:33:14,440 --> 00:33:17,920 Speaker 1: little bit troubling, maybe, because Nvidia is investing in 627 00:33:18,040 --> 00:33:21,520 Speaker 1: OpenAI, but that depends on OpenAI's purchase of those 628 00:33:21,600 --> 00:33:26,160 Speaker 1: Nvidia chips. So, you know, everyone from, you know, just 629 00:33:26,360 --> 00:33:28,600 Speaker 1: people who are smarter than us as far as this 630 00:33:28,640 --> 00:33:31,560 Speaker 1: stuff goes, right down to the IMF, 631 00:33:31,680 --> 00:33:36,160 Speaker 1: the International Monetary Fund, are flashing the warning signs, saying, 632 00:33:36,280 --> 00:33:39,120 Speaker 1: like, this could be, you know, it could make a 633 00:33:39,200 --> 00:33:41,440 Speaker 1: trillion dollars by twenty twenty eight, or it could, like, 634 00:33:41,520 --> 00:33:42,680 Speaker 1: wreck the global economy. 635 00:33:43,040 --> 00:33:49,000 Speaker 2: Yeah, for sure. Yeah, we have no idea, although I 636 00:33:49,080 --> 00:33:52,040 Speaker 2: have seen people argue against it, that say, like, 637 00:33:52,240 --> 00:33:55,120 Speaker 2: yeah, a lot of these AI companies 638 00:33:55,160 --> 00:34:00,560 Speaker 2: are probably overinflated, but it's nothing like it was with, 639 00:34:01,000 --> 00:34:03,560 Speaker 2: like, the two thousand and eight meltdown or the dot 640 00:34:03,640 --> 00:34:07,120 Speaker 2: com bubble. Like, this is, we're a lot, we're a 641 00:34:07,160 --> 00:34:09,560 Speaker 2: lot more seasoned, or investors are a lot more seasoned, 642 00:34:09,600 --> 00:34:12,000 Speaker 2: than they were before.
The problem is, one of the 643 00:34:12,080 --> 00:34:16,600 Speaker 2: problems is, that the financing is expected to come in 644 00:34:16,800 --> 00:34:21,360 Speaker 2: large part from private credit, which is essentially an investment 645 00:34:21,480 --> 00:34:25,400 Speaker 2: vehicle for investors to go lend money to, say, like, 646 00:34:25,600 --> 00:34:28,960 Speaker 2: companies that want to build data centers. Right, and this 647 00:34:29,239 --> 00:34:33,520 Speaker 2: is largely unregulated. It's very shadowy. We don't know 648 00:34:33,600 --> 00:34:37,600 Speaker 2: how much debt exists in the world on private 649 00:34:37,680 --> 00:34:40,320 Speaker 2: credit, because they don't have to report this stuff. And 650 00:34:41,040 --> 00:34:42,759 Speaker 2: you know, as we learned from the two thousand and 651 00:34:42,760 --> 00:34:48,040 Speaker 2: eight meltdown, when there's, like, massive speculation among financiers 652 00:34:48,400 --> 00:34:52,759 Speaker 2: that involves debt, that can go really bad. 653 00:34:53,440 --> 00:34:57,239 Speaker 1: Yeah, for sure. And speaking of going bad, I guess 654 00:34:57,320 --> 00:34:59,560 Speaker 1: we're at the sort of environmental piece of this whole thing. 655 00:35:00,280 --> 00:35:01,719 Speaker 1: And this is what I was talking about when I 656 00:35:01,760 --> 00:35:03,600 Speaker 1: said that, you know, people should just be aware of 657 00:35:03,680 --> 00:35:05,880 Speaker 1: what they're taking part in. And again, this is not 658 00:35:05,960 --> 00:35:09,120 Speaker 1: to shame anybody who uses AI for their job or 659 00:35:09,320 --> 00:35:13,320 Speaker 1: just to make funny fake videos, but, you know, 660 00:35:13,960 --> 00:35:16,320 Speaker 1: everyone is sort of tied together to make this what 661 00:35:16,440 --> 00:35:19,400 Speaker 1: it is.
Who's using that stuff. And I get it if 662 00:35:19,400 --> 00:35:21,000 Speaker 1: someone says, like, hey, if I quit this thing, it's 663 00:35:21,040 --> 00:35:23,920 Speaker 1: not going to make any difference. But that's sort of 664 00:35:23,960 --> 00:35:26,879 Speaker 1: the age-old, like, you know, if I don't 665 00:35:26,920 --> 00:35:30,560 Speaker 1: recycle my tin cans, my 666 00:35:30,640 --> 00:35:33,000 Speaker 1: aluminum cans, then it's not gonna make that big of 667 00:35:33,080 --> 00:35:36,080 Speaker 1: a difference. But the idea of everyone getting together to 668 00:35:36,160 --> 00:35:38,680 Speaker 1: do something for the common good, that's where change happens, 669 00:35:39,840 --> 00:35:44,880 Speaker 1: or where negative change happens. So as far as AI 670 00:35:45,040 --> 00:35:48,759 Speaker 1: data centers go, the main, you know, aside from just, 671 00:35:49,000 --> 00:35:51,080 Speaker 1: you know, the land use and everything else, and the 672 00:35:52,120 --> 00:35:55,640 Speaker 1: hardship on the local economies and towns in certain ways 673 00:35:55,680 --> 00:35:59,120 Speaker 1: that we're going to get to, it's really just a 674 00:35:59,320 --> 00:36:05,480 Speaker 1: succubus of electricity and water usage. Yeah, succubus is not 675 00:36:05,560 --> 00:36:06,320 Speaker 1: the right word. 676 00:36:06,360 --> 00:36:08,560 Speaker 2: No, but it makes sense. It's like a bunker down. 677 00:36:10,160 --> 00:36:12,520 Speaker 1: Yeah. But I say succubus to mean just like a 678 00:36:12,640 --> 00:36:14,839 Speaker 1: bottomless pit. But I know that's not what it means, 679 00:36:14,880 --> 00:36:15,400 Speaker 1: by the way. 680 00:36:15,360 --> 00:36:18,839 Speaker 2: A giant sucking thing, right, right. And it is. It's 681 00:36:18,840 --> 00:36:21,440 Speaker 2: sucking tons of electricity and water up.
Like, some of 682 00:36:21,520 --> 00:36:25,640 Speaker 2: these AI data centers use the same amount of electricity 683 00:36:25,680 --> 00:36:28,719 Speaker 2: as a town of fifty thousand, yeah, and about the 684 00:36:28,800 --> 00:36:31,479 Speaker 2: same amount of water as a town of fifty thousand people. 685 00:36:31,520 --> 00:36:34,440 Speaker 2: This is a data center we're talking about, and it's 686 00:36:34,480 --> 00:36:37,760 Speaker 2: not even necessarily an AI data center. Just any hyper 687 00:36:37,840 --> 00:36:42,879 Speaker 2: scale data center uses a ton of electricity and water. 688 00:36:43,200 --> 00:36:47,200 Speaker 2: The reason it uses water is because all of these processors, 689 00:36:47,320 --> 00:36:50,160 Speaker 2: the CPUs that are doing all this work, and just 690 00:36:50,280 --> 00:36:52,400 Speaker 2: all of the networking that's going on with it, it's 691 00:36:52,520 --> 00:36:56,960 Speaker 2: generating heat, and computing happens faster when it's cooler. So 692 00:36:57,120 --> 00:36:59,919 Speaker 2: to keep the place cool they use evaporative cooling, where 693 00:37:00,120 --> 00:37:06,520 Speaker 2: they funnel waste heat air through wet pads, essentially, like 694 00:37:06,800 --> 00:37:09,680 Speaker 2: they just buy old mattresses and douse them with water, 695 00:37:10,120 --> 00:37:12,359 Speaker 2: and then they run the heat through there, and through 696 00:37:12,400 --> 00:37:16,600 Speaker 2: evaporative cooling, it cools it off. It uses a lot less 697 00:37:17,040 --> 00:37:22,279 Speaker 2: electricity than air cooling, but it uses water, a lot 698 00:37:22,400 --> 00:37:22,840 Speaker 2: of water. 699 00:37:23,600 --> 00:37:26,040 Speaker 1: Yeah, I mean, I assume most people know this.
But 700 00:37:26,280 --> 00:37:28,560 Speaker 1: like your laptop has a tiny fan in it, like 701 00:37:28,960 --> 00:37:31,000 Speaker 1: every computer in the world has a little fan in 702 00:37:31,080 --> 00:37:33,719 Speaker 1: it that cools it down. So when you've got all 703 00:37:33,760 --> 00:37:35,799 Speaker 1: this stuff together, you know it's going to generate tons 704 00:37:35,840 --> 00:37:38,480 Speaker 1: and tons of heat. That was the whirring of the 705 00:37:38,520 --> 00:37:41,839 Speaker 1: server room that you used to sleep in. Those were 706 00:37:41,960 --> 00:37:44,399 Speaker 1: all fans, you know. And you know, there's some other 707 00:37:44,520 --> 00:37:47,399 Speaker 1: sounds coming, but mostly those fans trying to cool everything down. 708 00:37:48,280 --> 00:37:49,919 Speaker 1: We've got a lot of stats here that are pretty 709 00:37:49,920 --> 00:37:53,520 Speaker 1: eye-popping. But there are roughly eleven thousand data 710 00:37:53,520 --> 00:37:56,920 Speaker 1: centers around the world. Most of these are not AI obviously, 711 00:37:57,000 --> 00:37:59,920 Speaker 1: but they're the most, you know, robust sort of 712 00:38:00,040 --> 00:38:03,000 Speaker 1: users of the energy. But between one and 713 00:38:03,160 --> 00:38:05,759 Speaker 1: one point five percent, which doesn't sound like a lot, 714 00:38:06,200 --> 00:38:11,360 Speaker 1: of the entire world's electricity usage on 715 00:38:11,520 --> 00:38:13,560 Speaker 1: planet Earth goes to data centers right now, and in 716 00:38:13,640 --> 00:38:17,840 Speaker 1: certain places like Ireland, data centers use about twenty percent 717 00:38:17,960 --> 00:38:19,120 Speaker 1: of the country's electricity.
718 00:38:19,280 --> 00:38:22,360 Speaker 2: Yeah, and if you dive into different places around like 719 00:38:22,760 --> 00:38:25,920 Speaker 2: the world where data centers are, like, that's collectively, right, 720 00:38:26,280 --> 00:38:27,719 Speaker 2: all of them in Ireland and all of them in 721 00:38:27,800 --> 00:38:30,280 Speaker 2: the world. If you kind of zoom into the towns 722 00:38:30,800 --> 00:38:35,279 Speaker 2: where these things are located, there's, well, there's something called 723 00:38:35,400 --> 00:38:40,560 Speaker 2: Data Center Alley in northern Virginia outside of DC, where 724 00:38:40,760 --> 00:38:45,680 Speaker 2: there's this huge concentration of large data centers, probably the 725 00:38:45,719 --> 00:38:49,319 Speaker 2: biggest concentration in the world. Those data centers use about 726 00:38:49,360 --> 00:38:52,799 Speaker 2: the same amount of electricity as sixty percent of all 727 00:38:52,840 --> 00:38:54,880 Speaker 2: the households in the state of Virginia. 728 00:38:55,280 --> 00:38:58,920 Speaker 1: Yeah, here's another one. By twenty thirty, they're predicting, this 729 00:38:59,040 --> 00:39:03,080 Speaker 1: is Barclays Bank predicting, that data center energy use 730 00:39:03,120 --> 00:39:06,360 Speaker 1: in the United States would make up about thirteen percent 731 00:39:06,480 --> 00:39:10,400 Speaker 1: of the entire electricity demand of the United States. And 732 00:39:11,360 --> 00:39:13,680 Speaker 1: Meta has theirs. They all have silly names, but their 733 00:39:14,000 --> 00:39:17,520 Speaker 1: data center is called Hyperion. They're all, you know, one was, 734 00:39:18,200 --> 00:39:20,560 Speaker 1: where's that list? They're all these kind 735 00:39:20,600 --> 00:39:23,080 Speaker 1: of sci-fi-sounding names. 736 00:39:23,480 --> 00:39:28,560 Speaker 2: Yes, Stargate, Yeah, Jupiter, Prometheus.
God, I'm sure all of 737 00:39:28,560 --> 00:39:30,360 Speaker 2: those nerds are like, what do you mean silly? 738 00:39:31,120 --> 00:39:32,759 Speaker 1: If I opened up a data center, I'd call it 739 00:39:32,840 --> 00:39:33,400 Speaker 1: Old Bessie. 740 00:39:34,480 --> 00:39:38,920 Speaker 2: Bessie! I hope so bad that somebody's listening to this 741 00:39:39,040 --> 00:39:42,319 Speaker 2: and they open a massive hyperscale data center named 742 00:39:42,360 --> 00:39:42,920 Speaker 2: Old Bessie. 743 00:39:43,680 --> 00:39:46,719 Speaker 1: That would be great. But Meta's Hyperion data center will 744 00:39:46,800 --> 00:39:49,680 Speaker 1: consume, by the time it's finished, about five gigawatts. And 745 00:39:49,719 --> 00:39:54,440 Speaker 1: if you're like, what's five gigawatts? That is about half 746 00:39:54,600 --> 00:39:55,560 Speaker 1: of the peak load. 747 00:39:55,440 --> 00:39:57,440 Speaker 2: Of all of New York City, the most that 748 00:39:57,480 --> 00:40:00,240 Speaker 2: can possibly be demanded. 749 00:40:00,120 --> 00:40:02,480 Speaker 1: Right, yeah, the very top load probably, I guess, in New 750 00:40:02,560 --> 00:40:06,400 Speaker 1: York City on the hottest day of the year, with 751 00:40:06,640 --> 00:40:10,120 Speaker 1: all the lights on at night or something. Yeah, and 752 00:40:10,200 --> 00:40:13,560 Speaker 1: that Rockefeller tree, they just, there's a summer version. 753 00:40:13,400 --> 00:40:17,160 Speaker 2: That puts it over the edge. That's right, blackout.
So 754 00:40:17,320 --> 00:40:20,640 Speaker 2: you can imagine that when you're using all this electricity 755 00:40:20,800 --> 00:40:24,080 Speaker 2: and using all this water, if you're starting to build 756 00:40:24,120 --> 00:40:26,920 Speaker 2: these massive data centers, you're looking for places that have 757 00:40:27,120 --> 00:40:32,080 Speaker 2: like cheap land, cheap electricity, and because electricity is often 758 00:40:32,320 --> 00:40:36,200 Speaker 2: more expensive than water, they'll go to places, they'll build 759 00:40:36,239 --> 00:40:39,239 Speaker 2: them in places that are, like, water-scarce, that have 760 00:40:39,560 --> 00:40:43,720 Speaker 2: cheap electricity. On the premise that, like, we're a massive 761 00:40:43,960 --> 00:40:48,960 Speaker 2: multinational corporation, we can push around this little county and 762 00:40:49,320 --> 00:40:51,000 Speaker 2: use up all of their water, and what are they 763 00:40:51,040 --> 00:40:51,920 Speaker 2: gonna do? Nothing? 764 00:40:52,840 --> 00:40:55,560 Speaker 1: Yeah, and I mean that's literally happening. There's one right 765 00:40:55,600 --> 00:40:58,520 Speaker 1: here in Georgia, in Newton County. It's a Meta data center 766 00:40:58,920 --> 00:41:03,480 Speaker 1: that's using a tenth of the local water use. And like 767 00:41:03,600 --> 00:41:06,200 Speaker 1: you said, water is a resource that isn't infinite. 768 00:41:06,239 --> 00:41:08,360 Speaker 1: We've talked about the dangers in the future of, like, 769 00:41:08,760 --> 00:41:10,400 Speaker 1: you know, perhaps the wars of the future will be 770 00:41:10,480 --> 00:41:14,080 Speaker 1: fought over water, and this could get us there. I 771 00:41:14,160 --> 00:41:18,200 Speaker 1: think in Phoenix, Arizona, you know, known for their abundant water.
772 00:41:19,239 --> 00:41:23,280 Speaker 1: Meta and Microsoft use seven million gallons of water every 773 00:41:23,480 --> 00:41:25,160 Speaker 1: single day for their data centers. 774 00:41:25,239 --> 00:41:27,640 Speaker 2: Yeah, every day, you said. 775 00:41:28,120 --> 00:41:30,160 Speaker 1: Every day, seven million gallons of water. 776 00:41:30,320 --> 00:41:33,839 Speaker 2: That's insane. Yeah. And when I saw this, I was like, oh, 777 00:41:33,960 --> 00:41:37,600 Speaker 2: here we go. In the UK, data centers used ten 778 00:41:37,880 --> 00:41:45,040 Speaker 2: billion litres of drinking water every year. L-I-T-R-E-S. Yeah, 779 00:41:45,080 --> 00:41:46,680 Speaker 2: that's right. Uh. 780 00:41:46,960 --> 00:41:50,120 Speaker 1: But you know you mentioned some of these towns. Not 781 00:41:50,239 --> 00:41:52,919 Speaker 1: only are some of them, like, using, let's say, ten percent 782 00:41:52,960 --> 00:41:56,320 Speaker 1: of the local water, here in Newton County and in Virginia 783 00:41:56,640 --> 00:41:59,359 Speaker 1: where Data Center Alley is, some of these places are, 784 00:41:59,520 --> 00:42:01,319 Speaker 1: like, some of these towns are running out of water. 785 00:42:01,480 --> 00:42:03,359 Speaker 1: Like, they go to turn on their water and water 786 00:42:03,480 --> 00:42:04,640 Speaker 1: doesn't come out because of this. 787 00:42:04,920 --> 00:42:07,480 Speaker 2: Well, plus also, like we talked about how gamers are 788 00:42:07,520 --> 00:42:11,560 Speaker 2: getting the short end of the stick when it comes 789 00:42:11,600 --> 00:42:14,880 Speaker 2: to buying graphics cards because they are in such high demand, 790 00:42:15,200 --> 00:42:18,120 Speaker 2: the same thing happens with electricity.
So in addition to this 791 00:42:18,320 --> 00:42:20,600 Speaker 2: data center coming to town and using up all your water, 792 00:42:20,920 --> 00:42:24,239 Speaker 2: they're also jacking up your electricity prices, because there's only 793 00:42:24,360 --> 00:42:28,360 Speaker 2: so much that your local electrical company can produce. So 794 00:42:28,480 --> 00:42:31,399 Speaker 2: because of supply and demand, your price is going to rise, 795 00:42:31,440 --> 00:42:35,640 Speaker 2: and I guess around Data Center Alley in Northern Virginia, 796 00:42:36,000 --> 00:42:39,600 Speaker 2: electricity prices have increased two hundred and sixty-seven percent 797 00:42:40,280 --> 00:42:45,359 Speaker 2: since twenty twenty. And that also is affecting Maryland, which 798 00:42:45,440 --> 00:42:48,280 Speaker 2: is getting little to no benefit from Data Center Alley 799 00:42:48,440 --> 00:42:50,960 Speaker 2: and is just helping pay the price for it. This 800 00:42:51,120 --> 00:42:56,359 Speaker 2: is subsidization of these data centers, like they are subsidized 801 00:42:56,400 --> 00:42:58,719 Speaker 2: in just about every single way you can imagine. 802 00:42:59,239 --> 00:43:02,239 Speaker 1: Yeah, for sure. And if you say, like, oh, well sure, 803 00:43:02,280 --> 00:43:04,520 Speaker 1: but they create jobs, right, so that's great for the 804 00:43:04,560 --> 00:43:09,720 Speaker 1: local economy. Kyle gives an example here of Northumberland, England. 805 00:43:10,120 --> 00:43:13,640 Speaker 1: There's a ten billion pound data center there, or I 806 00:43:13,680 --> 00:43:16,719 Speaker 1: guess it's coming, and you'd think, oh, great, that's 807 00:43:16,760 --> 00:43:19,560 Speaker 1: going to employ probably like five thousand people, right? It's 808 00:43:19,600 --> 00:43:22,240 Speaker 1: going to employ four hundred people with full-time jobs.
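For readers who want to see what a two hundred and sixty-seven percent increase means in practice, here is a quick sketch; the one-hundred-dollar baseline bill is a made-up number for illustration, not a figure from the episode:

```python
# Illustrative only: apply a percentage increase to a hypothetical baseline bill.
def apply_percent_increase(baseline: float, increase_pct: float) -> float:
    """Return the new amount after increasing `baseline` by `increase_pct` percent."""
    return baseline * (1 + increase_pct / 100)

# A hypothetical $100/month bill in 2020, after a 267% increase:
new_bill = apply_percent_increase(100.0, 267)
print(round(new_bill))  # a $100 bill becomes roughly $367
```

Note that a "267% increase" means the new price is 367% of the old one, not 267% of it; that distinction is why the resulting bills feel so much steeper than the headline number suggests.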
809 00:43:22,600 --> 00:43:25,919 Speaker 2: Yeah, a ten billion dollar, or ten billion pound, data 810 00:43:25,960 --> 00:43:29,120 Speaker 2: center: four hundred jobs. Because these things are so efficient 811 00:43:29,520 --> 00:43:33,040 Speaker 2: and everything is just so advanced, they don't really need 812 00:43:33,120 --> 00:43:36,359 Speaker 2: that many people to keep an eye on it. Right. Plus also, 813 00:43:37,400 --> 00:43:40,000 Speaker 2: the money from that data center, they're not going 814 00:43:40,040 --> 00:43:42,439 Speaker 2: to spread it around the UK; it's going to flow 815 00:43:42,680 --> 00:43:45,200 Speaker 2: right back to the US, to the parent company. 816 00:43:46,200 --> 00:43:49,359 Speaker 1: Oh yeah, for sure. And you know, we also didn't 817 00:43:49,440 --> 00:43:52,960 Speaker 1: point out that a lot of these energy grids 818 00:43:53,400 --> 00:43:56,319 Speaker 1: are literally going to buckle under pressure at some point, 819 00:43:56,440 --> 00:43:57,480 Speaker 1: like they're not built for this. 820 00:43:57,840 --> 00:44:00,720 Speaker 2: Yes, and, I know it sounds 821 00:44:00,760 --> 00:44:03,800 Speaker 2: like we're just like, this and that, how terrible 822 00:44:03,840 --> 00:44:08,360 Speaker 2: are data centers. They're incredibly important and 823 00:44:08,520 --> 00:44:12,759 Speaker 2: they support an amazing array of really great stuff, right? 824 00:44:13,160 --> 00:44:17,520 Speaker 2: And they are the foundation that the next expansion 825 00:44:17,680 --> 00:44:20,879 Speaker 2: of the digital economy and the world culture are going 826 00:44:20,960 --> 00:44:23,759 Speaker 2: to grow on. Like, they're incredibly important, but they have 827 00:44:24,080 --> 00:44:27,000 Speaker 2: a lot of problems with them that need to be addressed.
828 00:44:27,280 --> 00:44:31,160 Speaker 2: They're not being addressed, because every government, from like the 829 00:44:31,320 --> 00:44:36,120 Speaker 2: local city council up to the leaders of the free world, 830 00:44:37,560 --> 00:44:41,200 Speaker 2: are just giving these people whatever they want. That's 831 00:44:41,280 --> 00:44:45,279 Speaker 2: what's going on now. There are no checks going on at all 832 00:44:45,440 --> 00:44:46,880 Speaker 2: right now. That's the problem. 833 00:44:47,520 --> 00:44:50,120 Speaker 1: Yeah, and that's because the flow of money is 834 00:44:50,239 --> 00:44:54,440 Speaker 1: so great at this point to a certain segment of 835 00:44:54,480 --> 00:45:00,120 Speaker 1: the population only. They're protecting their own investment, you know, 836 00:45:00,200 --> 00:45:01,719 Speaker 1: they're watching their own backsides. 837 00:45:01,840 --> 00:45:04,040 Speaker 2: That's definitely, I would say, ninety-nine percent of it. 838 00:45:04,440 --> 00:45:06,840 Speaker 2: But I think there's also, Chuck, a little factor of, 839 00:45:06,960 --> 00:45:11,920 Speaker 2: like, gee whiz. Like, these titans of the AI 840 00:45:12,120 --> 00:45:17,680 Speaker 2: industry are good at, like, razzle-dazzling elected officials into 841 00:45:17,760 --> 00:45:20,200 Speaker 2: doing whatever they want by, I think, making them feel 842 00:45:20,200 --> 00:45:24,920 Speaker 2: included in this new frontier, essentially. I think there's a 843 00:45:25,080 --> 00:45:25,960 Speaker 2: certain element of that. 844 00:45:26,880 --> 00:45:30,080 Speaker 1: I think you're probably right. It's, hey, maybe it'll all 845 00:45:30,120 --> 00:45:30,680 Speaker 1: work out great. 846 00:45:30,840 --> 00:45:35,080 Speaker 2: Sure, it probably will. It usually does. Astoundingly, it usually 847 00:45:35,200 --> 00:45:37,239 Speaker 2: does work out well. 848 00:45:37,520 --> 00:45:39,440 Speaker 1: True, as far as the world hasn't ended.
849 00:45:39,760 --> 00:45:43,480 Speaker 2: That's exactly what I mean. Yeah, yeah, yeah. So I 850 00:45:43,520 --> 00:45:45,359 Speaker 2: think that's it. We said yeah like four or five 851 00:45:45,440 --> 00:45:48,560 Speaker 2: times in succession. I think we accidentally triggered Listener Mail. 852 00:45:51,280 --> 00:45:54,960 Speaker 1: That's right. This relates to our History of the BBC episode, 853 00:45:55,120 --> 00:45:58,600 Speaker 1: and this is from Erica, and Erica says, hey, guys, 854 00:45:59,000 --> 00:46:01,320 Speaker 1: I really loved the episode, and it left me reflecting on 855 00:46:01,400 --> 00:46:03,320 Speaker 1: how I've come to understand the country through both the 856 00:46:03,400 --> 00:46:06,800 Speaker 1: content the BBC produces and people's reactions to the BBC. 857 00:46:07,000 --> 00:46:09,720 Speaker 1: But more recently, my work as an academic has enabled 858 00:46:09,760 --> 00:46:12,040 Speaker 1: me to be involved in creating programs for the BBC 859 00:46:12,600 --> 00:46:15,520 Speaker 1: across TV, radio and online. Because there's one awesome fact 860 00:46:16,000 --> 00:46:18,960 Speaker 1: about the BBC that wasn't included: for over fifty years, 861 00:46:19,000 --> 00:46:22,239 Speaker 1: the BBC has partnered with the Open University, OU, which 862 00:46:22,239 --> 00:46:26,680 Speaker 1: specializes in accessible and distance education. The partnership started in 863 00:46:26,719 --> 00:46:30,359 Speaker 1: the nineteen seventies to provide learning at scale, including facilitating 864 00:46:30,480 --> 00:46:33,360 Speaker 1: university-level lectures at night on public television. Today, the 865 00:46:33,400 --> 00:46:38,279 Speaker 1: partnership facilitates access to academic consultants to co-produce high 866 00:46:38,360 --> 00:46:43,080 Speaker 1: quality, informative content across platforms, including some of the 867 00:46:43,400 --> 00:46:44,600 Speaker 1: David Attenborough.
868 00:46:44,400 --> 00:46:45,319 Speaker 2: Nature stuff. Nice. 869 00:46:46,320 --> 00:46:50,560 Speaker 1: Additionally, the Open University creates supplementary materials to enable people 870 00:46:50,600 --> 00:46:55,080 Speaker 1: to continue their learning journey and explore topics in more detail. So, 871 00:46:55,200 --> 00:46:57,640 Speaker 1: whether viewers or listeners realize it or not, this partnership 872 00:46:57,760 --> 00:47:01,000 Speaker 1: enables the public to benefit from specialist knowledge in 873 00:47:01,120 --> 00:47:06,200 Speaker 1: accessible ways. And that is from Erica from the Open University, 874 00:47:06,280 --> 00:47:08,399 Speaker 1: who is a professor of medical anthropology. 875 00:47:08,640 --> 00:47:11,640 Speaker 2: Oh wow, that's awesome, Erica. You've got to send 876 00:47:11,719 --> 00:47:15,479 Speaker 2: us some topic ideas, too, totally right up your alley, 877 00:47:15,600 --> 00:47:18,600 Speaker 2: and congratulations. That's pretty neat, making stuff in conjunction with 878 00:47:18,680 --> 00:47:21,399 Speaker 2: the BBC. That's gotta be a neat high-water mark, 879 00:47:21,640 --> 00:47:25,000 Speaker 2: you know, agreed. And I think, Chuck, I'm curious to 880 00:47:25,080 --> 00:47:28,440 Speaker 2: see if we go look at our account, we'll see 881 00:47:28,440 --> 00:47:31,920 Speaker 2: a little line item from the Open University and one from the BBC. 882 00:47:33,520 --> 00:47:35,360 Speaker 1: Well, it would be like seven pounds or something. I 883 00:47:35,400 --> 00:47:36,600 Speaker 1: don't know the exchange rate, right. 884 00:47:36,600 --> 00:47:40,279 Speaker 2: That sounds about right. All right, great, well, thanks again, Erica, 885 00:47:40,320 --> 00:47:43,480 Speaker 2: and please do send us some medical anthropology ideas, because 886 00:47:43,520 --> 00:47:45,960 Speaker 2: that just sounds like it'll knock our socks off.
And 887 00:47:46,040 --> 00:47:47,719 Speaker 2: if you want to be like Erica and try to 888 00:47:47,800 --> 00:47:51,520 Speaker 2: knock our socks off, good luck, you can send it 889 00:47:51,640 --> 00:47:55,279 Speaker 2: off to us at stuff podcast at iHeartRadio dot com. 890 00:47:58,480 --> 00:48:01,320 Speaker 2: Stuff You Should Know is a production of iHeartRadio. For 891 00:48:01,480 --> 00:48:05,600 Speaker 2: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 892 00:48:05,760 --> 00:48:07,560 Speaker 2: or wherever you listen to your favorite shows.