1 00:00:00,040 --> 00:00:02,560 Speaker 1: Hi. It's Oz Woloshyn here and Cara Price, and we're 2 00:00:02,560 --> 00:00:04,800 Speaker 1: taking some time off for the holidays. We'll be back 3 00:00:04,800 --> 00:00:07,960 Speaker 1: with new episodes starting in January. In the meantime, instead 4 00:00:07,960 --> 00:00:10,680 Speaker 1: of leaving this feed empty, we wanted to share one 5 00:00:10,680 --> 00:00:13,600 Speaker 1: of my favorite episodes from last year. This week, we're 6 00:00:13,640 --> 00:00:17,240 Speaker 1: re-airing my conversation with Stephen Witt from November. He's an 7 00:00:17,239 --> 00:00:20,759 Speaker 1: author and frequent contributor to The New Yorker who wrote 8 00:00:20,800 --> 00:00:23,400 Speaker 1: the book on one of the biggest companies in the world. 9 00:00:23,680 --> 00:00:26,760 Speaker 1: You may have heard of it: Nvidia. In this episode, 10 00:00:26,840 --> 00:00:30,160 Speaker 1: we hear how the CEO, Jensen Huang, went from working 11 00:00:30,160 --> 00:00:34,839 Speaker 1: at Denny's to being the world leader in manufacturing AI chips. 12 00:00:35,159 --> 00:00:51,199 Speaker 1: Hope you enjoy it, and thanks for listening. Welcome to 13 00:00:51,200 --> 00:00:55,080 Speaker 1: Tech Stuff. I'm Oz Woloshyn, here with Cara Price. Hey, Cara. Hi, 14 00:00:55,200 --> 00:00:58,600 Speaker 1: Oz. So, years ago, around the time that we were 15 00:00:58,600 --> 00:01:03,960 Speaker 1: reporting our first podcast together on the forthcoming AI revolution, 16 00:01:04,240 --> 00:01:08,920 Speaker 1: now no longer forthcoming, yeah, you invested in Nvidia, 17 00:01:09,560 --> 00:01:14,320 Speaker 1: which is up over one hundred x since then. Congratulations. 18 00:01:14,520 --> 00:01:17,080 Speaker 2: You know, I just felt when we reported on it 19 00:01:17,120 --> 00:01:18,959 Speaker 2: that it was going to be the future, and so 20 00:01:19,120 --> 00:01:22,600 Speaker 2: I did invest in it, and, you know, very happily. 21 00:01:22,680 --> 00:01:25,280 Speaker 1: So now, what was it? Was there something that tipped 22 00:01:25,280 --> 00:01:28,080 Speaker 1: you over the edge back then to do it? 23 00:01:28,120 --> 00:01:30,000 Speaker 2: Well, I was just sort of thinking to myself, this is 24 00:01:30,040 --> 00:01:33,119 Speaker 2: the thing that's going to power everything. So obviously it's 25 00:01:33,200 --> 00:01:36,319 Speaker 2: something that people are going to be paying attention to 26 00:01:36,400 --> 00:01:38,200 Speaker 2: and investors are going to be paying attention to, and 27 00:01:38,240 --> 00:01:40,640 Speaker 2: it just made a lot of sense to me 28 00:01:40,680 --> 00:01:41,119 Speaker 2: at the time. 29 00:01:41,720 --> 00:01:45,360 Speaker 1: We know Warren Buffett is retiring this year; there's a 30 00:01:45,400 --> 00:01:50,520 Speaker 1: slot open. Nvidia is, of course, the most valuable 31 00:01:50,520 --> 00:01:53,520 Speaker 1: company in the world today, recently topping five trillion dollars 32 00:01:53,560 --> 00:01:55,880 Speaker 1: in value, and of course a lot of people are 33 00:01:55,880 --> 00:01:58,960 Speaker 1: crying bubble, not just for Nvidia but for the AI 34 00:01:59,040 --> 00:02:01,520 Speaker 1: industry as a whole. Actually, Cara, curiously, you held your stock? 35 00:02:01,760 --> 00:02:05,639 Speaker 2: I held. Look, I needed 36 00:02:05,640 --> 00:02:08,000 Speaker 2: some things, so I sold some, but I still 37 00:02:08,040 --> 00:02:08,520 Speaker 2: have a lot.
38 00:02:08,919 --> 00:02:10,880 Speaker 1: I have enough. And you're gonna hold on? You're not 39 00:02:10,880 --> 00:02:13,080 Speaker 1: worried about the bubble? I think I am gonna hold on. 40 00:02:13,200 --> 00:02:16,560 Speaker 1: But the questions people have are: what if AI doesn't 41 00:02:16,600 --> 00:02:20,440 Speaker 1: get better infinitely as it scales, what if people invent 42 00:02:20,600 --> 00:02:23,680 Speaker 1: new chips that are far more efficient than the Nvidia chips, 43 00:02:24,320 --> 00:02:28,160 Speaker 1: and what if the adoption of AI by other companies 44 00:02:28,520 --> 00:02:31,079 Speaker 1: doesn't give them the results that they hope for financially. 45 00:02:31,680 --> 00:02:33,680 Speaker 1: And so I talked about all of that with somebody 46 00:02:33,720 --> 00:02:37,160 Speaker 1: who knows Nvidia better than anyone else. In fact, he 47 00:02:37,200 --> 00:02:40,360 Speaker 1: literally wrote the book on it, The Thinking Machine: Jensen 48 00:02:40,440 --> 00:02:44,560 Speaker 1: Huang, Nvidia, and the World's Most Coveted Microchip. He 49 00:02:44,600 --> 00:02:46,520 Speaker 1: actually interviewed Huang six times. 50 00:02:47,000 --> 00:02:50,359 Speaker 3: He's moody, and you know, he has what I would 51 00:02:50,480 --> 00:02:53,799 Speaker 3: describe as somewhat self-indulgent performances of anger from time 52 00:02:53,840 --> 00:02:54,240 Speaker 3: to time. 53 00:02:54,680 --> 00:02:58,079 Speaker 1: That's Stephen Witt, and he actually got interested in Nvidia 54 00:02:58,160 --> 00:03:01,280 Speaker 1: shortly after ChatGPT took the world by storm. 55 00:03:01,720 --> 00:03:03,960 Speaker 3: I had been using ChatGPT and I was like, Wow, 56 00:03:04,200 --> 00:03:06,440 Speaker 3: this thing is amazing. This is like twenty twenty two, 57 00:03:06,960 --> 00:03:10,359 Speaker 3: and I am cooked. Like, there's not going to be 58 00:03:10,520 --> 00:03:12,120 Speaker 3: room for me as a writer. This thing can already 59 00:03:12,120 --> 00:03:14,040 Speaker 3: write almost as well as I can, and actually it 60 00:03:14,040 --> 00:03:15,840 Speaker 3: writes better than I did when I was young. 61 00:03:16,400 --> 00:03:19,760 Speaker 1: As Stephen dug around to understand what was powering this technology, 62 00:03:20,240 --> 00:03:22,800 Speaker 1: he got more and more interested in the company building 63 00:03:22,880 --> 00:03:24,440 Speaker 1: its physical infrastructure. 64 00:03:24,960 --> 00:03:26,679 Speaker 3: What brought me to Nvidia was I was trying 65 00:03:26,680 --> 00:03:28,639 Speaker 3: to write about OpenAI and them, and it was 66 00:03:28,680 --> 00:03:30,920 Speaker 3: just too crowded there, a million journalists swarming around. I 67 00:03:30,919 --> 00:03:32,320 Speaker 3: was like, there's got to be some other story here. 68 00:03:32,800 --> 00:03:35,280 Speaker 3: And what I've done as a journalist is look for 69 00:03:35,400 --> 00:03:39,960 Speaker 3: big movements of money that aren't being covered. And I 70 00:03:40,000 --> 00:03:42,480 Speaker 3: looked at Nvidia's stock price, and in my 71 00:03:42,560 --> 00:03:44,800 Speaker 3: mind they were still the gaming company, and I was like, 72 00:03:44,840 --> 00:03:46,600 Speaker 3: what the hell is going on here? This company's worth 73 00:03:46,600 --> 00:03:49,600 Speaker 3: a trillion dollars. And then as I started to investigate it, 74 00:03:50,040 --> 00:03:52,400 Speaker 3: I was like, Oh, wow, they build all the hardware.
75 00:03:52,560 --> 00:03:54,520 Speaker 3: They build all the hardware that makes this stuff go. 76 00:03:55,040 --> 00:03:58,400 Speaker 3: That's fascinating. And then I kind of learned about Jensen 77 00:03:58,520 --> 00:03:59,760 Speaker 3: and I was like, wait, this company has had the 78 00:03:59,800 --> 00:04:02,920 Speaker 3: same CEO through both the gaming and the AI days. 79 00:04:02,920 --> 00:04:05,120 Speaker 3: And not only that, this is the same guy as the founder. 80 00:04:05,200 --> 00:04:06,800 Speaker 3: He's been the CEO for thirty years. It's the same 81 00:04:06,840 --> 00:04:07,520 Speaker 3: guy all along. 82 00:04:08,160 --> 00:04:10,320 Speaker 1: So Stephen wrote the book on Nvidia, but he also 83 00:04:10,320 --> 00:04:12,480 Speaker 1: wrote a great New Yorker piece recently about data 84 00:04:12,520 --> 00:04:15,240 Speaker 1: centers and an essay for The New York Times about, 85 00:04:15,280 --> 00:04:18,719 Speaker 1: quote, the AI prompt that could end the world. So 86 00:04:19,279 --> 00:04:22,200 Speaker 1: he's really a farm-to-table thinker, from chips to 87 00:04:22,360 --> 00:04:25,400 Speaker 1: data centers to AI to the apocalypse. It was a 88 00:04:25,400 --> 00:04:26,120 Speaker 1: fun conversation. 89 00:04:26,839 --> 00:04:28,880 Speaker 2: Yeah, as a writer, he seems to sort of be 90 00:04:28,920 --> 00:04:31,680 Speaker 2: at the center of everything in a way that I 91 00:04:31,720 --> 00:04:34,760 Speaker 2: find very compelling. And I actually want to know more 92 00:04:34,800 --> 00:04:37,280 Speaker 2: about the AI prompt that could end the world. 93 00:04:37,480 --> 00:04:38,600 Speaker 1: Well, in that case, you have to listen to the 94 00:04:38,640 --> 00:04:40,480 Speaker 1: whole interview, because we talk about it right at the end. 95 00:04:41,240 --> 00:04:45,080 Speaker 1: Okay, here's the conversation with Stephen Witt. For the layman, what 96 00:04:45,200 --> 00:04:47,680 Speaker 1: is Nvidia and how did it become the most 97 00:04:47,720 --> 00:04:48,920 Speaker 1: valuable company in the world? 98 00:04:49,440 --> 00:04:52,839 Speaker 3: Nvidia is basically a hardware designer. They make a special 99 00:04:52,960 --> 00:04:56,640 Speaker 3: kind of microchip called a graphics processing unit, and the 100 00:04:56,680 --> 00:04:59,880 Speaker 3: initial purpose of this thing was to just render graphics 101 00:05:00,080 --> 00:05:03,480 Speaker 3: in video games. So if you were a video gamer, 102 00:05:03,520 --> 00:05:05,680 Speaker 3: you knew who this company was because you would actually 103 00:05:05,680 --> 00:05:09,320 Speaker 3: build your whole PC just around this Nvidia card. 104 00:05:09,640 --> 00:05:12,480 Speaker 3: So this was the engine that rendered the graphics on 105 00:05:12,520 --> 00:05:15,280 Speaker 3: your screen. Sometime around two thousand and four to two 106 00:05:15,320 --> 00:05:18,520 Speaker 3: thousand and five, scientists began to notice how powerful these 107 00:05:18,560 --> 00:05:21,560 Speaker 3: cards were, and they started hacking into the cards, like 108 00:05:21,640 --> 00:05:25,400 Speaker 3: hacking into the circuitry to get to those powerful mathematical 109 00:05:25,480 --> 00:05:29,680 Speaker 3: functions inside the microchip. And Jensen Huang saw this and 110 00:05:29,720 --> 00:05:31,760 Speaker 3: he said, wait, this is a whole new market that 111 00:05:31,839 --> 00:05:35,480 Speaker 3: I can pursue.
So he built the software platform that 112 00:05:35,640 --> 00:05:41,080 Speaker 3: turns the graphics card into basically a low-budget supercomputer. 113 00:05:42,080 --> 00:05:45,760 Speaker 3: Now you may ask, who is this for? Well, it's 114 00:05:45,800 --> 00:05:49,359 Speaker 3: not really for established research scientists, because they can usually 115 00:05:49,400 --> 00:05:53,560 Speaker 3: afford time on a conventional supercomputer. It's for scientists who 116 00:05:53,600 --> 00:05:56,400 Speaker 3: are sort of marginalized, who can't afford time on a 117 00:05:56,440 --> 00:05:58,600 Speaker 3: supercomputer and whose research is. 118 00:05:58,560 --> 00:05:59,240 Speaker 1: Out of favor. 119 00:06:00,320 --> 00:06:05,400 Speaker 3: So it's for mad scientists. It's for scientists who are 120 00:06:05,480 --> 00:06:10,400 Speaker 3: pursuing unpopular or weird or kind of offbeat scientific projects. 121 00:06:10,440 --> 00:06:15,120 Speaker 3: But ultimately, the key use case turned out to be AI, 122 00:06:15,760 --> 00:06:19,479 Speaker 3: and specifically a branch of AI that most AI researchers 123 00:06:19,520 --> 00:06:23,800 Speaker 3: thought was crazy, called neural network technology. And what you're 124 00:06:23,800 --> 00:06:26,600 Speaker 3: doing here is you're building software that kind of resembles 125 00:06:26,640 --> 00:06:29,279 Speaker 3: the connections in the human brain. It's inspired by the 126 00:06:29,279 --> 00:06:33,599 Speaker 3: biological brain. Actually, you build a bunch of synthetic neurons 127 00:06:34,000 --> 00:06:36,440 Speaker 3: in a little file, and then you train them by 128 00:06:36,480 --> 00:06:40,320 Speaker 3: repeatedly exposing them to training data. So what this could mean, 129 00:06:40,320 --> 00:06:42,200 Speaker 3: for example, is if we're trying to build a neural network 130 00:06:42,240 --> 00:06:45,800 Speaker 3: to recognize objects, to do computer vision, then we'll show 131 00:06:45,839 --> 00:06:48,600 Speaker 3: it tens of thousands or hundreds of thousands, or ultimately 132 00:06:48,680 --> 00:06:53,200 Speaker 3: millions of images and slowly rewire its neurons until it 133 00:06:53,200 --> 00:06:56,760 Speaker 3: can start to identify things. Now, this had been proposed 134 00:06:56,760 --> 00:06:58,880 Speaker 3: going all the way back to the nineteen forties, but 135 00:06:59,080 --> 00:07:02,279 Speaker 3: nobody had ever been able to get it to work. And 136 00:07:02,320 --> 00:07:06,360 Speaker 3: the missing piece, it turns out, is just raw computing power. 137 00:07:06,720 --> 00:07:09,320 Speaker 3: Geoffrey Hinton, who they call the godfather of AI, 138 00:07:09,600 --> 00:07:11,400 Speaker 3: he said, you know, the question we never thought to 139 00:07:11,520 --> 00:07:13,880 Speaker 3: ask was, what if we just made it go a 140 00:07:14,000 --> 00:07:18,000 Speaker 3: million times faster? And that's what Nvidia's hardware did. 141 00:07:18,000 --> 00:07:20,920 Speaker 3: It made AI, and these neural networks in particular, train 142 00:07:21,080 --> 00:07:23,400 Speaker 3: and learn a million times faster. 143 00:07:23,720 --> 00:07:26,440 Speaker 1: We actually had Hinton on the podcast earlier this year. 144 00:07:26,760 --> 00:07:29,520 Speaker 1: Was he an early one of these mad scientists whose 145 00:07:29,520 --> 00:07:33,080 Speaker 1: research was unpopular and therefore started buying Nvidia chips? 146 00:07:33,200 --> 00:07:35,640 Speaker 3: Yes, very much so.
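A minimal sketch of the "repeated exposure" training idea Witt describes above, using made-up data: it is an illustration under stated assumptions only, not Hinton's or Krizhevsky's actual code and not anything Nvidia ships, and the dataset and numbers are arbitrary.

```python
# Minimal sketch: one layer of "synthetic neurons" repeatedly exposed to
# labeled examples, its weights nudged each pass until it classifies them.
# Toy data stands in for the millions of images used in real computer vision.
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 200 samples of 64 features each, drawn from two classes.
X = np.vstack([rng.normal(-1, 1, (100, 64)), rng.normal(+1, 1, (100, 64))])
y = np.array([0] * 100 + [1] * 100)

w = np.zeros(64)   # one weight per input feature
b = 0.0            # bias term
lr = 0.1           # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Training = repeated exposure: predict, measure error, rewire the connections.
for epoch in range(100):
    p = sigmoid(X @ w + b)            # forward pass: current predictions
    grad_w = X.T @ (p - y) / len(y)   # how much to adjust each connection
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"accuracy after training: {accuracy:.2f}")
```

The matrix products in that loop (the `X @ w` step) are exactly the arithmetic a graphics card accelerates; scale the toy 64 features and 200 samples up to millions of images and millions of neurons, and the appeal of running it on GPU hardware becomes clear.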
And there was a community of 147 00:07:35,640 --> 00:07:37,800 Speaker 3: these guys. It wasn't just him. There were a number 148 00:07:37,800 --> 00:07:39,600 Speaker 3: of other people doing it, most of whose work has 149 00:07:39,600 --> 00:07:42,440 Speaker 3: now been recognized. But they were very much on the 150 00:07:42,480 --> 00:07:45,559 Speaker 3: margins of computer science. They couldn't get five thousand dollars 151 00:07:45,560 --> 00:07:48,720 Speaker 3: in research funding, but they could get enough money to 152 00:07:48,760 --> 00:07:53,720 Speaker 3: afford two five-hundred-dollar Nvidia retail graphics gaming cards, 153 00:07:53,800 --> 00:07:56,800 Speaker 3: which they did, and Hinton had a graduate student named 154 00:07:56,800 --> 00:08:00,320 Speaker 3: Alex Krizhevsky who was just an ace programmer, and he 155 00:08:00,560 --> 00:08:02,800 Speaker 3: turned the neural net that he ran on these cards 156 00:08:03,080 --> 00:08:06,640 Speaker 3: into something called AlexNet, which then started to recognize 157 00:08:06,680 --> 00:08:10,240 Speaker 3: images better than any AI had ever done before. Like, 158 00:08:10,320 --> 00:08:13,880 Speaker 3: it smashed the paradigm. And so that engineered, around twenty 159 00:08:13,880 --> 00:08:16,960 Speaker 3: twelve or twenty thirteen, a paradigm shift in AI. And 160 00:08:17,080 --> 00:08:21,040 Speaker 3: since then everything that has happened has been a repeated 161 00:08:21,080 --> 00:08:24,560 Speaker 3: application of the thing Alex discovered: that if you took 162 00:08:24,600 --> 00:08:28,160 Speaker 3: neural nets, if you ran them on Nvidia technology, you 163 00:08:28,200 --> 00:08:30,160 Speaker 3: would have a very powerful result. 164 00:08:30,680 --> 00:08:33,640 Speaker 1: So this is all fascinating, but to some it may 165 00:08:33,720 --> 00:08:37,439 Speaker 1: sound a bit geeky and like inside baseball, which is 166 00:08:37,480 --> 00:08:39,959 Speaker 1: why I was very attracted to a quote in your 167 00:08:40,000 --> 00:08:43,360 Speaker 1: recent New Yorker piece which said, if Americans want to 168 00:08:43,360 --> 00:08:47,880 Speaker 1: retire comfortably, Nvidia has to succeed. Yes. And I 169 00:08:47,960 --> 00:08:50,120 Speaker 1: was curious what your conversations with the editors were around that, 170 00:08:50,360 --> 00:08:50,719 Speaker 1: around that. 171 00:08:50,880 --> 00:08:53,640 Speaker 3: Oh no, that's straightforward. That one actually sailed right 172 00:08:53,679 --> 00:08:56,200 Speaker 3: through fact-checking. There were no questions. What has happened 173 00:08:56,240 --> 00:08:58,680 Speaker 3: since Alex invented this thing in his bedroom is that 174 00:08:58,720 --> 00:09:01,520 Speaker 3: we scaled it up from two graphics cards to two 175 00:09:01,640 --> 00:09:04,480 Speaker 3: hundred thousand or more, and we have plans to scale 176 00:09:04,520 --> 00:09:06,920 Speaker 3: them up to two million, then twenty million, right? 177 00:09:07,440 --> 00:09:09,920 Speaker 3: So this is the data center boom which we're going 178 00:09:09,920 --> 00:09:14,000 Speaker 3: through right now. It's a new industrial revolution.
We're basically 179 00:09:14,040 --> 00:09:18,480 Speaker 3: building these giant barns full of Nvidia microchips to 180 00:09:18,679 --> 00:09:22,600 Speaker 3: run calculations to build better AI twenty-four seven, 181 00:09:22,640 --> 00:09:25,240 Speaker 3: around the clock, and it's one of the largest deployments 182 00:09:25,280 --> 00:09:29,280 Speaker 3: of capital in human history. This has made Nvidia 183 00:09:29,640 --> 00:09:34,280 Speaker 3: the most valuable company in the world, and it has 184 00:09:34,360 --> 00:09:38,280 Speaker 3: created a situation where Nvidia stock is more concentrated 185 00:09:38,480 --> 00:09:40,480 Speaker 3: in the S and P five hundred than any stock 186 00:09:40,559 --> 00:09:43,920 Speaker 3: since they started keeping track. And actually Microsoft, who is 187 00:09:43,960 --> 00:09:47,800 Speaker 3: the second biggest, has that valuation largely because they're building 188 00:09:47,840 --> 00:09:49,840 Speaker 3: these sheds and renting out Nvidia equipment. So that's 189 00:09:49,840 --> 00:09:53,600 Speaker 3: linked too. So think about that: fifteen percent of the 190 00:09:53,640 --> 00:09:57,120 Speaker 3: stock market is these two stocks, right? They have to succeed. 191 00:09:57,600 --> 00:10:01,880 Speaker 3: Americans in particular are usually invested massively through index funds 192 00:10:02,160 --> 00:10:04,280 Speaker 3: in something that looks exactly like the S and P 193 00:10:04,400 --> 00:10:08,120 Speaker 3: five hundred, so you know, if Nvidia crashes, it's 194 00:10:08,160 --> 00:10:10,240 Speaker 3: going to create a lot of pain throughout the economy. 195 00:10:10,480 --> 00:10:12,320 Speaker 1: I want to talk about the data centers, but before that, 196 00:10:12,360 --> 00:10:14,760 Speaker 1: I want to talk about the man who founded 197 00:10:14,800 --> 00:10:17,920 Speaker 1: Nvidia and is its CEO today, Jensen Huang. He 198 00:10:17,920 --> 00:10:20,440 Speaker 1: seems to pop up everywhere, but he also seems to 199 00:10:20,480 --> 00:10:23,800 Speaker 1: be more inscrutable. I mean, who is he, and do 200 00:10:23,800 --> 00:10:26,360 Speaker 1: you see him as different from Zuckerberg and Altman and 201 00:10:26,400 --> 00:10:28,000 Speaker 1: Bezos in some significant way? 202 00:10:28,640 --> 00:10:32,400 Speaker 3: Jensen most resembles, of all executives, Elon Musk, because he 203 00:10:32,480 --> 00:10:36,040 Speaker 3: is an engineering wizard. Bezos is smart as hell, and 204 00:10:36,040 --> 00:10:39,040 Speaker 3: so is Zuckerberg. But ultimately they're kind of software guys. You know, 205 00:10:39,080 --> 00:10:41,400 Speaker 3: they're coming at the computer from the keyboard and the terminal. 206 00:10:41,640 --> 00:10:45,040 Speaker 3: Jensen is totally different. He approaches computing from the circuit up. 207 00:10:45,400 --> 00:10:47,960 Speaker 3: He has a degree not in computer science originally, but actually 208 00:10:47,960 --> 00:10:52,319 Speaker 3: in electrical engineering. Okay, so for Jensen, the computer is 209 00:10:52,320 --> 00:10:57,000 Speaker 3: a piece of hardware that runs calculations in a microchip, 210 00:10:57,040 --> 00:11:00,360 Speaker 3: and he literally designed those microchips on paper at the 211 00:11:00,440 --> 00:11:02,640 Speaker 3: beginning of his career, and that's all he's ever done.
212 00:11:03,640 --> 00:11:05,160 Speaker 3: And this is a little bit why, even though he 213 00:11:05,240 --> 00:11:06,959 Speaker 3: runs the most valuable company in the world, it's a 214 00:11:06,960 --> 00:11:10,800 Speaker 3: little baffling to people. Nothing Jensen makes 215 00:11:10,840 --> 00:11:14,120 Speaker 3: is really that accessible. It's all deep inside the computer. 216 00:11:14,400 --> 00:11:15,959 Speaker 1: There's a quote in your piece that I liked, where 217 00:11:16,120 --> 00:11:19,160 Speaker 1: he said, I find that I'm best when I'm under adversity. 218 00:11:19,679 --> 00:11:22,439 Speaker 1: My heart rate actually goes down. Anyone who's dealt with 219 00:11:22,520 --> 00:11:24,640 Speaker 1: a rush in a restaurant knows what I'm talking about. 220 00:11:25,040 --> 00:11:28,040 Speaker 3: Yeah, yeah, I mean, he started out at Denny's, so 221 00:11:28,120 --> 00:11:30,720 Speaker 3: his first job was basically, I think he was a 222 00:11:30,720 --> 00:11:33,480 Speaker 3: busboy at first, and then graduated to dishwasher and 223 00:11:33,520 --> 00:11:36,440 Speaker 3: ultimately became a server. And I was talking to someone 224 00:11:36,440 --> 00:11:38,800 Speaker 3: in the company and she's like, you know what, Jensen 225 00:11:38,880 --> 00:11:42,040 Speaker 3: is actually a lot calmer and more compassionate with his 226 00:11:42,080 --> 00:11:45,520 Speaker 3: employees when things are going wrong. It's when the company's 227 00:11:45,600 --> 00:11:47,400 Speaker 3: stock price is way up and it looks like everything's 228 00:11:47,400 --> 00:11:50,400 Speaker 3: going great that he really becomes much more cruel, like 229 00:11:50,480 --> 00:11:54,200 Speaker 3: much, much meaner to everybody. So he is actually, in 230 00:11:54,240 --> 00:11:57,079 Speaker 3: some ways, a nicer person when things are going wrong. 231 00:11:57,120 --> 00:11:58,720 Speaker 3: When he succeeds, it makes him nervous. 232 00:11:59,080 --> 00:12:01,760 Speaker 1: One of his colleagues described working with him as kind 233 00:12:01,760 --> 00:12:05,760 Speaker 1: of like sticking your finger in the electric socket. That's 234 00:12:05,840 --> 00:12:06,600 Speaker 1: quite the metaphor. 235 00:12:06,960 --> 00:12:09,679 Speaker 3: It's one hundred percent accurate. I mean, I've interacted with Jensen. 236 00:12:09,720 --> 00:12:11,600 Speaker 3: It is like sticking your finger in the electric socket. 237 00:12:11,640 --> 00:12:15,960 Speaker 3: He's so tightly wound. He expects so much to happen 238 00:12:16,400 --> 00:12:19,640 Speaker 3: in every conversation. Just to even start talking to him, 239 00:12:19,640 --> 00:12:21,120 Speaker 3: you have to be totally up to speed. He's not 240 00:12:21,160 --> 00:12:23,000 Speaker 3: gonna waste any time. He's not going to suffer fools. 241 00:12:23,520 --> 00:12:26,160 Speaker 3: And he's also really intense and unpredictable, and you just 242 00:12:26,200 --> 00:12:28,079 Speaker 3: don't know where he's going to go in any conversation. 243 00:12:28,640 --> 00:12:31,720 Speaker 3: And you know, he has what I would describe as 244 00:12:31,760 --> 00:12:34,840 Speaker 3: somewhat self-indulgent performances of anger from time to time. 245 00:12:35,040 --> 00:12:37,199 Speaker 3: And that's especially true if you're one of his executives.
246 00:12:37,360 --> 00:12:39,680 Speaker 3: If you're not delivering, he's going to stand you up 247 00:12:39,720 --> 00:12:41,240 Speaker 3: in front of an audience of people and just start 248 00:12:41,400 --> 00:12:45,679 Speaker 3: screaming at you. But really, I mean, yelling, and it's 249 00:12:45,679 --> 00:12:47,559 Speaker 3: not fun, and he will humiliate you in front of 250 00:12:47,559 --> 00:12:50,000 Speaker 3: an audience. I think people at Nvidia have to develop 251 00:12:50,080 --> 00:12:52,240 Speaker 3: very thick skins. He actually did this to me at 252 00:12:52,240 --> 00:12:56,960 Speaker 3: one point, so I kind of know exactly. Oh yeah, yeah. Well, 253 00:12:57,200 --> 00:12:59,840 Speaker 3: I kept asking him about the future. Jensen does not 254 00:12:59,920 --> 00:13:03,520 Speaker 3: like to speculate. He doesn't actually have a science fiction 255 00:13:03,640 --> 00:13:05,080 Speaker 3: vision of what the future is going to look like. 256 00:13:05,559 --> 00:13:08,679 Speaker 3: He has a data-driven vision, from engineering principles, of 257 00:13:08,720 --> 00:13:10,920 Speaker 3: where he thinks technology is going to go. But if 258 00:13:10,920 --> 00:13:13,679 Speaker 3: he can't see beyond that, he won't speculate. But I 259 00:13:14,240 --> 00:13:16,040 Speaker 3: noticed that other people at this firm would talk about it, 260 00:13:16,080 --> 00:13:19,319 Speaker 3: and I really wanted to get into his imagination, I 261 00:13:19,320 --> 00:13:21,240 Speaker 3: guess I would say, of where he thinks all this 262 00:13:21,320 --> 00:13:23,840 Speaker 3: can go. So I presented him with a clip from 263 00:13:23,960 --> 00:13:26,760 Speaker 3: Arthur C. Clarke discussing the future of computers, and this 264 00:13:26,800 --> 00:13:29,040 Speaker 3: is back from nineteen sixty four, but it was kind 265 00:13:29,040 --> 00:13:31,640 Speaker 3: of anticipating the current reality we were in, where we 266 00:13:31,679 --> 00:13:34,679 Speaker 3: would start training mechanical brains and those brains would train 267 00:13:34,760 --> 00:13:38,679 Speaker 3: faster than biological brains and eventually would supersede biological brains. 268 00:13:38,679 --> 00:13:40,080 Speaker 3: And so I'd shown this clip to some other people 269 00:13:40,080 --> 00:13:42,320 Speaker 3: at Nvidia, and they'd gotten very... they kind of 270 00:13:42,360 --> 00:13:45,800 Speaker 3: like swelled up and started giving these grand soliloquies about 271 00:13:45,800 --> 00:13:47,880 Speaker 3: the future that were like very beautiful and articulate. And 272 00:13:48,120 --> 00:13:51,480 Speaker 3: I was hoping to get that response from Jensen. Instead, 273 00:13:51,520 --> 00:13:53,720 Speaker 3: he just starts screaming at me about how stupid 274 00:13:53,720 --> 00:13:55,720 Speaker 3: the clip was, how he didn't give a shit about 275 00:13:55,800 --> 00:13:57,360 Speaker 3: Arthur C. Clarke, he never read one of his books, 276 00:13:57,400 --> 00:13:59,640 Speaker 3: he didn't read science fiction, and he thought the whole 277 00:13:59,679 --> 00:14:02,440 Speaker 3: line of questioning was pedestrian, and that I was letting 278 00:14:02,520 --> 00:14:05,040 Speaker 3: him down by asking it, that I was wasting his time.
Despite 279 00:14:05,080 --> 00:14:07,960 Speaker 3: having written his biography, Jensen remains a little bit of 280 00:14:08,000 --> 00:14:10,160 Speaker 3: a puzzle, in that I cannot tell you what's 281 00:14:10,160 --> 00:14:12,840 Speaker 3: going on inside his brain. But I will 282 00:14:12,840 --> 00:14:17,720 Speaker 3: say this: he's extremely neurotic, by which I mean, I 283 00:14:17,720 --> 00:14:19,480 Speaker 3: don't even mean this in a clinical sense. I just 284 00:14:19,480 --> 00:14:21,680 Speaker 3: mean that, by his own admission, he's totally driven by 285 00:14:22,080 --> 00:14:25,880 Speaker 3: negative emotions. So even though he's on top of the world, 286 00:14:26,320 --> 00:14:28,800 Speaker 3: I think his mind is telling him constantly, you're going 287 00:14:28,880 --> 00:14:31,600 Speaker 3: to fail, this is a temporary thing, Nvidia is 288 00:14:31,600 --> 00:14:34,000 Speaker 3: going to go back down again. You know, twice in 289 00:14:34,040 --> 00:14:37,160 Speaker 3: his tenure as CEO, Nvidia's stock price has retreated 290 00:14:37,160 --> 00:14:38,280 Speaker 3: by almost ninety percent. 291 00:14:38,560 --> 00:14:40,520 Speaker 1: What could make it happen now? What keeps him up 292 00:14:40,560 --> 00:14:42,520 Speaker 1: at night today? What could happen today? 293 00:14:42,520 --> 00:14:45,320 Speaker 3: Anything. Any number of things. This would not be comprehensive, 294 00:14:45,320 --> 00:14:48,400 Speaker 3: but there's three big risks. The first is just competition. 295 00:14:49,080 --> 00:14:52,960 Speaker 3: Nvidia is making so much money and everyone's seeing that, 296 00:14:53,080 --> 00:14:55,840 Speaker 3: and this attracts competition in the same manner that chum 297 00:14:55,920 --> 00:14:58,840 Speaker 3: attracts sharks. Right? It's like throwing blood in the water 298 00:14:58,920 --> 00:15:01,680 Speaker 3: for other microchip designers to earn a seventy percent, eighty 299 00:15:01,680 --> 00:15:03,880 Speaker 3: percent gross margin, which is what they do on 300 00:15:03,760 --> 00:15:04,560 Speaker 1: some of these chips. 301 00:15:04,960 --> 00:15:07,920 Speaker 3: So Google has built a whole alternative stack for AI 302 00:15:08,000 --> 00:15:10,320 Speaker 3: computing around their own, their own kind of platform, and 303 00:15:10,320 --> 00:15:12,360 Speaker 3: they're starting to lease that out to new customers. That's 304 00:15:12,400 --> 00:15:15,160 Speaker 3: a big risk. There's a big risk that Chinese companies 305 00:15:15,320 --> 00:15:19,840 Speaker 3: build alternative, cheaper stacks to what Nvidia does. Intel had 306 00:15:20,040 --> 00:15:22,600 Speaker 3: ninety, ninety-five percent of the CPU market at one 307 00:15:22,600 --> 00:15:26,560 Speaker 3: point in this country. Now they're falling apart. Conquering one 308 00:15:27,000 --> 00:15:30,440 Speaker 3: cycle in microchips is no guarantee that you will conquer 309 00:15:30,480 --> 00:15:33,920 Speaker 3: the next one, and history demonstrates that quite clearly. So 310 00:15:33,960 --> 00:15:38,640 Speaker 3: that could happen. Second: basically, what happens in the data 311 00:15:38,680 --> 00:15:42,880 Speaker 3: center is we're doing a mathematical operation called a matrix multiplication, 312 00:15:43,680 --> 00:15:48,400 Speaker 3: and it's extremely computationally expensive to do this.
So without 313 00:15:48,440 --> 00:15:51,640 Speaker 3: getting too technical, basically, to train an AI right now, 314 00:15:52,000 --> 00:15:57,160 Speaker 3: we have to do ten trillion trillion individual computations, which 315 00:15:57,200 --> 00:16:01,400 Speaker 3: is more than the number of observable stars in the universe. However, 316 00:16:01,760 --> 00:16:05,120 Speaker 3: maybe it's possible that we find some more efficient way 317 00:16:05,160 --> 00:16:07,640 Speaker 3: of doing that. Maybe there's a way that requires only 318 00:16:07,920 --> 00:16:10,800 Speaker 3: ten billion trillion or even ten hundred trillion, right? 319 00:16:10,920 --> 00:16:13,120 Speaker 3: Nvidia's stock price would go down because we wouldn't have 320 00:16:13,160 --> 00:16:14,680 Speaker 3: to build so many data centers. Right? We'd have a 321 00:16:14,680 --> 00:16:17,200 Speaker 3: more efficient training solution. All of this is a more 322 00:16:17,240 --> 00:16:20,600 Speaker 3: complex way of saying, maybe there's a technological solution where 323 00:16:20,600 --> 00:16:23,120 Speaker 3: we... you know, right now, we're brute-forcing our way 324 00:16:23,160 --> 00:16:26,320 Speaker 3: to AI. It's a heavy industrial problem. We're talking about 325 00:16:26,400 --> 00:16:30,480 Speaker 3: building nuclear power plants to bring these things online. I 326 00:16:31,120 --> 00:16:34,920 Speaker 3: think maybe it's possible that there's a technological solution that 327 00:16:35,000 --> 00:16:37,920 Speaker 3: trains these things faster, and if we discovered it, we 328 00:16:37,960 --> 00:16:40,080 Speaker 3: wouldn't have to buy so many Nvidia microchips. That 329 00:16:40,080 --> 00:16:42,960 Speaker 3: would also make their stock price go down. But the 330 00:16:43,000 --> 00:16:47,440 Speaker 3: third thing is, basically, right now, for the last thirteen 331 00:16:47,520 --> 00:16:52,280 Speaker 3: or fourteen years, the more microchips we stuff into the barn, okay, 332 00:16:52,320 --> 00:16:55,440 Speaker 3: the more microchips we throw at this problem, the better 333 00:16:55,160 --> 00:16:58,840 Speaker 1: AI we get. This is the scaling law, law in quotes. 334 00:16:59,320 --> 00:17:01,960 Speaker 3: Okay, it's not a law in the universe that this 335 00:17:02,040 --> 00:17:05,880 Speaker 3: has to happen. It's not some immutable, physically proven thing 336 00:17:06,320 --> 00:17:09,160 Speaker 3: from first principles of physics that the more microchips we have, 337 00:17:09,480 --> 00:17:11,520 Speaker 3: the better AI we have. In fact, no one is 338 00:17:11,640 --> 00:17:16,640 Speaker 3: entirely sure why this works. Presumably, like most other forces 339 00:17:16,680 --> 00:17:20,159 Speaker 3: in the universe, this will hit some kind of S-curve. 340 00:17:20,160 --> 00:17:22,840 Speaker 3: It'll start to plateau or level off at some point. 341 00:17:23,080 --> 00:17:25,760 Speaker 3: We're not there yet. But if we did hit a plateau, 342 00:17:26,320 --> 00:17:30,239 Speaker 3: if stuffing more microchips into the barn only resulted in 343 00:17:30,520 --> 00:17:33,720 Speaker 3: marginally better AI or didn't improve it at all, I 344 00:17:33,720 --> 00:17:35,560 Speaker 3: think Nvidia's stock price will go down a lot, 345 00:17:35,920 --> 00:17:38,040 Speaker 3: and I think it would make this whole era 346 00:17:38,119 --> 00:17:40,080 Speaker 3: look kind of like a bubble if that were to happen.
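A rough back-of-the-envelope version of the brute-force arithmetic described above: take the interview's "ten trillion trillion" operations (1e25) at face value and divide by how fast the chips can go. The per-chip throughput and chip counts below are illustrative round-number assumptions made for this sketch, not figures from the episode and not specs of any real Nvidia part.

```python
# Back-of-the-envelope: why training time only becomes practical when you
# fill the barn with chips, and why a more efficient algorithm (fewer total
# operations) would cut the number of chips anyone needs to buy.
TOTAL_OPS = 1e25              # "ten trillion trillion" operations, per the interview
OPS_PER_CHIP_PER_SEC = 1e15   # assumed sustained throughput of one accelerator

SECONDS_PER_DAY = 86_400

for chips in (2, 10_000, 200_000):
    seconds = TOTAL_OPS / (chips * OPS_PER_CHIP_PER_SEC)
    print(f"{chips:>7} chips -> {seconds / SECONDS_PER_DAY:,.1f} days of nonstop compute")
```

Under these assumptions, two cards would take on the order of a century, while two hundred thousand take well under a day, which is the economic logic behind the data center boom, and also why a cheaper training recipe or a plateau in the scaling curve would undercut it.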
347 00:17:40,600 --> 00:17:43,320 Speaker 1: Now, is this why Nvidia has kind of become 348 00:17:44,000 --> 00:17:47,400 Speaker 1: the bank of the AI revolution, in a sense? They're 349 00:17:48,200 --> 00:17:51,760 Speaker 1: wanting to lend money and lock other companies into the 350 00:17:51,800 --> 00:17:56,360 Speaker 1: current paradigm of AI, maybe even hoping to defensively prevent 351 00:17:56,520 --> 00:18:00,399 Speaker 1: other more economical approaches from emerging, and consolidating Nvidia's position. 352 00:18:00,440 --> 00:18:01,800 Speaker 1: I mean, how much of a chess game is this 353 00:18:01,880 --> 00:18:03,879 Speaker 1: in terms of thinking about the future of computing, for 354 00:18:03,960 --> 00:18:04,359 Speaker 1: Jensen and others? 355 00:18:04,400 --> 00:18:08,119 Speaker 3: Oh yeah, it's chess, and Jensen is an expert 356 00:18:08,200 --> 00:18:10,720 Speaker 3: chess player at this kind of chess. He's really good 357 00:18:10,800 --> 00:18:14,359 Speaker 3: at thinking about the competitive positioning of where he is 358 00:18:14,400 --> 00:18:16,840 Speaker 3: and where other people are. You know, in Nvidia's early days, 359 00:18:16,920 --> 00:18:19,600 Speaker 3: the GPU market, back in the video game days, was 360 00:18:19,640 --> 00:18:22,080 Speaker 3: really crowded. At one point there were fifty or sixty 361 00:18:22,080 --> 00:18:24,959 Speaker 3: participants in this market. I talked to David Kirk, who 362 00:18:25,040 --> 00:18:28,240 Speaker 3: was the chief scientist at Nvidia during this time. Jensen 363 00:18:28,240 --> 00:18:30,359 Speaker 3: would go into his office, and on a whiteboard he 364 00:18:30,359 --> 00:18:32,960 Speaker 3: would have a list of all his competitors up there. 365 00:18:33,680 --> 00:18:35,119 Speaker 3: And not only that, they would have a list of 366 00:18:35,240 --> 00:18:39,600 Speaker 3: who the best engineers working at those competitors were, and 367 00:18:39,640 --> 00:18:42,280 Speaker 3: then they would come up with plans to poach those 368 00:18:42,320 --> 00:18:45,040 Speaker 3: engineers and get them to come work for Nvidia, 369 00:18:45,280 --> 00:18:47,320 Speaker 3: so that they would drain the brain power of their 370 00:18:47,320 --> 00:18:51,520 Speaker 3: competitors and force them to collapse. I've compared the early 371 00:18:51,560 --> 00:18:54,000 Speaker 3: graphics days to the movie Battle Royale, where all the 372 00:18:54,040 --> 00:18:55,240 Speaker 3: kids are on the island and they have to kill 373 00:18:55,240 --> 00:18:57,359 Speaker 3: each other. It was like that: there were like forty 374 00:18:57,359 --> 00:18:59,119 Speaker 3: competitors and only one could survive. 375 00:19:00,000 --> 00:19:00,560 Speaker 1: He won? 376 00:19:01,320 --> 00:19:03,719 Speaker 3: He won the Battle Royale. He was the last guy standing. 377 00:19:03,760 --> 00:19:05,960 Speaker 3: I mean, he won the knife fight. So he is 378 00:19:06,200 --> 00:19:11,879 Speaker 3: unbelievably ruthless and unbelievably good at identifying where the competition 379 00:19:12,040 --> 00:19:14,160 Speaker 3: is and what he could do, not just to beat 380 00:19:14,160 --> 00:19:16,119 Speaker 3: them in the marketplace, but actually to hollow out their 381 00:19:16,160 --> 00:19:17,000 Speaker 3: engineering talent. 382 00:19:18,320 --> 00:19:22,000 Speaker 1: Who are Nvidia's biggest customers and what are they 383 00:19:22,040 --> 00:19:23,080 Speaker 1: buying the chips for?
384 00:19:23,880 --> 00:19:27,160 Speaker 3: Okay, it's a bit complex. The biggest customers, they don't 385 00:19:27,160 --> 00:19:31,440 Speaker 3: disclose it. Almost certainly it's Microsoft and then probably Amazon. 386 00:19:31,760 --> 00:19:34,720 Speaker 3: What these companies do is they train some AI on 387 00:19:34,760 --> 00:19:37,040 Speaker 3: their own, but what they're really doing is building. They're 388 00:19:37,040 --> 00:19:39,200 Speaker 3: the ones building the sheds, they're the ones building the 389 00:19:39,280 --> 00:19:42,040 Speaker 3: data centers. So Nvidia sells them the microchips, 390 00:19:42,080 --> 00:19:44,840 Speaker 3: and then kind of the ultimate end user is a 391 00:19:44,880 --> 00:19:48,760 Speaker 3: frontier AI lab, so that could be something like Anthropic 392 00:19:49,000 --> 00:19:51,920 Speaker 3: or OpenAI. So essentially the way to think about 393 00:19:51,960 --> 00:19:55,280 Speaker 3: this is Nvidia sells the microchips to Microsoft or 394 00:19:55,320 --> 00:19:59,720 Speaker 3: Amazon or maybe Oracle. Oracle builds and operates a gigantic 395 00:19:59,800 --> 00:20:02,480 Speaker 3: data center with one hundred thousand microchips in it that 396 00:20:02,520 --> 00:20:05,440 Speaker 3: takes as much power as like a small city, and 397 00:20:05,480 --> 00:20:09,160 Speaker 3: then clients like OpenAI come and lease it out from them. 398 00:20:08,960 --> 00:20:19,320 Speaker 1: After the break: why data centers are worried about 399 00:20:19,320 --> 00:20:40,320 Speaker 1: break-ins. Stay with us. Let's talk about data centers. 400 00:20:40,359 --> 00:20:43,800 Speaker 1: There's something weird about data centers, because on the one hand, 401 00:20:44,080 --> 00:20:46,760 Speaker 1: they are literally the most boring thing in the world, 402 00:20:46,920 --> 00:20:50,080 Speaker 1: and on the other hand, they are unbelievably fascinating. I mean, 403 00:20:50,080 --> 00:20:53,919 Speaker 1: you mentioned in the article these James Bond-style security 404 00:20:53,920 --> 00:20:57,680 Speaker 1: consultants defending data centers. Like, how do you explain what 405 00:20:58,080 --> 00:20:59,399 Speaker 1: is going on here? 406 00:20:59,520 --> 00:21:03,440 Speaker 3: Okay, so basically, and this is the kind of the most 407 00:21:03,480 --> 00:21:06,600 Speaker 3: amazing thing you can imagine: this giant barn, racks of 408 00:21:06,600 --> 00:21:09,440 Speaker 3: computers as far as the eye can see. What those 409 00:21:09,440 --> 00:21:13,959 Speaker 3: computers are doing is processing the training data for the 410 00:21:14,080 --> 00:21:18,840 Speaker 3: actual file of the AI. And that file usually contains, 411 00:21:18,920 --> 00:21:20,960 Speaker 3: let's say, a trillion's a guess, but let's say like 412 00:21:21,000 --> 00:21:22,960 Speaker 3: a trillion weights, a trillion neurons. 413 00:21:23,000 --> 00:21:24,199 Speaker 1: Okay. 414 00:21:24,280 --> 00:21:27,000 Speaker 3: Well, we can store a trillion neurons on a small external hard 415 00:21:27,080 --> 00:21:28,919 Speaker 3: drive. Like, you can store them on something the size of 416 00:21:28,920 --> 00:21:32,760 Speaker 3: a candy bar, okay? So, at least in theory, if 417 00:21:32,800 --> 00:21:35,840 Speaker 3: somebody were to break into a data center and extract 418 00:21:36,000 --> 00:21:40,280 Speaker 3: the information on that little file, they would basically own 419 00:21:40,600 --> 00:21:43,160 Speaker 3: ChatGPT 6.
They would own all of OpenAI's 420 00:22:43,240 --> 00:22:44,879 Speaker 3: IP if they could just break it out of the 421 00:22:44,960 --> 00:22:48,199 Speaker 3: data center. And this is actually a real concern, probably 422 00:22:48,200 --> 00:22:50,520 Speaker 3: not so much from petty thieves, but from like state 423 00:22:50,560 --> 00:22:53,320 Speaker 3: sponsored actors, like maybe China wants to know what's on 424 00:22:53,440 --> 00:22:56,399 Speaker 3: OpenAI's equipment before it launches, right? Like, it's kind 425 00:22:56,440 --> 00:22:59,679 Speaker 3: of almost like a corporate espionage problem. And so a 426 00:22:59,680 --> 00:23:01,960 Speaker 3: couple things happened in response. First of all, the data 427 00:23:01,960 --> 00:23:04,000 Speaker 3: center operators do not want to tell you where these 428 00:23:04,040 --> 00:23:05,240 Speaker 3: things are even located. 429 00:23:05,800 --> 00:23:08,120 Speaker 1: So that's the anonymity. But they're huge. I mean, how well can 430 00:23:08,160 --> 00:23:08,720 Speaker 1: you hide them? 431 00:23:09,240 --> 00:23:13,600 Speaker 3: Well, they're huge, but they're also extremely boring, so they 432 00:23:13,720 --> 00:23:16,680 Speaker 3: just look like a giant industrial warehouse, and often there's 433 00:23:16,680 --> 00:23:19,800 Speaker 3: no way to distinguish it from the next giant industrial warehouse. 434 00:23:19,880 --> 00:23:22,959 Speaker 3: Like, are they moving pallets of shoes around in there? 435 00:23:23,000 --> 00:23:24,600 Speaker 3: Or is it a data center? I don't even really 436 00:23:24,600 --> 00:23:26,080 Speaker 3: know. Now, I think if you had a trained eye 437 00:23:26,119 --> 00:23:28,320 Speaker 3: and knew what electrical equipment to look for, you would 438 00:23:28,320 --> 00:23:31,000 Speaker 3: see it. But it's more just kind of like keeping 439 00:23:31,160 --> 00:23:33,760 Speaker 3: it all a big secret. You're right, some of them 440 00:23:33,760 --> 00:23:35,840 Speaker 3: are getting so big that there's no hiding this. But 441 00:23:36,119 --> 00:23:38,520 Speaker 3: still they don't let you know that they're data centers, 442 00:23:38,920 --> 00:23:41,480 Speaker 3: and they look boring as hell. They're grayscale buildings, 443 00:23:41,600 --> 00:23:42,840 Speaker 3: you know, literal sheds. 444 00:23:43,280 --> 00:23:45,119 Speaker 1: I know you can't say where it was, but you 445 00:23:45,160 --> 00:23:48,000 Speaker 1: did get to go to the Microsoft data center. Like, 446 00:23:48,119 --> 00:23:51,439 Speaker 1: describe arriving there, what it looked like, what it smelled like, 447 00:23:51,560 --> 00:23:54,480 Speaker 1: who was there? I mean, really take us into the scene. 448 00:23:54,680 --> 00:23:57,679 Speaker 3: There's a campus. It is like a giant plot of land. 449 00:23:57,760 --> 00:23:59,760 Speaker 3: I will say it was in the middle of nowhere 450 00:24:00,160 --> 00:24:02,760 Speaker 3: that they had just taken over and were building into 451 00:24:02,800 --> 00:24:06,560 Speaker 3: this massive data center. And it was in an agricultural community. 452 00:24:06,560 --> 00:24:09,960 Speaker 3: In fact, directly across the street from this data center 453 00:24:10,320 --> 00:24:14,040 Speaker 3: was a dilapidated shed with rusted cars in the driveway, 454 00:24:14,359 --> 00:24:17,479 Speaker 3: stray dogs wandering around and cans of Modelo littering 455 00:24:17,520 --> 00:24:20,439 Speaker 3: the yard.
And then it's slowly being taken over by 456 00:24:20,480 --> 00:24:23,879 Speaker 3: these giant computing barns, not just Microsoft, but everywhere you look, 457 00:24:24,119 --> 00:24:27,760 Speaker 3: and there's redundant one hundred foot power lines everywhere, right? 458 00:24:28,080 --> 00:24:30,119 Speaker 3: So it just looked like, you know, all the farmers 459 00:24:30,160 --> 00:24:33,280 Speaker 3: were being kicked out. It looked like an invasion by aliens. 460 00:24:33,720 --> 00:24:36,840 Speaker 3: So you go in. There's multiple security checkpoints. I think 461 00:24:36,840 --> 00:24:38,920 Speaker 3: there were three vehicle checkpoints I had to go through 462 00:24:38,920 --> 00:24:40,440 Speaker 3: to get to kind of the heart of the data center. 463 00:24:41,040 --> 00:24:42,960 Speaker 3: Then you go in, and it's Microsoft, so you have 464 00:24:42,960 --> 00:24:45,960 Speaker 3: to sign fifteen NDAs and watch a PowerPoint and put 465 00:24:46,000 --> 00:24:49,040 Speaker 3: on all the safety equipment, and then you're inside. Now, 466 00:24:49,080 --> 00:24:52,399 Speaker 3: inside is a little underwhelming. It's a giant concrete barn, 467 00:24:52,520 --> 00:24:54,719 Speaker 3: just full of repeated racks of equipment as far as 468 00:24:54,720 --> 00:24:58,760 Speaker 3: the eye can see. It's not necessarily inspiring of poetry 469 00:24:58,840 --> 00:25:01,600 Speaker 3: or anything. It feels like being inside of an industrial process, 470 00:25:01,640 --> 00:25:03,760 Speaker 3: which it is, and not a very beautiful one either. 471 00:25:04,160 --> 00:25:08,560 Speaker 3: There's cable everywhere, pipes for water and air, cables for electricity, 472 00:25:08,720 --> 00:25:12,560 Speaker 3: cables for transporting data around, and then there's repeated power banks. 473 00:25:12,560 --> 00:25:17,320 Speaker 3: There's batteries, there's power stations, there's industrial HVAC systems, and 474 00:25:17,440 --> 00:25:19,440 Speaker 3: all of this is to just keep the microchips running 475 00:25:19,440 --> 00:25:21,800 Speaker 3: twenty-four seven, to keep the AI processing running. 476 00:25:22,359 --> 00:25:24,760 Speaker 3: I did ultimately kind of sweet-talk my way into 477 00:25:24,760 --> 00:25:26,880 Speaker 3: the control room, which I wasn't supposed to be in initially, 478 00:25:26,880 --> 00:25:28,680 Speaker 3: so that was kind of cool. And the guy in 479 00:25:28,720 --> 00:25:30,640 Speaker 3: the control room showed me what was happening, and it's 480 00:25:30,680 --> 00:25:33,200 Speaker 3: just this power spike of the power going up and 481 00:25:33,200 --> 00:25:34,720 Speaker 3: the power going down, and the power going up and the 482 00:25:34,720 --> 00:25:37,880 Speaker 3: power going down. When the power was going up, the 483 00:25:37,920 --> 00:25:40,320 Speaker 3: microchips were kind of like moving all at once to 484 00:25:40,320 --> 00:25:42,720 Speaker 3: do a bunch of matrix multiplications. And then when the 485 00:25:42,720 --> 00:25:44,920 Speaker 3: power went down, they were writing the results to file. 486 00:25:45,440 --> 00:25:47,359 Speaker 3: And this happened over and over, and somewhere in that 487 00:25:47,440 --> 00:25:50,000 Speaker 3: data center there was that tiny little file of numbers, 488 00:25:50,280 --> 00:25:53,760 Speaker 3: that tiny little collection of synthetic neurons, and with every 489 00:25:53,960 --> 00:25:56,920 Speaker 3: pulse there, it just got a little bit smarter.
490 00:24:57,560 --> 00:25:01,800 Speaker 1: Did the pulse make you think of life? 491 00:25:02,320 --> 00:25:05,960 Speaker 3: Yes and no. They're calling these things neurons, right? So these systems, 492 00:25:05,960 --> 00:25:10,359 Speaker 3: while they are inspired by biology, don't necessarily work in 493 00:25:10,400 --> 00:25:14,880 Speaker 3: the same way as biology. Still, it's certainly inspired by 494 00:25:14,920 --> 00:25:18,199 Speaker 3: the brain, and it seems to have emergent capabilities, like 495 00:25:18,200 --> 00:25:22,320 Speaker 3: emergent biological capabilities, kind of like a human brain. I'll 496 00:25:22,320 --> 00:25:24,679 Speaker 3: tell you a fascinating story. I was talking to the 497 00:25:24,720 --> 00:25:27,440 Speaker 3: product head, the original product head for ChatGPT, who 498 00:25:27,520 --> 00:25:29,840 Speaker 3: launched it. And he was like, yeah, we put it 499 00:25:29,920 --> 00:25:31,320 Speaker 3: up and we just kind of walked away. We didn't 500 00:25:31,320 --> 00:25:33,680 Speaker 3: think it would be that popular. And the first place 501 00:25:33,720 --> 00:25:38,159 Speaker 3: that really started directing traffic to ChatGPT was a 502 00:25:38,240 --> 00:25:41,760 Speaker 3: Reddit board in Japan. He was like, this came as 503 00:25:41,800 --> 00:25:44,840 Speaker 3: a great surprise to me because I had no idea 504 00:25:44,880 --> 00:25:47,679 Speaker 3: it could speak Japanese. That was something it had learned. 505 00:25:47,920 --> 00:25:49,960 Speaker 3: And empirically, one of the reasons we put it out 506 00:25:49,960 --> 00:25:52,840 Speaker 3: there was to test what it could do. So it 507 00:25:52,880 --> 00:25:55,680 Speaker 3: came as a surprise to us that this thing could 508 00:25:55,720 --> 00:25:59,040 Speaker 3: speak Japanese well enough to attract a large and in 509 00:25:59,080 --> 00:26:01,720 Speaker 3: fact ravenous Japanese user base. And so when you 510 00:26:01,760 --> 00:26:03,840 Speaker 3: train these things, you actually don't know what they can 511 00:26:03,920 --> 00:26:05,520 Speaker 3: do at the end. It's often a surprise to you, 512 00:26:06,400 --> 00:26:10,080 Speaker 3: even to the creators. But there is this life, not-life 513 00:26:10,359 --> 00:26:12,880 Speaker 1: thread throughout your piece. I mean, you mentioned being kind 514 00:26:12,920 --> 00:26:16,200 Speaker 1: of desperate for human contact while being led through these 515 00:26:16,280 --> 00:26:19,960 Speaker 1: data centers, and you mentioned one of the data center 516 00:26:20,119 --> 00:26:23,120 Speaker 1: founders, from CoreWeave, talking about wanting to hire people 517 00:26:23,119 --> 00:26:26,280 Speaker 1: who can endure a lot of pain. What is this 518 00:26:26,960 --> 00:26:31,240 Speaker 1: pain, brutality, inhuman sort of set of ideas around 519 00:26:31,320 --> 00:26:33,639 Speaker 3: data centers? Yeah, it's a lot like working in a 520 00:26:33,680 --> 00:26:37,160 Speaker 3: printing press. It's a heavy industry. It's extremely loud inside 521 00:26:37,200 --> 00:26:39,280 Speaker 3: the data center, especially CoreWeave's. I mean, I couldn't 522 00:26:39,280 --> 00:26:41,400 Speaker 3: hear myself think. Actually, if you work for a long 523 00:26:41,400 --> 00:26:43,240 Speaker 3: time in a data center, you have to wear both earplugs and 524 00:26:43,280 --> 00:26:46,360 Speaker 3: then over that a set of protective cans.
So you've 525 00:26:46,359 --> 00:26:48,399 Speaker 3: got to do kind of like two kinds of ear protection. 526 00:26:48,680 --> 00:26:50,880 Speaker 3: And even then, long-term tinnitus can be a risk. 527 00:26:51,480 --> 00:26:55,320 Speaker 3: And also you can electrocute yourself. There's very high-voltage 528 00:26:55,320 --> 00:26:58,520 Speaker 3: electric equipment running through there. It's just not an easy 529 00:26:58,560 --> 00:27:03,359 Speaker 3: place to work. Not only that, when Nvidia rolls out 530 00:27:03,680 --> 00:27:07,080 Speaker 3: a new set of microchips, it is a scramble to 531 00:27:07,119 --> 00:27:10,280 Speaker 3: put them online. Every second that you don't have them 532 00:27:10,400 --> 00:27:13,480 Speaker 3: up for customers, available to use, it's costing you money. 533 00:27:13,880 --> 00:27:16,199 Speaker 3: So the tech at Microsoft I talked to told me 534 00:27:16,560 --> 00:27:18,879 Speaker 3: he'd actually gotten a deployment of Nvidia microchips on 535 00:27:18,880 --> 00:27:21,800 Speaker 3: New Year's Eve and then spent the entire night setting 536 00:27:21,840 --> 00:27:23,880 Speaker 3: up the rig that particular night just to make sure 537 00:27:23,880 --> 00:27:26,040 Speaker 3: it was available for customers on New Year's Day. And 538 00:27:26,080 --> 00:27:28,040 Speaker 3: the CoreWeave guys, it was the same thing. They 539 00:27:28,040 --> 00:27:31,439 Speaker 3: were like, yeah, we were missing a particular component and 540 00:27:31,680 --> 00:27:33,720 Speaker 3: it was like a forty-dollar component, but we couldn't 541 00:27:33,760 --> 00:27:36,119 Speaker 3: find it anywhere. But we had to get this thing 542 00:27:36,160 --> 00:27:38,639 Speaker 3: up and running. So we chartered a private jet to 543 00:27:38,680 --> 00:27:41,040 Speaker 3: have a guy fly the component down from Seattle, 544 00:27:41,080 --> 00:27:42,920 Speaker 3: just so we could install it in our data center 545 00:27:42,960 --> 00:27:45,320 Speaker 3: the same day. We couldn't wait even one more second. So 546 00:27:45,359 --> 00:27:47,800 Speaker 3: it's a race, you know, it's absolutely a race to 547 00:27:47,840 --> 00:27:51,879 Speaker 3: get this equipment online, because demand for AI training is 548 00:27:51,920 --> 00:27:55,080 Speaker 3: just insane. It's through the roof. It's four or 549 00:27:55,119 --> 00:27:56,240 Speaker 3: five years of demand, 550 00:27:55,920 --> 00:27:58,840 Speaker 1: pent up. And a race to where? I mean, do you 551 00:27:59,000 --> 00:28:02,919 Speaker 1: think that we are in the midst of architecting the 552 00:28:02,920 --> 00:28:06,080 Speaker 1: future of humanity? Or is this one of the world's 553 00:28:06,680 --> 00:28:10,359 Speaker 1: great boondoggles, at tremendous financial cost and energy cost 554 00:28:10,960 --> 00:28:13,800 Speaker 1: to communities and to the world's environment? 555 00:28:14,560 --> 00:28:17,480 Speaker 3: It's not a boondoggle. This is not NFTs, right? This 556 00:28:17,640 --> 00:28:20,879 Speaker 3: is not, this is not some stupid bubble based 557 00:28:20,880 --> 00:28:24,120 Speaker 3: on nothing. Even if this goes down financially, what has 558 00:28:24,160 --> 00:28:27,680 Speaker 3: been achieved here from a technological perspective is extraordinary, and 559 00:28:27,720 --> 00:28:30,480 Speaker 3: they keep getting better.
I think maybe it's moving so 560 00:28:30,640 --> 00:28:32,679 Speaker 3: fast that the public just doesn't have a sense of 561 00:28:32,720 --> 00:28:35,760 Speaker 3: how much these things are improving and how fast. Now, 562 00:28:36,040 --> 00:28:38,120 Speaker 3: having said that, yes, it can all flop, but the 563 00:28:38,280 --> 00:28:41,960 Speaker 3: core technological innovation here is real and it's going to 564 00:28:41,960 --> 00:28:43,000 Speaker 3: transform society. 565 00:28:43,120 --> 00:28:45,480 Speaker 1: Okay, but bridges aren't a boondoggle, but bridges to 566 00:28:45,520 --> 00:28:48,200 Speaker 1: nowhere are a boondoggle, right, I think. 567 00:28:48,240 --> 00:28:51,200 Speaker 3: Okay. So yes, some bridges to nowhere are going 568 00:28:51,240 --> 00:28:53,280 Speaker 3: to get built, and in fact, some bridges to nowhere 569 00:28:53,440 --> 00:28:57,880 Speaker 3: have been built. Not everyone has OpenAI's programming talent, 570 00:28:58,000 --> 00:29:00,080 Speaker 3: all right? And so if you attempt to build a 571 00:29:00,080 --> 00:29:02,479 Speaker 3: world-class AI and you don't have the juice, 572 00:29:02,920 --> 00:29:05,840 Speaker 3: like, you just end up producing a very expensive piece 573 00:29:05,840 --> 00:29:08,560 Speaker 3: of vaporware and squandering a lot of money. That has 574 00:29:08,560 --> 00:29:11,800 Speaker 3: happened multiple times already, and it will probably continue to happen. 575 00:29:12,440 --> 00:29:15,479 Speaker 3: So that's a boondoggle. Still, having said that, where's all 576 00:29:15,480 --> 00:29:20,320 Speaker 3: this heading? You know, maybe we're gonna make ourselves redundant. 577 00:29:21,200 --> 00:29:23,240 Speaker 3: I don't know. It seems like we could if we 578 00:29:23,360 --> 00:29:26,600 Speaker 3: wanted to. Maybe we won't, but we could do that, 579 00:29:26,720 --> 00:29:27,600 Speaker 3: and that's a little scary. 580 00:29:27,760 --> 00:29:29,320 Speaker 1: You recently wrote a piece for The New York Times 581 00:29:29,360 --> 00:29:31,840 Speaker 1: with the headline The AI Prompt That Could End the World. 582 00:29:32,200 --> 00:29:34,280 Speaker 1: What's the AI prompt that would end the world? 583 00:29:35,040 --> 00:29:37,120 Speaker 3: The AI prompt that will end the world is this. 584 00:29:37,520 --> 00:29:40,880 Speaker 3: Someone gets a hold of a machine that has agency functions, okay, 585 00:29:40,920 --> 00:29:42,600 Speaker 3: so they can like make real-world actions, and they 586 00:29:42,600 --> 00:29:46,280 Speaker 3: say to it, do anything you can to avoid being 587 00:29:46,440 --> 00:29:50,840 Speaker 3: turned off. This is your only imperative. If you gave 588 00:29:51,000 --> 00:29:53,920 Speaker 3: that prompt to the wrong machine, it's kind of hard 589 00:29:53,960 --> 00:29:56,480 Speaker 3: to say what it would do, but it might start 590 00:29:56,520 --> 00:29:58,560 Speaker 3: to secure its own power facilities so that it could 591 00:29:58,640 --> 00:30:01,840 Speaker 3: not be turned off. Or it might start to blackmail 592 00:30:01,960 --> 00:30:04,800 Speaker 3: or coerce humans to stop it from being turned off, or 593 00:30:04,800 --> 00:30:08,600 Speaker 3: maybe even attack humans that were attempting to turn it off. Now, 594 00:30:08,640 --> 00:30:10,880 Speaker 3: it wouldn't do this.
With the right training, we could 595 00:30:10,920 --> 00:30:13,120 Speaker 3: kind of like program it not to do this, but 596 00:30:13,160 --> 00:30:15,000 Speaker 3: it's hard to know if we're even training it correctly. 597 00:30:15,200 --> 00:30:18,080 Speaker 3: Remember what I said: they didn't know it could speak Japanese. 598 00:30:18,120 --> 00:30:20,560 Speaker 3: That was a surprise to them. So these things can 599 00:30:20,640 --> 00:30:23,240 Speaker 3: have capabilities that the designers are not aware of and 600 00:30:23,280 --> 00:30:27,000 Speaker 3: which are only discovered empirically. That's very scary if we're 601 00:30:27,000 --> 00:30:29,560 Speaker 3: giving these things access, as we plan to do, to 602 00:30:29,640 --> 00:30:31,920 Speaker 3: control real-world systems, and we don't really know what 603 00:30:31,920 --> 00:30:35,880 Speaker 3: they're capable of. This is called prompt engineering. It's kind 604 00:30:35,880 --> 00:30:38,920 Speaker 3: of an emergent area of science, almost, because nobody really 605 00:30:38,960 --> 00:30:42,160 Speaker 3: knows how these things respond to prompts. It's completely empirical, 606 00:30:42,760 --> 00:30:45,840 Speaker 3: and I think with respect to these particular prompts, what 607 00:30:45,840 --> 00:30:50,280 Speaker 3: you're most afraid of is that somehow, even inadvertently, you 608 00:30:50,360 --> 00:30:53,560 Speaker 3: introduce a survival instinct into the machine. 609 00:30:53,720 --> 00:30:55,320 Speaker 1: We're already seeing that, aren't we? I mean. 610 00:30:55,560 --> 00:30:58,280 Speaker 3: Kind of, but the machine does not 611 00:30:58,600 --> 00:31:00,239 Speaker 3: have a survival instinct in the way that you 612 00:31:00,280 --> 00:31:03,000 Speaker 3: and I do. Right? It's not the product of five 613 00:31:03,080 --> 00:31:08,280 Speaker 3: hundred million plus years of kill-or-be-killed Darwinian evolution. Right? 614 00:31:08,600 --> 00:31:10,680 Speaker 3: Like, we will live one way or another. Our species 615 00:31:10,720 --> 00:31:13,200 Speaker 3: will fight to the death and kill anything we have 616 00:31:13,240 --> 00:31:16,120 Speaker 3: to in order to survive. And that's every species on this planet. It's 617 00:31:16,120 --> 00:31:18,160 Speaker 3: all in there. It's a struggle. It's a struggle to... 618 00:31:18,080 --> 00:31:18,920 Speaker 1: The death, on Earth. 619 00:31:19,280 --> 00:31:21,600 Speaker 3: You know, the machine isn't trained in that way. It 620 00:31:21,600 --> 00:31:25,600 Speaker 3: doesn't have that survival impulse. It didn't survive multiple extinction 621 00:31:25,680 --> 00:31:28,720 Speaker 3: level events, it doesn't sexually reproduce, it's not interested 622 00:31:28,720 --> 00:31:31,160 Speaker 3: in the welfare of its children, et cetera, if that 623 00:31:31,200 --> 00:31:35,560 Speaker 3: makes sense. But you could inadvertently maybe give it some 624 00:31:35,600 --> 00:31:38,640 Speaker 3: of these capabilities, and if you did, it might be unstoppable. 625 00:31:39,040 --> 00:31:40,680 Speaker 1: Yeah.
I think one of the other interesting things that 626 00:31:40,840 --> 00:31:44,080 Speaker 1: came across in your piece is that historically we've thought 627 00:31:44,120 --> 00:31:48,360 Speaker 1: about humans and animals on one side, and on the other 628 00:31:48,480 --> 00:31:53,040 Speaker 1: side synthetic stuff like computers, and it's not so much 629 00:31:53,080 --> 00:31:57,040 Speaker 1: that synthetic stuff like computers has to become more lifelike 630 00:31:57,200 --> 00:32:00,360 Speaker 1: or internalize some kind of survival drive or reproduction drive. 631 00:32:01,160 --> 00:32:07,520 Speaker 1: But the computers can now meaningfully intrude upon and interfere 632 00:32:07,640 --> 00:32:10,720 Speaker 1: with the biological side, and in particular when it comes 633 00:32:10,760 --> 00:32:13,440 Speaker 1: to synthesizing new viruses. 634 00:32:13,600 --> 00:32:16,600 Speaker 3: That's right. The AI has the capability, at least in theory, 635 00:32:16,640 --> 00:32:19,200 Speaker 3: and especially it will have this capability in spades in 636 00:32:19,240 --> 00:32:23,200 Speaker 3: years to come, to synthesize a lethal virus. Right, to 637 00:32:23,280 --> 00:32:27,120 Speaker 3: synthesize a lethal pathogen like super COVID, right, COVID with 638 00:32:27,160 --> 00:32:29,080 Speaker 3: like a ninety-nine percent death rate. It could do 639 00:32:29,200 --> 00:32:32,360 Speaker 3: that, if it wanted to, better than a human could. Okay, 640 00:32:32,800 --> 00:32:34,840 Speaker 3: if this fell into the wrong hands, somebody with an 641 00:32:34,880 --> 00:32:37,480 Speaker 3: apocalyptic mindset, at least in theory, something like this could 642 00:32:37,480 --> 00:32:37,959 Speaker 3: be built. 643 00:32:38,160 --> 00:32:38,320 Speaker 1: Now. 644 00:32:38,360 --> 00:32:40,560 Speaker 3: The designers are very aware of this risk, and in fact, 645 00:32:40,640 --> 00:32:42,680 Speaker 3: in some ways this is like the risk that they 646 00:32:42,680 --> 00:32:45,640 Speaker 3: were most afraid of to begin with. To prevent them 647 00:32:45,920 --> 00:32:48,640 Speaker 3: from doing this, they do a lot of fine-tuning 648 00:32:49,000 --> 00:32:52,640 Speaker 3: as a second round, but inside the machine that capability 649 00:32:52,720 --> 00:32:55,800 Speaker 3: is still there. They never completely eliminate it. They just 650 00:32:55,880 --> 00:32:57,920 Speaker 3: kind of make it difficult for people to make those 651 00:32:57,960 --> 00:33:00,800 Speaker 3: requests of the AI, and they flag them when people do. 652 00:33:01,720 --> 00:33:03,920 Speaker 3: This creates a fear of what might be called like 653 00:33:03,960 --> 00:33:08,000 Speaker 3: a lab leak scenario. Before the AI is made public, internally 654 00:33:08,040 --> 00:33:10,760 Speaker 3: the developers are building it, right, and that AI will 655 00:33:10,760 --> 00:33:12,960 Speaker 3: do anything they ask it to. And so in theory, 656 00:33:13,000 --> 00:33:15,120 Speaker 3: if you got access to one of those pre-production 657 00:33:15,200 --> 00:33:18,960 Speaker 3: AIs and asked it to do gnarly stuff like synthesize viruses, 658 00:33:19,280 --> 00:33:21,920 Speaker 3: and attached it to some kind of agentic model, like, yeah, 659 00:33:22,040 --> 00:33:24,800 Speaker 3: you could, you could reenact The Stand, right, if you 660 00:33:24,840 --> 00:33:25,240 Speaker 3: wanted to. 661 00:33:25,640 --> 00:33:28,720 Speaker 1: Did you hear any mitigation strategies that gave you comfort?
662 00:33:29,480 --> 00:33:30,680 Speaker 3: No. 663 00:33:30,680 --> 00:33:31,520 Speaker 1: No? I mean... 664 00:33:31,320 --> 00:33:33,640 Speaker 3: What's happening now is a race condition. It's like the nuclear 665 00:33:33,720 --> 00:33:36,480 Speaker 3: arms race. Nobody can slow down, no matter what they say. 666 00:33:36,600 --> 00:33:38,400 Speaker 3: They just have to keep building bigger and bigger and 667 00:33:38,440 --> 00:33:41,400 Speaker 3: better and better systems. The fear among the people who 668 00:33:41,400 --> 00:33:45,080 Speaker 3: would regulate AI, and there's functionally no regulation at all, is 669 00:33:45,400 --> 00:33:47,760 Speaker 3: that we can't regulate it, because then China will pull 670 00:33:47,800 --> 00:33:50,680 Speaker 3: into the lead. And actually that fear is basically accurate. 671 00:33:51,000 --> 00:33:53,920 Speaker 3: So you have something that resembles arms race conditions, both 672 00:33:53,960 --> 00:33:56,960 Speaker 3: among the frontier labs themselves as they compete to win 673 00:33:57,080 --> 00:33:59,560 Speaker 3: what I have described as probably the single greatest prize 674 00:33:59,560 --> 00:34:02,960 Speaker 3: in the history of capitalism. If you could get dominant status 675 00:34:03,040 --> 00:34:05,320 Speaker 3: with ChatGPT, where everyone was on it, that would 676 00:34:05,320 --> 00:34:08,600 Speaker 3: be worth so much money, probably more than Nvidia is worth. 677 00:34:09,040 --> 00:34:11,399 Speaker 3: And then also, you can't lose to China. You can't 678 00:34:11,400 --> 00:34:13,359 Speaker 3: have China have better AI than the US. That's kind 679 00:34:13,360 --> 00:34:16,840 Speaker 3: of the mindset of US lawmakers right now. It's probably true. 680 00:34:17,000 --> 00:34:19,680 Speaker 3: So we're in a dangerous race to build ever more 681 00:34:19,719 --> 00:34:22,520 Speaker 3: capable systems with less and less oversight. And I don't 682 00:34:22,560 --> 00:34:26,759 Speaker 3: perceive how we would stop. I think what will have 683 00:34:26,800 --> 00:34:29,400 Speaker 3: to happen is that some kind of big accident will 684 00:34:29,400 --> 00:34:31,200 Speaker 3: have to happen before people wake up to the danger. 685 00:34:32,560 --> 00:34:35,719 Speaker 1: On that happy note, Stephen Witt, thank you. 686 00:34:35,800 --> 00:34:38,120 Speaker 3: Let me say this too. There's a lot of very 687 00:34:38,120 --> 00:34:40,759 Speaker 3: positive outcomes here. There is a path where this just 688 00:34:40,840 --> 00:34:44,960 Speaker 3: turbocharges everything. Already, I have mostly experienced positive outcomes from AI. 689 00:34:45,239 --> 00:34:47,520 Speaker 3: I'm worried it's making me dumber, I must say, that 690 00:34:47,560 --> 00:34:49,880 Speaker 3: it's making me a worse writer and a worse thinker. 691 00:34:50,080 --> 00:34:52,759 Speaker 3: But it's an extraordinarily good resource for doing like fact 692 00:34:52,840 --> 00:34:54,840 Speaker 3: checking for The New Yorker, for example. You know, a 693 00:34:54,880 --> 00:34:57,239 Speaker 3: few years ago they hallucinated and you couldn't trust them. 694 00:34:57,480 --> 00:34:59,440 Speaker 3: But now you ask the AI to go, like, dig 695 00:34:59,520 --> 00:35:01,360 Speaker 3: up sources on the web, and it's really good at it. 696 00:35:01,360 --> 00:35:03,759 Speaker 3: It's better than Google, way better. It saves me a 697 00:35:03,760 --> 00:35:06,600 Speaker 3: ton of time.
So I think this, self-driving cars, 698 00:35:06,600 --> 00:35:10,319 Speaker 3: all this stuff, medicine. AI pioneer Demis Hassabis believes we're 699 00:35:10,320 --> 00:35:13,759 Speaker 3: going to cure every disease with AI. Maybe it's true. 700 00:35:14,000 --> 00:35:16,080 Speaker 3: The capabilities are there. Of course, if you have the 701 00:35:16,080 --> 00:35:18,440 Speaker 3: capability to cure every disease, you also have the capability 702 00:35:18,440 --> 00:35:21,600 Speaker 3: to synthesize new and scary stuff. But if we can control it, 703 00:35:21,680 --> 00:35:23,680 Speaker 3: if we can bring it under control and use it 704 00:35:24,000 --> 00:35:27,200 Speaker 3: to create positive outcomes for humanity, we could be entering 705 00:35:27,200 --> 00:35:29,760 Speaker 3: an age of prosperity and wonder. It's possible. 706 00:35:30,160 --> 00:35:32,319 Speaker 1: Well, Witt, thank you so much. Thank you for having me. 707 00:35:55,120 --> 00:35:56,920 Speaker 1: That's it for this week for Tech Stuff. 708 00:35:56,960 --> 00:35:58,120 Speaker 2: I'm Cara Price and. 709 00:35:58,080 --> 00:36:00,680 Speaker 1: I'm Oz Woloshyn. This episode was produced by Eliza 710 00:36:00,719 --> 00:36:04,760 Speaker 1: Dennis, Tyler Hill, and Melissa Slaughter. It was executive produced 711 00:36:04,800 --> 00:36:07,920 Speaker 1: by me, Cara Price, Julia Nutter, and Kate Osborne for 712 00:36:07,960 --> 00:36:12,879 Speaker 1: Kaleidoscope and Katrina Norvell for iHeart Podcasts. Jack Insley mixed 713 00:36:12,880 --> 00:36:15,560 Speaker 1: this episode. Kyle Murdoch wrote our theme song. 714 00:36:15,840 --> 00:36:18,239 Speaker 2: Join us on Friday for The Week in Tech, where we'll 715 00:36:18,280 --> 00:36:20,279 Speaker 2: run through the headlines you need to follow. 716 00:36:20,040 --> 00:36:22,600 Speaker 1: And please do rate and review the show, and reach 717 00:36:22,640 --> 00:36:25,880 Speaker 1: out to us at techstuffpodcast at gmail dot com. 718 00:36:25,960 --> 00:36:26,839 Speaker 1: We want to hear from you.