Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn, here with Kara Price. Hey Kara.

Speaker 3: Hi, Oz.

Speaker 1: So years ago, around the time that we were reporting our first podcast together on the forthcoming AI...

Speaker 3: Revolution. Now no longer forthcoming.

Speaker 1: You invested in Nvidia, which is up over one hundred x since then. Congratulations.

Speaker 3: You know, I just felt when we reported on it that it was going to be the future, and so I did invest in it, and, you know, very happily.

Speaker 1: So now, what was it? Was it something that tipped you over the edge back then to do it?

Speaker 3: Well, I was just sort of thinking to myself, this is the thing that's going to power everything. So obviously it's something that people are going to be paying attention to, and investors are going to be paying attention to, and it just made a lot of sense to me at the time.

Speaker 1: We know Warren Buffett is retiring this year, so there's a slot open. Nvidia is, of course, the most valuable company in the world today, recently topping five trillion dollars in value, and of course a lot of people are crying bubble, not just for Nvidia but for the AI industry as a whole. Actually, Kara, I'm curious: did you hold your stock?

Speaker 3: I held. Look, I needed some things, so I sold some, but I still have a lot. I have enough.

Speaker 1: And you're going to hold on? You're not worried about the bubble?

Speaker 3: I think I am going to hold on.

Speaker 1: But the questions people have are: what if AI doesn't get better infinitely as it scales? What if people invent new chips that are far more efficient than the Nvidia chips? And what if the adoption of AI by other companies doesn't give them the results they hope for financially? And so I talked about all of that with somebody who knows Nvidia better than anyone else. In fact, he literally wrote the book on it: The Thinking Machine: Jensen Huang, Nvidia, and the World's Most Coveted Microchip.
Speaker 1: He actually interviewed Huang six times.

Speaker 4: He's moody, and, you know, he has what I would describe as somewhat self-indulgent performances of anger from time to time.

Speaker 1: That's Stephen Witt, and he actually got interested in Nvidia shortly after ChatGPT took the world by storm.

Speaker 4: I had been using ChatGPT and I was like, wow, this thing is amazing. This is like twenty twenty-two. And I am cooked, like, there's not going to be room for me as a writer. This thing can already write almost as well as I can, and actually writes better than I did when I was young.

Speaker 1: As Stephen dug around to understand what was powering this technology, he got more and more interested in the company building its physical infrastructure.

Speaker 4: What brought me to Nvidia was I was trying to write about OpenAI, and it was just a crowd of a million journalists swarming around. I was like, there's got to be some other story here. And what I've done as a journalist is look for big movements of money that aren't being covered. And I looked at Nvidia's stock price, and in my mind they were still the gaming company. I was like, what the hell is going on here? The company's worth a trillion dollars. And then as I started to investigate it, I was like, oh wow, they build all the hardware. They built all the hardware that makes this stuff go. That's fascinating. And then I kind of learned about Jensen, and I was like, wait, this company has had the same CEO through both the gaming and the AI days, and not only that, this is the founder; he's been the CEO for thirty years. It's the same guy all along.

Speaker 1: So Stephen wrote the book on Nvidia, but he also wrote a great New Yorker piece recently about data centers, and an essay for The New York Times about, quote, "the AI prompt that could end the world."
Speaker 1: So he's really a farm-to-table thinker, from chips to data centers to AI to the apocalypse. It was a fun conversation.

Speaker 3: Yeah, as a writer, he seems to sort of be at the center of everything in a way that I find very compelling. And I actually want to know more about the AI prompt that could end the world.

Speaker 1: In that case, you have to listen to the whole interview, because we talk about it right at the end. Okay, here's the conversation with Stephen Witt. For the layman, what is Nvidia, and how did it become the most valuable company in the world?

Speaker 4: Nvidia is basically a hardware designer. They make a special kind of microchip called a graphics processing unit, and the initial purpose of this thing was to just render graphics in video games. So if you were a video gamer, you knew who this company was, because you would actually build your whole PC just around this Nvidia card. This was the engine that rendered the graphics on your screen. Sometime around two thousand and four, two thousand and five, scientists began to notice how powerful these cards were, and they started hacking into the cards, like hacking into the circuitry, to get to those powerful mathematical functions inside the microchip. And Jensen Huang saw this and he said, wait, this is a whole new market that I can pursue. So he built the software platform that turns the graphics card into basically a low-budget supercomputer. Now, you may ask who is this for. Well, it's not really for established research scientists, because they can usually afford time on a conventional supercomputer. It's for scientists who are sort of marginalized, who can't afford time on a supercomputer and whose research is out of favor. So it's for mad scientists. It's for scientists who are pursuing unpopular or weird or kind of offbeat scientific projects.
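[Editor's note: a minimal sketch of the general-purpose GPU computing idea Witt describes here, i.e. using a consumer graphics card as a cheap parallel math engine. It assumes the third-party CuPy library and an Nvidia GPU with CUDA; it illustrates the concept, not Nvidia's actual platform code, and the array sizes are arbitrary.]

```python
# Sketch: treating a consumer GPU as a "low-budget supercomputer".
# Assumes an Nvidia GPU with CUDA plus the CuPy library (a NumPy-like
# array library whose math runs on the GPU); falls back to the CPU otherwise.
import numpy as np

try:
    import cupy as xp  # GPU-backed arrays, if available
    on_gpu = True
except ImportError:
    xp = np            # plain CPU arrays so the sketch still runs
    on_gpu = False

n = 4096  # arbitrary matrix size
a = xp.random.rand(n, n).astype(xp.float32)
b = xp.random.rand(n, n).astype(xp.float32)

# One big matrix multiplication: roughly 2 * n**3 floating-point operations,
# the kind of massively parallel arithmetic these cards happen to be built for.
c = a @ b

print("running on", "GPU" if on_gpu else "CPU", "- checksum:", float(c.sum()))
```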
Speaker 4: But ultimately, the key use case turned out to be AI, and specifically a branch of AI that most AI researchers thought was crazy, called neural network technology. And what you're doing here is you're building software that kind of resembles the connections in the human brain. It's inspired by the biological brain. Actually, you build a bunch of synthetic neurons in a little file, and then you train them by repeatedly exposing them to training data. So what this could mean, for example, is if we're trying to build a neural network to recognize objects, to do computer vision, then we'll show it tens of thousands or hundreds of thousands, or ultimately millions of images, and slowly rewire its neurons until it can start to identify things. Now, this had been proposed going all the way back to the nineteen forties, but nobody had ever been able to get it to work. And the missing piece, it turns out, is just raw computing power. Geoffrey Hinton, who they call the Godfather of AI, he said, you know, the question we never thought to ask was, what if we just made it go a million times faster? And that's what Nvidia's hardware did. It made AI, and these neural networks in particular, train and learn a million times faster.

Speaker 1: We actually had Hinton on the podcast earlier this year. Was he, early on, one of these mad scientists whose research was unpopular and who therefore started buying Nvidia chips?

Speaker 4: Yes, very much so, and there were a few. There was a community of these guys; it wasn't just him. There were a number of other people doing it, most of whose work has now been recognized. But they were very much on the margins of computer science. They couldn't get five thousand dollars in research funding, but they could get enough money to afford two five-hundred-dollar Nvidia retail gaming graphics cards, which they did.
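[Editor's note: a toy sketch of the training loop described above, i.e. repeatedly exposing a small network of synthetic neurons to labeled examples and slowly rewiring its weights. The data here is random stand-in vectors rather than real images, and the network is deliberately tiny; it illustrates the idea, not AlexNet or any Nvidia software.]

```python
# Toy "neural network": synthetic neurons trained by repeated exposure
# to labeled examples (random vectors standing in for images).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))                  # 200 fake "images", 64 pixels each
y = (X[:, :32].sum(axis=1) > 0).astype(float)   # arbitrary rule the net must learn

W1 = rng.normal(scale=0.1, size=(64, 16))       # first layer of synthetic neurons
W2 = rng.normal(scale=0.1, size=(16, 1))        # output neuron

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):                        # "repeated exposure" to the data
    h = sigmoid(X @ W1)                         # hidden activations
    p = sigmoid(h @ W2).ravel()                 # predicted probability per example
    # Gradients of the cross-entropy loss, used to slowly rewire the weights.
    grad_out = (p - y)[:, None] / len(y)
    W2 -= 1.0 * (h.T @ grad_out)
    W1 -= 1.0 * (X.T @ ((grad_out @ W2.T) * h * (1 - h)))

p = sigmoid(sigmoid(X @ W1) @ W2).ravel()       # predictions after training
print("training accuracy after repeated exposure:", ((p > 0.5) == (y > 0.5)).mean())
```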
Speaker 4: And Hinton had a graduate student named Alex Krizhevsky who was just an ace programmer, and he turned the neural net that he ran on these cards into something called AlexNet, which then started to recognize images better than any AI had ever done before. Like, it smashed the paradigm. And so that engendered, around twenty twelve or twenty thirteen, a paradigm shift in AI, and since then everything that has happened has been a repeated application of the thing Alex discovered: that if you took neural nets and ran them on Nvidia technology, you would have a very powerful result.

Speaker 1: So this is all fascinating, but to some it may sound a bit geeky and like inside baseball, which is why I was very attracted to a quote in your recent New Yorker piece which said, if Americans want to retire comfortably, Nvidia has to succeed. And I'm curious what your conversations with the editors were around that.

Speaker 4: Around that? Oh no, that's straightforward. That one actually sailed right through fact-checking. There were no questions. What has happened since Alex invented this thing in his bedroom is that we scaled it up from two graphics cards to two hundred thousand or more, and we have plans to scale them up to two million and then twenty million. Right? So this is the data center boom which we're going through right now. It's a new industrial revolution. We're basically building these giant barns full of Nvidia microchips to run calculations to build better AI twenty-four seven, around the clock, and it's one of the largest deployments of capital in human history. This has made Nvidia the most valuable company in the world, and it has created a situation where Nvidia stock is more concentrated in the S and P five hundred than any stock since they started keeping track.
Speaker 4: And actually Microsoft, which is the second biggest, has that valuation largely because they're building these sheds and renting out Nvidia equipment. So that's linked too. So think about that: fifteen percent of the stock market is these two stocks, right? They have to succeed. Americans in particular are usually invested passively, through index funds, in something that looks exactly like the S and P five hundred. So, you know, if Nvidia crashes, it's going to create a lot of pain throughout the economy.

Speaker 1: I want to talk about the data centers, but before that, I want to talk about the man who founded Nvidia and is its CEO today, Jensen Huang. He seems to pop up everywhere, but he also seems to be more inscrutable. I mean, who is he? And do you see him as different from Zuckerberg and Altman and Bezos in some significant way?

Speaker 4: Of all executives, Jensen most resembles Elon Musk, because he is an engineering wizard. Bezos is smart as hell, and so is Zuckerberg, but ultimately they're kind of software guys. You know, they're coming at the computer from the keyboard and the terminal. Jensen is totally different. He approaches computing from the circuit up. His degree is not in computer science, originally, but actually in electrical engineering. Okay? So for Jensen, the computer is a piece of hardware that runs calculations in a microchip, and he literally designed those microchips on paper at the beginning of his career. And that's all he's ever done. And this is a little bit why, even though he runs the most valuable company in the world, it's a little baffling to people. Nothing Jensen makes is really that accessible. It's all deep inside the computer.

Speaker 1: There's a quote in your piece that I liked, where he said, "I find that I'm best when I'm under adversity. My heart rate actually goes down." Anyone who's dealt with a rush in a restaurant knows what I'm talking about.
Speaker 4: Yeah, yeah. I mean, he started out at Denny's, so his first job was basically, I think he was a busboy at first, and then graduated to dishwasher and ultimately became a server. And I was talking to someone in the company, and she's like, you know what, Jensen is actually a lot calmer and more compassionate with his employees when things are going wrong. It's when the company's stock price is way up and it looks like everything's going great that he really becomes much more cruel, like much, much meaner to everybody. So he is actually in some ways a nicer person when things are going wrong. When he succeeds, it makes him nervous.

Speaker 1: One of his colleagues described working with him as kind of like sticking your finger in the electric socket. That's quite the metaphor.

Speaker 4: It's one hundred percent accurate. I mean, I've interacted with Jensen. It is like sticking your finger in the electric socket. He's so tightly wound. He expects so much to happen in every conversation. Just to even start talking to him, you have to be totally up to speed. He's not going to waste any time. He's not going to suffer fools. And he's also really intense and unpredictable, and you just don't know where he's going to go in any conversation. And, you know, he has what I would describe as somewhat self-indulgent performances of anger from time to time, and that's especially true if you're one of his executives. If you're not delivering, he's going to stand you up in front of an audience of people and just start screaming at you. But really, I mean yelling, and it's not fun, and he will humiliate you in front of an audience. I think people at Nvidia have to develop very thick skins. He actually did this to me at one point, so I kind of know exactly what it's like. Oh yeah, yeah. Well, I kept asking him about the future. Jensen does not like to speculate.
Speaker 4: He doesn't actually have a science-fiction vision of what the future is going to look like. He has a data-driven vision, from engineering principles, of where he thinks technology is going to go. If he can't see beyond that, he won't speculate. But I noticed that other people at this firm would talk about it, and I really wanted to get into his imagination, I guess I would say, of where he thinks all this can go. So I presented him with a clip of Arthur C. Clarke discussing the future of computers. This is back from nineteen sixty-four, but it was kind of anticipating the current reality we're in, where we would start training mechanical brains, and those brains would train faster than biological brains and eventually surpass biological brains. And so I showed this clip to some other people at Nvidia, and they got very... they kind of, like, welled up and started giving these grand soliloquies about the future that were, like, very beautiful and articulate. And I was hoping to get that response from Jensen. Instead, he just starts screaming at me about how stupid the clip was, how he didn't give a shit about Arthur C. Clarke. He never read one of his books, he didn't read science fiction, and he thought the whole line of questioning was pedestrian, and that I was letting him down by asking it, that I was wasting his time. Despite having written his biography, Jensen remains a little bit of a puzzle, in that I just cannot tell you what's going on inside his brain. But I will say this: he's extremely neurotic. And I don't even mean this in a clinical sense. I just mean that, by his own admission, he's totally driven by negative emotions. So even though he's on top of the world, I think his mind is telling him constantly, you're going to fail.
Speaker 4: This is a temporary thing, and Nvidia is going to go back down again. You know, twice in his tenure as CEO, Nvidia's stock price has retreated by almost ninety percent.

Speaker 1: What could make it happen now? What keeps him up at night today?

Speaker 4: What could happen today? Any number of things. This would not be comprehensive, but there are three big risks. The first is just competition. Nvidia is making so much money, and everyone's seeing that, and this attracts competition in the same manner that chum attracts sharks. Right? It's like throwing blood in the water for other microchip designers to earn a seventy or eighty percent gross margin, which is what they do on some of these chips. So Google has built a whole alternative stack for AI computing around their own kind of platform, and they're starting to lease that out to new customers. That's a big risk. There's a big risk that Chinese companies build alternative, cheaper stacks to what Nvidia does. Intel had ninety to ninety-five percent of the CPU market at one point in this country; now they're falling apart. Conquering one cycle in microchips is no guarantee that you will conquer the next one, and history demonstrates that quite clearly. So that could happen. The second risk: basically, what happens in the data center is we're doing a mathematical operation called a matrix multiplication, and it's extremely computationally expensive to do this. So, without getting too technical, basically, to train an AI right now we have to do ten trillion trillion individual computations, which is more than the number of observable stars in the universe. However, maybe it's possible that we find some more efficient way of doing that. Maybe there's a way that requires only ten billion trillion, or even ten hundred trillion. Right? Nvidia's stock price would go down, because we wouldn't have to build so many data centers; we'd have a more efficient training solution.
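[Editor's note: a rough, back-of-the-envelope illustration of the "ten trillion trillion computations" figure. It uses a common rule-of-thumb approximation of about six floating-point operations per model parameter per training token; the parameter and token counts below are hypothetical round numbers, not figures from the interview.]

```python
# Back-of-the-envelope estimate of training compute, almost all of which
# is spent inside matrix multiplications.
# Assumed rule of thumb: total FLOPs ~= 6 * parameters * training tokens.
params = 1e12            # a hypothetical trillion-weight model
tokens = 1.5e12          # a hypothetical number of training tokens

total_flops = 6 * params * tokens
print(f"estimated training compute: {total_flops:.1e} floating-point operations")
# -> about 9e24, i.e. on the order of "ten trillion trillion" (10^25)

# Why it takes a barn full of chips: a single accelerator sustaining an
# assumed ~1e15 FLOP/s would need roughly this long on its own.
gpu_flops_per_s = 1e15
seconds = total_flops / gpu_flops_per_s
print(f"single-chip time: {seconds / 86400 / 365:.0f} years")
```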
Speaker 4: All of this is a more complex way of saying, maybe there's a technological solution where... you know, right now we're brute-forcing our way to AI. It's a heavy industrial problem. We're talking about building nuclear power plants to bring these things online. I think maybe it's possible that there's a technological solution that trains these things faster, and if we discovered it, we wouldn't have to buy so many Nvidia microchips. That would also make their stock price go down. But the third thing is, basically, right now, for the last thirteen or fourteen years, the more microchips we stuff into the barn, okay, the more microchips we throw at this problem, the better AI we get.

Speaker 1: And this is the "scaling law," in quotes.

Speaker 4: Okay, it is not a law of the universe that this has to happen. It's not some immutable, physically proven thing from first principles of physics that the more microchips we have, the better AI we have. In fact, no one is entirely sure why this works. Presumably, like most other forces in the universe, this will hit some kind of S-curve. It'll start to plateau or level off at some point. We're not there yet. But if we did hit a plateau, if stuffing more microchips into the barn only resulted in marginally better AI or didn't improve it at all, I think Nvidia's stock price would go down a lot, and I think it would make this whole era look kind of like a bubble, if that were to happen.

Speaker 1: Now, is this why Nvidia has kind of become the bank of the AI revolution? In a sense, they're wanting to lend money and lock other companies into the current paradigm of AI, maybe even hoping to defensively prevent other, more economical approaches from emerging, and consolidating Nvidia's position. I mean, how much of a chess game is this, in terms of thinking about the future of computing, for Jensen and others?
Speaker 4: Oh yeah, it's chess, and Jensen is, I mean, an expert chess player at this kind of chess. He's really good at thinking about the competitive positioning of where he is and where other people are. You know, in Nvidia's early days, the GPU market, back in the video game days, was really crowded. At one point there were fifty or sixty participants in this market. I talked to David Kirk, who was the chief scientist at Nvidia during this time. Jensen would go into his office, and on a whiteboard he would have a list of all his competitors up there, and not only that, they would have a list of who the best engineers working at those competitors were, and then they would come up with plans to poach those engineers and get them to come work for Nvidia, so that they would drain the brainpower of their competitors and force them to collapse. I've compared the early graphics days to the movie Battle Royale, where all the kids are on the island and they have to kill each other. It was like that. There were like forty competitors, and only one could survive. Jensen won. He won the Battle Royale. He was the last guy standing. I mean, he won the knife fight. So he is unbelievably ruthless, and unbelievably good at identifying where the competition is and what he could do, not just to beat them in the marketplace, but actually to hollow out their engineering talent.

Speaker 1: Who are Nvidia's biggest customers, and what are they buying the chips for?

Speaker 4: Okay, it's a bit complex. The biggest customers, they don't disclose it. Almost certainly it's Microsoft and then probably Amazon. What these companies do is they train some AI on their own, but what they're really doing is... they're the ones building the sheds, they're the ones building the data centers.
Speaker 4: So Nvidia sells them the microchips, and then kind of the ultimate end user is a frontier AI lab, so that could be something like Anthropic or OpenAI. So essentially the way to think about this is: Nvidia sells the microchips to Microsoft or Amazon or maybe Oracle. Oracle builds and operates a gigantic data center with one hundred thousand microchips in it that takes as much power as, like, a small city, and then clients like OpenAI come and lease it out from them.

Speaker 1: After the break: why data centers are worried about break-ins. Stay with us.

Speaker 1: Let's talk about data centers. There's something weird about data centers, because on the one hand, they are literally the most boring thing in the world, and on the other hand, they are unbelievably fascinating. I mean, you mention in the article James Bond-style security consultants defending data centers. How do you explain what is going on here?

Speaker 4: Okay. So basically, and this is kind of the most amazing thing you can imagine: this giant barn, racks of computers as far as the eye can see. What those computers are doing is processing the training data for the actual file of AI. And that file usually contains, let's say, and this is a guess, but let's say like a trillion weights, a trillion neurons. Okay? Well, we can store a trillion neurons on a small external hard drive; you can store them in something about the size of a candy bar.

Speaker 1: Okay.

Speaker 4: So, at least in theory, if somebody were to break into a data center and extract the information in that little file, they would basically own ChatGPT-6. They would own all of OpenAI's IP, if they could just break it out of the data center.
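[Editor's note: the rough arithmetic behind the candy-bar comparison, assuming a hypothetical trillion-parameter model stored at two bytes per weight.]

```python
# Rough size of the finished "file of AI" versus the plant that trains it.
params = 1e12                 # hypothetical trillion weights / synthetic neurons
bytes_per_weight = 2          # assumes 16-bit (half-precision) storage

model_bytes = params * bytes_per_weight
print(f"model file: ~{model_bytes / 1e12:.1f} TB")  # ~2 TB: fits on a small external drive

# The training run, by contrast, is spread across on the order of a hundred
# thousand GPUs for weeks, which is why the data center, not the finished
# file, is the expensive part.
```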
Speaker 4: And this is actually a real concern, probably not so much from petty thieves, but from, like, state-sponsored actors. Like, maybe China wants to know what's on OpenAI's equipment before it launches, right? It's almost like a corporate espionage problem. And so a couple of things happen in response. First of all, the data center operators do not want to tell you where these things are even located.

Speaker 1: So they keep that information secret. But they're huge. I mean, how well can you hide them?

Speaker 4: Well, they're huge, but they're also extremely boring, so they just look like a giant industrial warehouse, and often there's no way to distinguish it from the next giant industrial warehouse. Like, are they moving pallets of shoes around in there, or is it a data center? You don't even really know. Now, I think if you had a trained eye and knew what electrical equipment to look for, you would see it. But it's more just kind of like keeping it all a big secret. You're right, some of them are getting so big that there's no hiding this. But still, they don't let you know that they're data centers. And they look boring as hell. They're grayscale buildings, you know, nondescript sheds.

Speaker 1: I know you can't say where it was, but you did get to go to the Microsoft data center. Describe arriving there, what it looked like, what it smelt like, who was there. I mean, really take us into the scene.

Speaker 4: There's a campus. It is like a giant plot of land, I will say in the middle of nowhere, that they had just taken over and were building into this massive data center, and it was in an agricultural community. In fact, directly across the street from this data center was a dilapidated shed with rusted cars in the driveway, stray dogs wandering around, and cans of Modelo, like, littering the yard.
Speaker 4: And then it's slowly being taken over by these giant computing barns, not just Microsoft, but everywhere you look, and there's redundant hundred-foot power lines everywhere, right? So it just looked like, you know, all the farmers were being kicked out. It looked like an invasion by aliens. So you go in. There's multiple security checkpoints; I think there were three vehicle checkpoints I had to go through to get to kind of the heart of the data center. Then you go in, and it's Microsoft, so you have to sign fifteen NDAs and watch a PowerPoint and put on all the safety equipment, and then you're inside. Now, inside is a little underwhelming. It's a giant concrete barn, just full of repeated racks of equipment as far as the eye can see. It's not necessarily inspiring of poetry or anything. It feels like being inside of an industrial process, which it is, and not a very beautiful one either. There's cable everywhere, pipes for water and air, cables for electricity, cables for transporting data around, and then there's repeated power banks. There's batteries, there's power stations, there's industrial HVAC systems, and all of this is just to keep the microchips running twenty-four seven, to keep the AI processing running. I did ultimately kind of sweet-talk my way into the control room, which I wasn't supposed to be in initially, so that was kind of cool. And the guy in the control room showed me what was happening, and it's just this power spike: the power going up and the power going down, and the power going up and the power going down. When the power was going up, the microchips were kind of all moving at once to do a bunch of matrix multiplications, and then when the power went down, they were writing the results to file.
Speaker 4: And this happened over and over, and somewhere in that data center there was that tiny little file of numbers, that tiny little collection of synthetic neurons, and with every pulse it just got a little bit smarter.

Speaker 1: Did the pulse make you think of life? Biological life?

Speaker 4: Yes and no. They're calling these things neurons, right? So these systems, while they are inspired by biology, don't necessarily work in the same way as biology. Still, it's certainly inspired by the brain, and it seems to have emergent capabilities, like emergent biological capabilities, kind of like a human brain. I'll tell you a fascinating story. I was talking to the product head, the original product head for ChatGPT, who launched it, and he was like, yeah, we put it up and we just kind of walked away. We didn't think it would be that popular. And the first place that really started directing traffic to ChatGPT was a Reddit board in Japan. He was like, this came as a great surprise to me, because I had no idea it could speak Japanese. That was something it had learned, and, empirically, one of the reasons we put it out there was to test what it could do. So it came as a surprise to us that this thing could speak Japanese well enough to attract a large, and in fact ravenous, Japanese user base. And so when you train these things, you actually don't know what they can do at the end. It's often a surprise, even to the creators.

Speaker 1: But there is this life/not-life thread throughout your piece. I mean, you mentioned being kind of desperate for human contact while being led through these data centers. And you mentioned one of the data center founders from CoreWeave talking about wanting to hire people who can endure a lot of pain. What is this pain, brutality, inhuman sort of set of ideas around data centers?

Speaker 4: Yeah, it's a lot like working in a printing press. It's a heavy industry.
Speaker 4: It's extremely loud inside the data center, especially CoreWeave's. I mean, I couldn't hear myself think. Actually, if you work for a long time in the data center, you have to wear earplugs and then, over that, a set of protective cans. So you've got to do two kinds of ear protection, and even then long-term tinnitus can be a risk. And also you can electrocute yourself; there's very high-voltage electrical equipment running through there. It's just not an easy place to work. And not only that, when Nvidia rolls out a new set of microchips, it is a scramble to put them online. Every second that you don't have them up and available for customers to use, it's costing you money. So the tech at Microsoft I talked to told me he'd actually gotten a deployment of Nvidia microchips on New Year's Eve and then spent the entire night setting up the rig, that particular night, just to make sure it was available for customers on New Year's Day. And the CoreWeave guys, it was the same thing. They were like, yeah, we were missing a particular component, and it was like a forty-dollar component, but we couldn't find it anywhere. We had to get this thing up and running, so we chartered a private jet to have a guy fly the component down from Seattle, just so we could install it in our data center the same day. We couldn't wait even one more second. So it's a race, you know. It's absolutely a race to get this equipment online, because demand for AI training is just insane. It's through the roof. It's four or five years of demand, pent up.

Speaker 1: And a race to where? I mean, do you think that we are in the midst of architecting the future of humanity? Or is this one of the world's great boondoggles, at tremendous financial cost, but also energy cost, to communities and to the world's environment?

Speaker 4: It's not a boondoggle.
Speaker 4: This is not NFTs, right? This is not some stupid bubble based on nothing. Even if this goes down financially, what has been achieved here from a technological perspective is extraordinary, and they keep getting better. I think maybe it's moving so fast that the public just doesn't have a sense of how much these things are improving and how fast. Now, having said that, yes, it can all flop. But the core technological innovation here is real, and it's going to transform society.

Speaker 1: Okay, but bridges aren't a boondoggle, and bridges to nowhere are a boondoggle, right?

Speaker 4: Okay, so yes, some bridges to nowhere are going to get built, and in fact, some bridges to nowhere have been built. Not everyone has OpenAI's programming talent, all right? And so if you attempt to build a world-class AI and you don't have the juice, you just end up producing a very expensive piece of vaporware and squandering a lot of money. That has happened multiple times already, and it will probably continue to happen. So that's a boondoggle. Still, having said that, where is all this heading? You know, maybe we're going to make ourselves redundant. I don't know. It seems like we could if we wanted to. Maybe we won't, but we could do that, and that's a little scary.

Speaker 1: You wrote a recent piece for The New York Times with the headline "The AI Prompt That Could End the World." What's the AI prompt that would end the world?

Speaker 4: The AI prompt that will end the world is: someone gets hold of a machine that has agency, okay, so it can take real-world actions, and they say to it, do anything you can to avoid being turned off. This is your only imperative. If you gave that prompt to the wrong machine, it's kind of hard to say what it would do, but it might start to secure its own power facilities so that it could not be turned off.
Or it might start to blackmail or coerce 600 00:29:25,480 --> 00:29:28,200 Speaker 4: humans to stop them from turning it off, or maybe even 601 00:29:28,200 --> 00:29:30,120 Speaker 4: attack humans that were attempting to turn it off. 602 00:29:31,320 --> 00:29:32,680 Speaker 2: Now, it wouldn't do this. 603 00:29:32,800 --> 00:29:35,040 Speaker 4: With the right training, we could kind of like program 604 00:29:35,160 --> 00:29:36,760 Speaker 4: it not to do this, but it's hard to know 605 00:29:36,800 --> 00:29:39,040 Speaker 4: if we're even training it correctly. Remember what I said, 606 00:29:39,360 --> 00:29:41,400 Speaker 4: they didn't know it could speak Japanese. That was a 607 00:29:41,480 --> 00:29:44,480 Speaker 4: surprise to them. So these things can have capabilities that 608 00:29:44,560 --> 00:29:46,800 Speaker 4: the designers are not aware of and which are only 609 00:29:46,880 --> 00:29:50,640 Speaker 4: discovered empirically. That's very scary if we're giving these things 610 00:29:50,720 --> 00:29:54,000 Speaker 4: access, as we plan to do, to control real-world systems, 611 00:29:54,240 --> 00:29:55,640 Speaker 4: and we don't really know what they're capable of. 612 00:29:56,160 --> 00:29:57,520 Speaker 2: This is called prompt engineering. 613 00:29:58,520 --> 00:30:01,240 Speaker 4: It's kind of an emergent area of science, almost, because 614 00:30:01,280 --> 00:30:03,400 Speaker 4: nobody really knows how these things 615 00:30:03,320 --> 00:30:05,160 Speaker 2: respond to prompts. It's completely empirical. 616 00:30:05,800 --> 00:30:08,840 Speaker 4: And I think with respect to these particular prompts, what 617 00:30:08,840 --> 00:30:13,200 Speaker 4: you're most afraid of is that somehow, even inadvertently, you 618 00:30:13,360 --> 00:30:16,400 Speaker 4: introduce a survival instinct into the machine. 619 00:30:16,760 --> 00:30:18,280 Speaker 1: We're already seeing that, aren't we? I mean. 620 00:30:18,560 --> 00:30:21,280 Speaker 4: Kind of, but the machine will... The machine does not 621 00:30:21,640 --> 00:30:23,320 Speaker 4: have a survival instinct in the way that you and 622 00:30:23,440 --> 00:30:26,400 Speaker 4: I do. Right, it's not the product of five hundred 623 00:30:26,560 --> 00:30:31,240 Speaker 4: million-plus years of kill-or-be-killed Darwinian evolution. Right, 624 00:30:31,600 --> 00:30:33,680 Speaker 4: like, we will live one way or another. Our species 625 00:30:33,720 --> 00:30:36,120 Speaker 4: will fight to the death and kill anything we have 626 00:30:36,280 --> 00:30:38,960 Speaker 4: to in order to survive. And that's every species on this planet. 627 00:30:39,000 --> 00:30:41,000 Speaker 2: It's all in there. It's a struggle. It's a struggle 628 00:30:41,040 --> 00:30:41,200 Speaker 2: to the 629 00:30:41,200 --> 00:30:41,880 Speaker 1: death on Earth. 630 00:30:42,320 --> 00:30:44,600 Speaker 4: You know, the machine isn't trained in that way. It 631 00:30:44,640 --> 00:30:48,560 Speaker 4: doesn't have that survival impulse. It didn't survive multiple extinction 632 00:30:48,720 --> 00:30:51,800 Speaker 4: level events. It doesn't sexually reproduce, it's not interested in 633 00:30:51,840 --> 00:30:53,760 Speaker 4: the welfare of its children, et cetera. 634 00:30:53,920 --> 00:30:54,640 Speaker 2: If that makes sense. 635 00:30:55,040 --> 00:30:59,480 Speaker 4: But you could inadvertently maybe give it some of these capabilities, 636 00:30:59,520 --> 00:31:01,640 Speaker 4: and if you did, it might be unstoppable.
637 00:31:02,080 --> 00:31:03,680 Speaker 1: Yeah. I think one of the other interesting things that 638 00:31:03,880 --> 00:31:07,080 Speaker 1: came across in your piece is that historically we've thought 639 00:31:07,120 --> 00:31:11,360 Speaker 1: about humans and animals on one side, and on the other 640 00:31:11,520 --> 00:31:16,040 Speaker 1: side synthetic stuff like computers. And it's not so much 641 00:31:16,120 --> 00:31:20,040 Speaker 1: that synthetic stuff like computers has to become more lifelike 642 00:31:20,200 --> 00:31:23,360 Speaker 1: or internalize some kind of survival drive or reproduction drive, 643 00:31:24,200 --> 00:31:30,400 Speaker 1: but that computers can now meaningfully intrude upon and interfere 644 00:31:30,680 --> 00:31:33,680 Speaker 1: with the biological side, and in particular when it comes 645 00:31:33,760 --> 00:31:36,360 Speaker 1: to synthesizing new viruses. 646 00:31:36,600 --> 00:31:39,600 Speaker 4: That's right, the AI has the capability, at least in theory, 647 00:31:39,640 --> 00:31:42,200 Speaker 4: and especially it will have this capability in spades in 648 00:31:42,280 --> 00:31:46,200 Speaker 4: years to come, to synthesize a lethal virus. Right, to 649 00:31:46,320 --> 00:31:50,120 Speaker 4: synthesize a lethal pathogen like super-COVID, right, COVID with 650 00:31:50,200 --> 00:31:52,040 Speaker 4: like a ninety-nine percent death rate. It could do 651 00:31:52,200 --> 00:31:55,320 Speaker 4: that if it wanted to, better than a human could. Okay, 652 00:31:55,800 --> 00:31:57,720 Speaker 4: if this fell into the wrong hands, somebody with 653 00:31:57,760 --> 00:32:00,120 Speaker 4: an apocalyptic mindset, at least in theory, something like 654 00:32:00,160 --> 00:32:02,560 Speaker 4: this could be built. Now, the designers are very aware 655 00:32:02,560 --> 00:32:04,640 Speaker 4: of this risk, and in fact, in some ways this 656 00:32:04,840 --> 00:32:06,360 Speaker 4: is like the risk that they were most afraid of 657 00:32:06,440 --> 00:32:09,880 Speaker 4: to begin with. To prevent them from doing this, they 658 00:32:09,960 --> 00:32:12,960 Speaker 4: do a lot of fine-tuning as a second round, 659 00:32:13,320 --> 00:32:17,040 Speaker 4: but inside the machine that capability is still there. They 660 00:32:17,120 --> 00:32:19,440 Speaker 4: never completely eliminate it. They just kind of make it 661 00:32:19,560 --> 00:32:22,400 Speaker 4: difficult for people to make those requests of the AI, 662 00:32:22,480 --> 00:32:25,480 Speaker 4: and they flag them when people do. This creates a 663 00:32:25,560 --> 00:32:27,560 Speaker 4: fear of what might be called like a lab-leak 664 00:32:27,640 --> 00:32:31,720 Speaker 4: scenario before the AI is made public. Internally, the developers 665 00:32:31,760 --> 00:32:34,120 Speaker 4: are building it, right, and that AI will do anything 666 00:32:34,120 --> 00:32:36,200 Speaker 4: they ask it to. And so in theory, if you 667 00:32:36,320 --> 00:32:39,160 Speaker 4: got access to one of those pre-production AIs and 668 00:32:39,680 --> 00:32:42,360 Speaker 4: asked it to do gnarly stuff like synthesize viruses and 669 00:32:42,480 --> 00:32:44,880 Speaker 4: attached it to some kind of agentic model, like, yeah, 670 00:32:45,080 --> 00:32:48,200 Speaker 4: you could reenact The Stand, right, if you wanted to. 671 00:32:48,680 --> 00:32:51,719 Speaker 1: Did you hear any mitigation strategies that gave you comfort? 672 00:32:52,520 --> 00:32:55,720 Speaker 4: No.
No, I mean what's happening now is a race condition. 673 00:32:55,920 --> 00:32:58,520 Speaker 4: It's like the nuclear arms race. Nobody can slow down, 674 00:32:58,600 --> 00:33:00,360 Speaker 4: no matter what they say; they just have to keep 675 00:33:00,360 --> 00:33:02,440 Speaker 4: building bigger and bigger and better and better systems. The 676 00:33:02,520 --> 00:33:06,000 Speaker 4: fear among the people who would regulate AI, and there's functionally 677 00:33:06,040 --> 00:33:09,720 Speaker 4: no regulation at all, is that we can't regulate it 678 00:33:09,720 --> 00:33:12,200 Speaker 4: because then China will pull into the lead. And actually 679 00:33:12,480 --> 00:33:15,280 Speaker 4: the fear is basically accurate. So you have something that 680 00:33:15,360 --> 00:33:18,400 Speaker 4: resembles arms-race conditions, both among the frontier labs themselves 681 00:33:18,880 --> 00:33:21,080 Speaker 4: as they compete to win what I have described as 682 00:33:21,120 --> 00:33:23,720 Speaker 4: probably the single greatest prize in the history of capitalism. 683 00:33:24,080 --> 00:33:27,160 Speaker 4: If you could get dominant status with ChatGPT, where 684 00:33:27,240 --> 00:33:29,880 Speaker 4: everyone was on it, that would be worth so much money, 685 00:33:30,280 --> 00:33:32,800 Speaker 4: probably more than Nvidia is worth. And then also 686 00:33:33,120 --> 00:33:34,840 Speaker 4: you can't lose to China. You can't have China have 687 00:33:34,920 --> 00:33:37,000 Speaker 4: better AI than the US. That's kind of the mindset 688 00:33:37,040 --> 00:33:38,560 Speaker 4: of US lawmakers right now. 689 00:33:38,880 --> 00:33:39,640 Speaker 2: It's probably true. 690 00:33:40,040 --> 00:33:42,680 Speaker 4: So we're in a dangerous race to build ever more 691 00:33:42,720 --> 00:33:45,440 Speaker 4: capable systems with less and less oversight, and I don't 692 00:33:45,600 --> 00:33:49,720 Speaker 4: perceive how we would stop. I think what will have 693 00:33:49,840 --> 00:33:52,320 Speaker 4: to happen is that some kind of big accident will 694 00:33:52,440 --> 00:33:54,160 Speaker 4: have to happen before people wake up to the danger. 695 00:33:55,600 --> 00:33:57,120 Speaker 1: On that happy note, Stephen Witt, thank you. 696 00:33:58,680 --> 00:33:59,600 Speaker 2: Let me say this too. 697 00:34:00,400 --> 00:34:02,680 Speaker 4: There are a lot of very positive outcomes here. There is 698 00:34:02,720 --> 00:34:04,640 Speaker 4: a path where this just turbocharges. 699 00:34:04,840 --> 00:34:05,600 Speaker 2: Already, I have 700 00:34:05,760 --> 00:34:09,120 Speaker 4: mostly experienced positive outcomes from AI. I'm worried it's making 701 00:34:09,160 --> 00:34:11,600 Speaker 4: me dumber, I must say. It's making me a 702 00:34:11,680 --> 00:34:14,160 Speaker 4: worse writer and a worse thinker. But it's an extraordinarily 703 00:34:14,200 --> 00:34:16,680 Speaker 4: good resource for doing, like, fact-checking for The New Yorker, 704 00:34:16,719 --> 00:34:19,160 Speaker 4: for example. You know, a few years ago they hallucinated 705 00:34:19,239 --> 00:34:21,520 Speaker 4: and you couldn't trust them. But now you ask the 706 00:34:21,560 --> 00:34:23,399 Speaker 4: AI to go, like, dig up sources on the web, 707 00:34:23,440 --> 00:34:25,160 Speaker 4: and it's really good at it. It's better than Google, 708 00:34:25,680 --> 00:34:27,880 Speaker 4: way better. It saves me a ton of time.
So 709 00:34:28,200 --> 00:34:30,760 Speaker 4: I think this: self-driving cars, all this stuff, medicine. 710 00:34:31,000 --> 00:34:34,120 Speaker 4: AI pioneer Demis Hassabis believes we're going to cure every 711 00:34:34,200 --> 00:34:38,120 Speaker 4: disease with AI. Maybe it's true. The capabilities are there. 712 00:34:38,400 --> 00:34:40,480 Speaker 4: Of course, if you have the capability to cure every disease, 713 00:34:40,520 --> 00:34:43,400 Speaker 4: you also have the capability to synthesize new and scary stuff. But 714 00:34:43,640 --> 00:34:45,279 Speaker 4: if we can control it, if we can bring it 715 00:34:45,360 --> 00:34:49,240 Speaker 4: under control and use it to create positive outcomes for humanity, 716 00:34:49,320 --> 00:34:51,640 Speaker 4: we could be entering an age of prosperity and 717 00:34:51,680 --> 00:34:52,760 Speaker 2: wonder, if possible. 718 00:34:53,160 --> 00:34:55,239 Speaker 1: Well, thank you so much. Thank you for having me. 719 00:35:18,160 --> 00:35:20,359 Speaker 1: That's it for this week for tech Stuff. I'm Cara 720 00:35:20,480 --> 00:35:23,279 Speaker 1: Price and I'm Oz Volosian. This episode was produced by 721 00:35:23,320 --> 00:35:27,400 Speaker 1: Eliza Dennis, Tyler Hill and Melissa Slaughter. It was executive 722 00:35:27,400 --> 00:35:30,880 Speaker 1: produced by me, Cara Price, Julia Nutter, and Kate Osborne 723 00:35:30,880 --> 00:35:35,839 Speaker 1: for Kaleidoscope and Katrina Norvell for iHeart Podcasts. Jack Insley mixed 724 00:35:35,920 --> 00:35:38,560 Speaker 1: this episode. Kyle Murdoch wrote our theme song. 725 00:35:38,880 --> 00:35:41,080 Speaker 3: Join us on Friday for the Week in Tech, where 726 00:35:41,120 --> 00:35:43,279 Speaker 3: we'll run through the headlines you need to follow. 727 00:35:43,080 --> 00:35:45,600 Speaker 1: And please do rate and review the show and reach 728 00:35:45,640 --> 00:35:48,640 Speaker 1: out to us at tech Stuff podcast at gmail dot com. 729 00:35:49,000 --> 00:35:49,759 Speaker 1: We want to hear from you.