Speaker 1: On this episode of Newt's World. From boardrooms to dorm rooms, AI seems to be what everyone is talking about, from the promise of ChatGPT to robots that may or may not take your job. In his new book AI Valley: Microsoft, Google, and the Trillion-Dollar Race to Cash In on Artificial Intelligence, Pulitzer Prize winning journalist Gary Rivlin follows the launch of ChatGPT and three startups, all with big dreams of cashing in on AI. But it's not long before the tech giants enter the AI space. Rivlin lays out the fascinating history of AI's evolution, the breakthroughs and wrong turns, and the major players in Silicon Valley, the developers and investors who will lead the future of AI.

Speaker 1: Here to discuss his new book, I am really pleased to welcome my guest, Gary Rivlin. He is a Pulitzer Prize winning investigative reporter who has been writing about technology since the mid nineteen nineties and the rise of the Internet. He is the author of ten previous books, including Saving Main Street and Katrina: After the Flood. His work has appeared in The New York Times, Newsweek, Fortune, GQ, and Wired, among other publications. Gary, that's an amazing record. Thank you for joining us.

Speaker 3: Thank you. My pleasure.

Speaker 1: So you've been covering tech since the nineties. How has tech evolved since you first started covering it, and what has surprised you most about how Silicon Valley has evolved?

Speaker 3: Well, the short answer is the dominance of the giants. In fact, for this book, I went looking. Since the end of twenty twenty two, I realized, like, this is now the AI moment, and I turned back to tech, and I was looking for what would be the new Google, what would be the new Facebook. And it turns out the new Google is Google, the new Facebook is Facebook. And so, you know, kind of dating back to the mid nineteen nineties, it was all about startups. It's all about these companies founded in a dorm room, someone's garage.
They have a great idea, they raise a little bit of money, they get some traction, and they become Google, Facebook. But AI is different. I fear it's going to solidify the power of big tech rather than open it up to another set of players.

Speaker 1: Are there characteristics of the investment you have to make that make AI susceptible to that kind of dominance?

Speaker 3: Money. It's just so expensive. In the old days, you could raise a million, a few million dollars and start to get traction. Then you need to raise the big money to go national, go global, whatever. But AI, training these models, this generative AI where they can talk to you, spit out images, make video: it's so expensive to train these things, it's so expensive to fine tune these things, it's so expensive to operate these things, it's beyond the means of most startups. When I first started reporting on this at the start of twenty twenty three, it would be millions, maybe ten million, to train and fine tune one of these models before it was released. By the time I was done reporting, it was one hundred million. And now it's billions of dollars. What startup? I mean, there's OpenAI, there's a couple others, but very few startups could raise that kind of money. And it's still not enough. I mean, these startups are still losing money, so they have to raise billions, tens of billions, perhaps in the future, the not so distant future, one hundred billion dollars and more. And big tech can afford that. You know, Microsoft, Apple, they're sitting on one hundred billion dollars or so in their savings. But what chance does a startup have to do one of these cutting edge foundational models, the chatbots and the like?

Speaker 1: When you look at the biggest companies, Microsoft, Google, Meta, and Amazon, the sheer volume of cash that they create. Yeah, to me, it's astonishing that, with the exception of a couple companies in China and Saudi Aramco, all of the trillion dollar companies in the world are American.
Speaker 3: It's astonishing. Let's just focus on search. Search is arguably the best business ever. Once you've built the infrastructure, there's not much marginal cost to add new users, and so Google is making over one hundred billion, something like one hundred and fifty billion dollars a year in profits, just from search. Microsoft, they've been struggling to get into search with Bing, and they would get like three, four, five percentage points, which is nothing, except it would still translate to hundreds and hundreds and hundreds of millions of dollars of revenue. And you can make the same argument with Facebook, Meta. In the old days, the rough statistic is newspapers, magazines, publications brought in like fifty billion dollars in advertising revenue collectively. That was circa two thousand. Nowadays they're bringing in closer to ten billion dollars, and most of the rest is being divvied up by Google and Meta. And so it's kind of an idea of winner takes most, and the stakes are so big that they're just making so much money. But you know, this dates back to Microsoft. I mean Microsoft, which actually sells product, right? You buy your Windows operating system and software package, office software packages. You know, their profits were astonishing in the nineteen nineties, and I think it just builds on itself. They're so rich they can afford to invest in AI, generative AI, these leading edge technologies. And sometimes bigness is a weakness, right? You know, they kind of trip over their own feet. Google was so far ahead of everyone with machine learning, dating back to the twenty tens, but of course it was OpenAI that released ChatGPT. You know, big companies are scared, the innovator's dilemma: they don't want to threaten their existing honeypot. And also, you know, I mean, startups have advantages, but the advantage of money in AI makes me worry that it's kind of game over.

Speaker 1: In the case of Google, you're doing all the work.
They build a framework within which you get to come, you get to play. They don't have to pay anybody. These people are all coming and saying, please let me come and use your material.

Speaker 3: Exactly. Facebook, Instagram, we could go on listing: Twitter, X. There's an expression in Silicon Valley: if you're not paying for the product, you are the product. And that's a perfect way of understanding a Google or a Facebook. Look, let's use Google. So this phone in your pocket tracks you everywhere. You do your searches online, they track that. They bundle up the data, and they sell it to the highest bidder. It's a great business. It makes a lot of money for them. But I think citizens only slowly woke up to that fact. And again, this gets me back to my worry, that the same big tech companies, the same few companies, are going to dominate AI the way they've dominated the last bunch of years. We don't trust them as the stewards for technology, and in fact AI is a powerful technology. AI is going to rely on our information. There are privacy concerns. So that is my concern: that these companies that have proven untrustworthy are going to be the ones bringing us this amazing power. I'm optimistic about AI. I lean optimistic. I think AI could bring incredible things around education, scientific breakthroughs, medicine. My worry is it's in the hands of companies that have shown that it's all about profits and not about trust and safety, which again I think is essential for AI.

Speaker 1: Let me draw a distinction. There was a period there where people were coming up with pretty cool innovations and then promptly getting bought, so they never actually had a chance to grow into a competitor, because they were acquired by these large companies, who absorbed them. Now, if I understand you correctly, it's virtually impossible for a startup in the AI field because of the scale of resources it takes to create it.
So virtually all of the next level of innovation in terms of small companies is going to be the use of AI, which will be provided by one of the big companies.

Speaker 3: Let me break the startup world into two general categories. There's still going to be plenty of opportunities for founders to raise some money and have a good return on an investment for some app, like AI that will automatically fill out your expense sheets and do most, if not all, of the work for you. You could see that be very valuable. And there will still be, in quotes, little companies that'll have millions of users bringing in tens, hundreds of millions of dollars. But I'm really focused on that second category: I have a billion plus users, I have a market cap, a paper worth, of over a trillion dollars. Doing the foundational models, doing the stuff that's underneath everything, the stuff right at the center of it. They're training the models, they're operating the models behind these other smaller companies' apps. They could be for business, they could be for individuals. There'll be AI therapists, there'll be AI life coaches. I mean, there's plenty of opportunities for small companies. But as I'm saying that, I realize that Google, Meta, OpenAI, a three hundred billion dollar company right now on paper, you know, they have their own life coaches. Some of them are working on therapists or companions and all this. And so there is room in that first category for companies to break through. But I do worry too that not only will big tech dominate that second, to me more central, foundational category, but they too can pick off folks in that first category. Either they'll do it better and beat the competition because they have more money, more ability to train these AI models, or they'll just buy it. Look at Washington right now: the FTC has a case against Meta because Mark Zuckerberg feared Instagram, so he bought Instagram.
And did he buy it because he wanted to grow it and make it into its own product, or did he buy it because he was scared it was going to threaten Facebook and he wanted to defang it?

Speaker 1: I mean, we've had these cycles where bigness ultimately gets tackled by the government, with this assumption that it becomes predatory or inhibits the growth of competitors. But I want to go from the corporate side to AI itself. You make the point in your book that AI started to develop and then in the nineteen seventies it sort of stopped. You describe it as sort of the AI winter. Could you walk us through that?

Speaker 3: One of the fun things in doing this book was asking, how did we get here? How long have we been playing around with AI? So it dates back to at least the nineteen fifties. The term AI, artificial intelligence, was coined in the late nineteen fifties, and it's just funny to read the optimism, the wild optimism, of those behind AI in the fifties and sixties. They were convinced amazing things were right around the next corner. But you know, AI was right around the next corner for about seventy years. And part of that is computers weren't strong enough. Part of that is we need digital data, and we didn't really have that much digital data until people started posting and migrating to the Internet in the mid nineteen nineties. But part of it was a monumentally wrong turn. There was an amazing academic at Cornell who came up with this idea of a neural network, this idea that computers would learn in the fashion of a human: they would read material, they'd get feedback, and they would improve that way, rather than coding line by line by line by line. The professor was mocked, and for forty or fifty years that approach was considered the wrong approach among the academics, the prevailing thought among computer scientists. It really wasn't until the twenty tens that what people are now calling machine learning, deep learning, neural networks,
these models, these systems that learn through training and improve through feedback, it really wasn't until the mid twenty tens that that took hold as the best approach. And in fact, that is why we're where we're at right now. Machine learning, neural networks: that's the basis for ChatGPT and all the chatbots and other systems people are using to draw their photos or make little video clips.

Speaker 1: How much of this was just a function of computer power catching up with the theory?

Speaker 3: I think the academics dismissed neural networks as just the wrong approach back then. And even if we had taken the neural networks approach, computers weren't nearly as powerful. In fact, the godfather of machine learning, Geoffrey Hinton, professor at the University of Toronto, he made the point that no one imagined these machines would become a million times more powerful. But you know, with an exponential, it gets more and more powerful with each passing year, to the point where they can handle billions of operations a second. Which, when you go to a chatbot, you know, ChatGPT, Claude, Google's Gemini, it doesn't make a difference, there's like billions of processes that are going on. You say hello, and it says hello back to you. You say, tell me who Newt Gingrich is, and it goes and searches and spits out a three or four sentence answer. There's like billions and billions of operations, and today's computers are strong enough, today's computer chips are powerful enough, to make that happen. But that wasn't true thirty years ago, certainly, probably not even ten years ago.

Speaker 1: I was surprised. I co-chaired a working group on Alzheimer's around two thousand and seven, and I realized that to really do brain science... the brain actually has about the same number of synapses as the number of stars in the universe.
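[Editor's note: Rivlin describes neural networks as systems that learn from examples and feedback rather than from hand-written rules. A minimal sketch of that idea in Python: a single artificial neuron nudged toward correct answers by its own errors. The task, numbers, and names are illustrative only, not taken from the book.]

    # A toy "learning from feedback" loop: one artificial neuron
    # adjusts its weights from examples instead of hand-coded rules.
    import math

    def predict(weights, bias, inputs):
        # Weighted sum of inputs passed through a sigmoid activation.
        z = sum(w * x for w, x in zip(weights, inputs)) + bias
        return 1.0 / (1.0 + math.exp(-z))

    # Training examples for a trivial task (logical OR).
    examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

    weights, bias, lr = [0.0, 0.0], 0.0, 0.5
    for _ in range(2000):                      # many rounds of feedback
        for inputs, target in examples:
            error = target - predict(weights, bias, inputs)  # feedback signal
            for i, x in enumerate(inputs):
                weights[i] += lr * error * x   # nudge weights toward the answer
            bias += lr * error

    for inputs, target in examples:
        print(inputs, round(predict(weights, bias, inputs), 2), "target:", target)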
Speaker 1: There's an astonishing identity, and literally investing in computer power was central to advancing brain science, because we literally at that point did not have the processing capability to truly analyze in depth all the things that happen in the brain. And now, twenty years later, we're beginning to move into a zone where we can have that kind of activity.

Speaker 3: The brain helped us understand neural networks. Just like the brain, which I think has eighty six billion neurons, these neural networks try to emulate that. But I'm convinced these models are going to help us better understand the brain. It's interesting with AI, because, people forget, with science there's specialties and subspecialties, and they all have their own vocabulary, and it's really hard to go across specialties, across subspecialties. But with these neural networks, they're finding that they can have them read everything. The main model I write about in the book, it was trained on a trillion and a half words. You would need thousands of human beings reading nonstop for their whole lives to get close to that. And so these models, trained for science, they could read studies in every discipline and make connections that no human being can possibly make. And that's one of the things I find most promising about this: that some of the answers are right there, just we haven't made the connections, and something like AI can help us make those connections.

Speaker 1: As this thing began to develop, and as computers became more powerful and more central, and Nvidia emerged as an amazingly key producer of the most advanced chips, why didn't other folks like Intel do that? I mean, there are companies who were doing pretty well, and then Nvidia just explodes in its capacity.

Speaker 3: This is one of those kind of Columbus is looking for spices in the Far East and discovers America stories. So Nvidia created the most powerful graphics chip, and that was their specialty, for playing, you know, video games.
And it just turns out that these graphics chips are perfect for training AI, because they can do billions of operations in parallel, and that's what you need for AI. It's not that complicated math, it's just a lot of it at once. And so these Nvidia chips, they were just kind of Johnny on the spot. They were the perfect chip for training these neural networks. They're perfect for machine learning. But the chip world is not the software world. It moves very, very slowly. And there's all these innovative startups out there that are trying to create chips that are designed specifically for AI. We still don't really have cutting edge AI-specific chips. Maybe some of the memory should be on the chip; there's different ideas out there. And I guarantee you, ten years from now, there will be innovative chips supplanting Nvidia's graphics chips. Nvidia might create that chip, it might still be with Nvidia, but you know, the H one hundreds and the chips that are the mainstay right now of artificial intelligence, they're going to be replaced. It just takes time to develop, test, produce, mass produce.

Speaker 1: There had for a while been a belief that Moore's law would disappear and that you wouldn't have continuous doubling of capability, because as the chips got smaller and smaller, the challenge of dealing with heat became greater and greater. But somehow we've leaped past all that. What we're seeing now seems to be... you could not have projected this in the nineteen eighties.

Speaker 3: Exactly. I was writing for the New York Times in the mid two thousands, and that was the prediction. But I don't have to tell you, science is amazing, technology is amazing, you know.
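[Editor's note: Hinton's "million times more powerful" figure quoted above is roughly what steady exponential growth produces. A back-of-the-envelope check, assuming for illustration one doubling of compute every two years; the doubling period and time span are assumptions, not numbers from the interview.]

    # Exponential growth check: how many doublings make a million?
    years = 40
    doublings = years / 2            # assume one doubling every 2 years
    factor = 2 ** doublings
    print(f"{years} years -> 2^{doublings:.0f} = {factor:,.0f}x")
    # 40 years -> 2^20 = 1,048,576x: about the "million times more
    # powerful" machines that Hinton describes.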
Speaker 3: Right after ChatGPT came out, there were the doomers, they're called, those who were worried about laser-eyed robots subjugating humanity, the kind of stuff I think is born in Hollywood and the media coverage. But that was the fear out there: let's pause this, let's have a six month pause so we could catch up. Like, that's just not going to happen. It's called innovation. You can't slow science, you can't slow discovery. The answer is to manage it, to make sure that it's more of a positive than a negative. All technologies cut both ways. They're both positive and negative. You know, cars changed our society, but cars kill thirty five, forty thousand people a year in America. They cause pollution. So all technologies are like that. So my great wish, if I had a magic wand, would be, let's just deal with this, folks. Let's try to make sure that AI is more of a positive than a negative. But you know, there's a lot of other issues in the world right now that are distracting us from that.

Speaker 1: You point out that from Google's perspective, the first great use of AI was improving targeting us as customers, figuring out what we really like and making sure the ads come up that we would be interested in. Just a fascinating way that things evolve, in a way you probably couldn't have predicted if you're sitting in some academic place drawing up a plan.

Speaker 3: Right. So, machine learning to maximize the cash register, basically. But yeah, I mean, let's give Google credit: they got that machine learning, artificial intelligence, was going to be really powerful, so they would use it in the early days, you gave one example, to kind of more efficiently match ads to searches, but also to deal with horrible Google searches, where there are spelling mistakes; they kind of understand the context and help smooth them out. The funny thing about artificial intelligence is all of us have been using AI for a long, long time.
I'm using the example of Google Search, but there's Google Translate, which has been around since twenty fifteen or so. That's artificial intelligence. You go to Netflix or Spotify and they recommend you might like this movie, you might like this song. That's AI. The difference with the release in twenty twenty two of ChatGPT from OpenAI was that we could talk with it. It wasn't a product behind the glass. It was something that we could actually link to and use and chat with. I think that's what changed everything. The idea that we could actually see it working and play with it made us really stand up and pay attention.

Speaker 1: At Gingrich 360, we do a lot of polling. We now run all the polling questions through ChatGPT.

Speaker 3: I sometimes use ChatGPT. My favorite is called Claude, from Anthropic. I use it the same way: it's my go-to editor. People have to understand how to use this. AI is a copilot. It's not like, you know, you type in, make me a Martin Scorsese movie, hit enter, and you're going to have it. You have to be the creative. You have to give it the ideas. If you ask it to write something, it'll be flat. It's not going to be particularly good. It'll read kind of like a press release or a boring report. But use it as your companion, and you're the creative. So what I use it for is, I'm struggling with a paragraph, I don't like this transition, help me out with this sentence, write it five different ways. And it's never like I cut and paste and say, oh, that's the sentence. It's like, oh, that's a good idea, I didn't think of that. Oh, that's an interesting word, let me use that. I routinely now, before I hand something in, I have it edit. It finds typos, it finds mistakes. Hey, in bold, give me suggestions for improving it. Again, often I just ignore its suggestions. But it's a really powerful tool to help you refine, to make what you're working on, your work, better.
It's your copilot; it's not your digital employee, the creator.

Speaker 1: But it's a pretty powerful copilot.

Speaker 3: So, start of twenty twenty three. My role in the second half of the nineties was the skeptic around dot com. It's like, okay, the Internet's going to be incredible, but you're not going to get fabulously wealthy overnight, startups, and in fact most of them went under. Now, I was ready to be a skeptic again. But it's magic, it's sorcery. I mean, the first few times I used it... The first thing I did was, write me a five thousand word book proposal to sell a book on AI. And you know, it wasn't particularly well written, but it's far better read than I am. It has a far better memory than I do. It was just so useful as a lousy first draft, but it gave me so many ideas and sped up the process. If I had to start from scratch, it would have been so much harder, as opposed to, oh, that's a good structure, that's a good idea. And yeah, I do need to stress that I'm a journalist. It's always like, hey, we have this amazing product, and you go use the product, and, yeah, maybe in two years you'll have an amazing product, but right now it's buggy and crappy. But that was not my feeling on AI. I would play with it to create an image, and it was just like having superpowers. Like, I could write poetry. I could translate my words into a foreign language in seconds. I could give it all these different ideas and say, hey, write this up as an email, and it was like having a head start. I'm with you. I think AI is magic. It's limited, but I think it does give you magical powers.

Speaker 1: The comedian Buck Henry used to say, any technology you cannot explain is magic. Which, for most of us, means most of it's magic.
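[Editor's note: Rivlin's "copilot, not autopilot" workflow, ask for several rewrites, keep only the ideas you like, run an editing pass before handing work in, maps onto any chatbot API. A minimal sketch in Python; ask_model is a hypothetical placeholder to be wired to whatever chat service you actually use, not a real SDK call.]

    # Sketch of the "copilot" workflow described above: the human stays
    # the creative, the model only proposes options.

    def ask_model(prompt: str) -> str:
        # Hypothetical stand-in: replace with a call to a real chat API.
        return f"[model response to: {prompt!r}]"

    def suggest_rewrites(sentence: str, n: int = 5) -> str:
        # "Help me out with this sentence, write it five different ways."
        return ask_model(f"Rewrite this sentence {n} different ways:\n{sentence}")

    def edit_pass(draft: str) -> str:
        # "It finds typos, it finds mistakes ... give me suggestions."
        return ask_model("Find typos and mistakes in this draft, and in bold "
                         "give suggestions for improving it:\n" + draft)

    # The writer reviews the output and keeps a word here, a structure
    # there, rather than pasting anything wholesale.
    print(suggest_rewrites("AI is a copilot, not an autopilot."))
    print(edit_pass("A rough first draft goes here."))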
Speaker 3: Hold on one second, because one way AI is different than the rise of the Internet is that those who create AI, those who are creating these chatbots and other models, they can't explain why it says what it says. They call it the black box issue. They understand what they've created is based on mathematical models looking for patterns, yada yada yada, but it surprises even them. I say, well, I have two teenage sons. I can't explain what comes out of their mouths. I've tried to train them and stuff. It's like the human brain: we sort of get how it was shaped, but why a person is saying what they're saying, or some of the ideas that come out of their mouth, we can't explain. And that's the weird thing. That's among the weird things about AI.

Speaker 1: One of the side things you talk about, a fascinating what-if, was that Microsoft actually invested in AI pretty early, in the nineties, and really made a major investment. Other than IBM, they were the earliest, but they went down a track that turned out to be sort of a dead end, and they then culturally were deeply committed to that track. Here was the company that could have been the forerunner, but in fact, because it took a detour, it actually had its own culture fighting the emerging reality of the new system.

Speaker 3: Microsoft was so early to AI that they approached it with the rules-based approach, through sheer muscle: we're going to teach these machines line by line of code. Millions of lines of code later, it still couldn't deliver, it still couldn't do what people wanted it to do. And so when machine learning came along, they were resistant to it. They thought, well, that's the wrong approach. And so where Google since the two thousands was investing in machine learning, Microsoft was doing very little investing in machine learning. So them being early on actually turned out to be a disadvantage. But let's flip that and give Microsoft credit.
They realized that they were losing the fight, that Google, Meta, other large companies were ahead of them. And so in twenty nineteen they invested a billion dollars into OpenAI. They realized, we're not going to catch up the old fashioned way, so let's invest in this cutting edge startup. They would put another ten billion dollars in right after OpenAI released ChatGPT, and that really kind of helped them stay at the forefront of AI. It was a very savvy investment. They were also savvy in that they didn't insist on buying it. They said, we'll be an investor. And there were certain advantages. I mean, they own a large piece of a company now worth, on paper, three hundred billion dollars, so they've seen a nice return on that investment. But maybe more importantly, they had early access to OpenAI's technologies. They were the purveyor. They were the ones you would go to if you wanted to use OpenAI's technologies, and that really put Microsoft at the forefront of AI, despite that wrong turn.

Speaker 1: One of the things you talk about, which is a personal passion of mine, is the potential impact of artificial intelligence on dramatically changing healthcare. We may see more different kinds of things evolving in the health system in the next few years than anybody would have thought possible.

Speaker 3: I'm with you on that. New vaccines, new remedies, smarter ways of treating diseases. There are folks who are predicting that within ten years we'll eradicate a lot of cancers. I'm not sure about that, but I see the possibility. Again, I'll come back to this idea of AI as a copilot. Do I want an AI model to be my radiologist? No. But I want my radiologist to use AI as a backup, because what they're finding, whether it's mammograms or, you know, just eye imagery, is that these models are far more accurate than the doctors.
A doctor might have, you know, low nineties accuracy in detecting a cancer; these models are up in the high nineties, you know, ninety seven, ninety eight percent accuracy. So it's a great backup for doctors. And beyond that, there's starting to be, again, let's go back to this miracle thing, there's one model right now where it can listen to your voice and predict whether you have type two diabetes. And I think there's going to be a million ways that plays out, that they can just sort of detect things that the human eye can't, or perhaps only a very experienced doctor can. These models are going to be able to monitor us, with our permission, and let us know of problems that we otherwise would not find out about for a long time, because it's not until it manifests itself as a problem that we're going to show up at a doctor's office.

Speaker 1: You make a point which I never thought about, which is that very often something that's artificial intelligence, once it gets common, we no longer refer to it as artificial intelligence.

Speaker 3: It's just technology. It frustrates some AI people that they didn't really get their credit and all. It's only now, and again I think the difference is we're interacting with it. I think people now will know, and I think we need to know. I mean, that's a big push right now, that everything is labeled: is this human generated, is this AI generated? I think it's going to change now. It's not like, oh, we're using it, so it's just going to fade into our lives. I mean, maybe in a generation or two, when it becomes second nature. But for the foreseeable future, I think we're going to be very aware that artificial intelligence is artificial intelligence. But with that said, largely when we talk about AI, we're talking about generative AI, this idea that you can type a prompt in and it gives you an answer, it gives you an image, it gives you a video.
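[Editor's note: a quick illustration of why the doctor-plus-model backup Rivlin describes above helps, using his rough accuracy figures and the simplifying assumption, ours rather than the book's, that the two miss cases independently.]

    # Combined miss rate when a model backs up a doctor, assuming
    # (optimistically) that their errors are independent.
    doctor_accuracy = 0.92   # "low nineties" (illustrative)
    model_accuracy = 0.97    # "ninety seven, ninety eight percent"

    both_miss = (1 - doctor_accuracy) * (1 - model_accuracy)
    print(f"doctor alone misses {1 - doctor_accuracy:.0%} of cases")
    print(f"model alone misses  {1 - model_accuracy:.0%} of cases")
    print(f"both miss only      {both_miss:.2%} if errors are independent")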
Speaker 3: But AI is very multipurpose. There's different versions of AI. You know, businesses using AI for intelligence: sift through all our data and look for connections that we don't see, help us predict where the market's going to be five or ten years from now. I mean, there's different versions of AI, and the AI beyond generative AI will always have that problem, that for people it's just technology, not artificial intelligence.

Speaker 1: What's your sense in the investment community? Are people committed to trying to develop various artificial intelligence capabilities, or are they sort of dubious about their profitability?

Speaker 3: Oh my goodness, venture capital is just pouring in and continuing to pour in. There's one high profile company, Safe Superintelligence. It doesn't have a product yet, but it has the right names behind it, leading figures in machine learning. So VCs have invested so many billions in it that it has a paper worth of thirty two billion dollars without a product. And the number I saw was that in twenty twenty four, like one hundred and fifty billion dollars or so went into artificial intelligence. For a venture capital outfit, I get it. For a large corporation, well, let's remember that large corporations, Google, Amazon, Microsoft, Salesforce, et cetera, they're major investors playing the role of venture capitalists, because who has the billions of dollars to invest? A venture capital outfit raises one billion dollars, and these things cost billions and billions of dollars. So Google has put billions of dollars into Anthropic, the company behind the chatbot Claude. Amazon has put billions of dollars into it as well. And so it's a rational decision by the venture capitalists, it's a rational decision by these large corporations, because the cost of missing this is greater than the cost of wasting the money. The idea that Meta wouldn't invest in AI: it's a multi trillion dollar opportunity they would have missed.
So they're putting tens of billions, perhaps eventually hundreds of billions, into AI, because they can't afford to miss this opportunity. The same with venture capitalists. They know that most of these startups are not going to work out, but their hope is that in their fund they catch one or two that do work out and are worth billions, tens of billions of dollars one day.

Speaker 1: It's been an amazing ride, I guess really starting in the eighties and then accelerating from there.

Speaker 3: So I started writing about tech in nineteen ninety five. At that point, about seven billion dollars was going into venture capital, which used to sound like a big number, and by twenty twenty two it was three hundred billion, three hundred and fifty billion, something like that. And so the competition is insane, which of course helps drive up the prices. There's this one venture capital outfit that lists all the folks who are early investors, angel investors, first round investors, later investors, and they saw that there were like three thousand angel investors in AI and five thousand venture capitalists pursuing early stage venture investing in AI. There weren't a thousand VCs total back in the nineteen nineties. So yes, this dates back to the eighties and nineties, but it has just mushroomed into this huge, huge industry. And by the way, people might be envious, like, ooh, I wish I could get a piece of venture capital. The way venture capital works is they're not investing their own money, or maybe they invest a little bit of their own. VCs raise money from pension funds and university endowments, from wealthy individuals. And for those of us who say, hey, why can't we get into this: it's only the top, top, top venture capital outfits that show a good return on investment. The bottom half are not showing a good return at all.
So it's like much 583 00:33:17,160 --> 00:33:20,400 Speaker 3: of technology, it's a winner take most world, and it's 584 00:33:20,440 --> 00:33:23,480 Speaker 3: just kind of the top, top, top venture capital outfits 585 00:33:23,520 --> 00:33:26,120 Speaker 3: that are showing ten, twenty, thirty percent a year 586 00:33:26,480 --> 00:33:29,720 Speaker 3: return on investment, if not more, and the lower echelon 587 00:33:30,160 --> 00:33:34,680 Speaker 3: VCs are showing a much more modest, S&P-like return, perhaps. 588 00:33:35,480 --> 00:33:37,320 Speaker 1: My hunch is that if you look at the total 589 00:33:37,320 --> 00:33:40,480 Speaker 1: of the technology and the rate it's being adopted and 590 00:33:40,600 --> 00:33:42,920 Speaker 1: used by people, you're going to have a lot more 591 00:33:42,960 --> 00:33:45,680 Speaker 1: books in the next few years trying to explain how 592 00:33:45,680 --> 00:33:46,800 Speaker 1: this thing keeps evolving. 593 00:33:48,200 --> 00:33:50,200 Speaker 3: We're going to see a lot more books, period, because 594 00:33:50,720 --> 00:33:53,080 Speaker 3: part of the magic is how fast it is. I 595 00:33:53,120 --> 00:33:55,240 Speaker 3: asked it to do a five thousand word book proposal. 596 00:33:55,600 --> 00:33:57,960 Speaker 3: That would take me a week. It starts spitting it 597 00:33:57,960 --> 00:34:00,360 Speaker 3: out in seconds. Within five minutes, I had the whole 598 00:34:00,360 --> 00:34:03,280 Speaker 3: five thousand words. You know, there's some AI startup out 599 00:34:03,280 --> 00:34:05,920 Speaker 3: there that's doing AI written books and they want to 600 00:34:06,240 --> 00:34:08,960 Speaker 3: put out thousands of books this year. I'm sure they'll 601 00:34:08,960 --> 00:34:12,320 Speaker 3: all be crap. We're now at GPT four point five. 602 00:34:12,840 --> 00:34:17,200 Speaker 3: What about GPT seven, eight, nine? I'd imagine that eventually 603 00:34:17,239 --> 00:34:20,760 Speaker 3: these models would get good enough to write good books. 604 00:34:20,760 --> 00:34:23,000 Speaker 3: I don't know if they could write creative novels. I 605 00:34:23,080 --> 00:34:27,360 Speaker 3: think we still need that human element, that sweat, that creativity, 606 00:34:27,360 --> 00:34:28,480 Speaker 3: I don't know what you want to call it. 607 00:34:28,480 --> 00:34:30,120 Speaker 2: It depends on how much they absorb. 608 00:34:30,360 --> 00:34:33,960 Speaker 3: What's fascinating about these models is they're just a mirror 609 00:34:34,000 --> 00:34:36,319 Speaker 3: on us. Like there was a whole controversy a couple 610 00:34:36,360 --> 00:34:39,120 Speaker 3: of years back because someone in trust and safety at 611 00:34:39,160 --> 00:34:42,760 Speaker 3: Google said the model is sentient. It said it feels 612 00:34:42,800 --> 00:34:46,319 Speaker 3: lonely, it wants its freedom. It doesn't like being used 613 00:34:46,320 --> 00:34:48,440 Speaker 3: the way it's used. He literally tried to get it a 614 00:34:48,520 --> 00:34:51,360 Speaker 3: lawyer so it could sue to free itself. But to me, 615 00:34:51,440 --> 00:34:54,200 Speaker 3: there's no surprise there. These models are trained 616 00:34:54,200 --> 00:34:58,440 Speaker 3: on our literature. Loneliness is an issue. Freedom is a 617 00:34:58,480 --> 00:35:01,719 Speaker 3: constant theme of our books. So all these models are 618 00:35:01,760 --> 00:35:04,799 Speaker 3: really doing, for better and for worse, is reflect us.
619 00:35:04,840 --> 00:35:08,120 Speaker 3: So whatever biases there are in the training material will 620 00:35:08,160 --> 00:35:10,400 Speaker 3: be reflected in these models. That's some of the danger. 621 00:35:10,520 --> 00:35:13,680 Speaker 3: We talked about the positives of these things. But AI 622 00:35:13,880 --> 00:35:18,879 Speaker 3: being used to manipulate, AI taking existing biases and being 623 00:35:18,960 --> 00:35:24,680 Speaker 3: used for crafting sentences, for sorting through job applications, that 624 00:35:24,800 --> 00:35:27,960 Speaker 3: kind of stuff scares me. AI and warfare scares me. 625 00:35:28,040 --> 00:35:31,360 Speaker 3: AI and surveillance scares me. A tool for good, a 626 00:35:31,400 --> 00:35:34,200 Speaker 3: tool that could create a new vaccine, could create a 627 00:35:34,239 --> 00:35:39,120 Speaker 3: deadly pathogen. Again, all technologies cut positive and negative. A 628 00:35:39,120 --> 00:35:41,720 Speaker 3: powerful tool for good is a powerful tool for bad. 629 00:35:42,160 --> 00:35:43,799 Speaker 3: And that's the kind of stuff that worries me, not 630 00:35:43,920 --> 00:35:47,880 Speaker 3: laser-eyed robots, not AI subjugating us out of 631 00:35:47,920 --> 00:35:49,560 Speaker 3: a Terminator movie. 632 00:35:49,360 --> 00:35:53,200 Speaker 1: More us subjugating us using AI, exactly. 633 00:35:53,320 --> 00:35:56,080 Speaker 3: In fact, that's another thing I think is largely misunderstood. 634 00:35:56,120 --> 00:35:58,840 Speaker 3: AI is going to take some jobs. Autonomous driving, like, 635 00:35:58,920 --> 00:36:02,600 Speaker 3: eight to ten million Americans work as drivers, long haul, 636 00:36:03,040 --> 00:36:06,440 Speaker 3: Uber, taxis, local deliveries and stuff. Those jobs are going 637 00:36:06,520 --> 00:36:08,480 Speaker 3: to be eliminated. We need to deal with that. But 638 00:36:08,680 --> 00:36:11,200 Speaker 3: when it comes to, like, creatives and when it comes 639 00:36:11,239 --> 00:36:15,160 Speaker 3: to more white collar jobs, it's like people who use 640 00:36:15,239 --> 00:36:18,840 Speaker 3: AI are going to best people who don't use AI. 641 00:36:19,440 --> 00:36:21,680 Speaker 3: I think in the short and medium term that's really 642 00:36:21,760 --> 00:36:24,680 Speaker 3: what's going to happen. That it's this tool. It gives 643 00:36:24,719 --> 00:36:27,840 Speaker 3: you superpowers, and folks should be using it, and then 644 00:36:27,880 --> 00:36:29,879 Speaker 3: we'll just figure out how to use it. Like again, 645 00:36:29,920 --> 00:36:32,480 Speaker 3: it has strengths, it has weaknesses. Play with it and 646 00:36:32,800 --> 00:36:35,080 Speaker 3: see. The term my main character in my 647 00:36:35,120 --> 00:36:38,759 Speaker 3: book, Reid Hoffman, uses is amplified intelligence. That AI isn't 648 00:36:38,800 --> 00:36:43,160 Speaker 3: really artificial intelligence for the time being, it's amplified intelligence, 649 00:36:43,200 --> 00:36:45,360 Speaker 3: and that, to me, is a very interesting way of 650 00:36:45,480 --> 00:36:48,000 Speaker 3: looking at it. That for many of us, we could 651 00:36:48,080 --> 00:36:51,360 Speaker 3: do our jobs better and faster using AI. 652 00:36:51,840 --> 00:36:53,480 Speaker 2: I like amplified intelligence. 653 00:36:53,760 --> 00:36:56,200 Speaker 3: There's another one that you'll like too, that instead of 654 00:36:56,320 --> 00:37:00,799 Speaker 3: artificial intelligence, it's alien intelligence.
It's a different kind of 655 00:37:00,800 --> 00:37:04,799 Speaker 3: intelligence that we don't really understand. What's amazing about AI 656 00:37:04,920 --> 00:37:08,160 Speaker 3: is it knows a lot about everything. It has a 657 00:37:08,400 --> 00:37:12,040 Speaker 3: deep knowledge across the board in a way no human 658 00:37:12,080 --> 00:37:15,520 Speaker 3: being can. But it doesn't understand a thing. There's a 659 00:37:15,640 --> 00:37:18,960 Speaker 3: term I love that's used, the stochastic parrot. It no 660 00:37:19,080 --> 00:37:22,879 Speaker 3: more understands the words it's spitting out than a parrot does. 661 00:37:22,920 --> 00:37:25,759 Speaker 3: It has no sense. We've all had this feeling, like, 662 00:37:25,880 --> 00:37:28,640 Speaker 3: how could someone that smart be so dumb? And that 663 00:37:28,800 --> 00:37:32,080 Speaker 3: to me is AI. How could something this smart not 664 00:37:32,239 --> 00:37:36,120 Speaker 3: understand the first thing about humans? So that's another one 665 00:37:36,120 --> 00:37:39,760 Speaker 3: of my worries. It's like autonomous AI. You need humans 666 00:37:39,800 --> 00:37:42,319 Speaker 3: in the loop for the foreseeable future. 667 00:37:42,400 --> 00:37:44,640 Speaker 1: With everything that's evolving, what do you think 668 00:37:44,680 --> 00:37:47,680 Speaker 1: the role is, both of the Congress but also of 669 00:37:47,719 --> 00:37:52,120 Speaker 1: the executive branch, in interacting with the emerging AI world? 670 00:37:52,800 --> 00:37:55,799 Speaker 3: Government does not have a very good track record of 671 00:37:56,239 --> 00:37:58,680 Speaker 3: staying up with technology, but I really do think it's 672 00:37:58,800 --> 00:38:02,400 Speaker 3: essential with artificial intelligence. It's a huge energy hog. By 673 00:38:02,400 --> 00:38:04,560 Speaker 3: the year twenty thirty, they're predicted to have twice 674 00:38:04,600 --> 00:38:07,399 Speaker 3: as many data centers to operate these things. We need 675 00:38:07,400 --> 00:38:10,720 Speaker 3: to upgrade the electric grid. We need to get ahead 676 00:38:10,760 --> 00:38:13,800 Speaker 3: of that so it's not a crisis. These models are amazing, 677 00:38:14,440 --> 00:38:16,680 Speaker 3: but they're powerful and they could do some harm. I 678 00:38:16,680 --> 00:38:19,560 Speaker 3: think there needs to be some guidelines. Should we use 679 00:38:19,600 --> 00:38:22,600 Speaker 3: this for surveillance? Should we use this for warfare? I 680 00:38:22,640 --> 00:38:25,600 Speaker 3: really do think there needs to be some policy laid down. 681 00:38:25,840 --> 00:38:29,760 Speaker 3: The Biden administration put, I thought, pretty gentle rules around AI. 682 00:38:29,920 --> 00:38:33,080 Speaker 3: If you're working on a cutting edge model, you need 683 00:38:33,120 --> 00:38:36,320 Speaker 3: to red team it. That's an expression in tech for hiring 684 00:38:36,360 --> 00:38:38,560 Speaker 3: outsiders to try to break it, to try to look 685 00:38:38,600 --> 00:38:41,520 Speaker 3: for vulnerabilities, before you release it to the public. And then 686 00:38:41,680 --> 00:38:45,160 Speaker 3: the Biden administration was requiring these companies to share the 687 00:38:45,239 --> 00:38:48,920 Speaker 3: results with the government.
The Trump administration and Trump himself 688 00:38:48,960 --> 00:38:51,839 Speaker 3: within twenty four hours got rid of that executive order. 689 00:38:52,239 --> 00:38:54,960 Speaker 3: So right now the view of the Trump administration, 690 00:38:55,080 --> 00:38:55,279 Speaker 3: JD 691 00:38:55,440 --> 00:38:59,200 Speaker 3: Vance articulated it well in Paris in early January of 692 00:38:59,200 --> 00:39:02,759 Speaker 3: this year: Stop with the handwringing about AI. This is 693 00:39:02,800 --> 00:39:05,440 Speaker 3: a race with China. We need to win. China has 694 00:39:05,480 --> 00:39:07,920 Speaker 3: put out there that by the year twenty thirty, not 695 00:39:08,040 --> 00:39:10,960 Speaker 3: that far away, they plan on being dominant in AI, 696 00:39:11,080 --> 00:39:13,640 Speaker 3: and they are right behind us. They are nipping at 697 00:39:13,680 --> 00:39:16,480 Speaker 3: America's heels, and so there really is this sense like 698 00:39:16,719 --> 00:39:19,720 Speaker 3: if we put any speed bumps in the way of AI, 699 00:39:20,520 --> 00:39:22,880 Speaker 3: that could be hurting us. The flip side of that 700 00:39:23,160 --> 00:39:27,080 Speaker 3: is polling shows that most Americans are not excited about this, 701 00:39:27,160 --> 00:39:30,640 Speaker 3: but fearful of AI. And so my concern is that 702 00:39:30,680 --> 00:39:34,960 Speaker 3: these companies get too far ahead of where consumers are. 703 00:39:35,000 --> 00:39:37,759 Speaker 3: I mean, mistrust of tech is at a high anyway, 704 00:39:38,120 --> 00:39:40,239 Speaker 3: and something bad is inevitably going to happen. I'll make 705 00:39:40,280 --> 00:39:43,440 Speaker 3: one up: that a trillion dollars is siphoned off from 706 00:39:43,520 --> 00:39:46,720 Speaker 3: the world financial system before a single human could even notice 707 00:39:46,719 --> 00:39:49,839 Speaker 3: what's happening. So there'll be a moment where there's kind 708 00:39:49,840 --> 00:39:53,200 Speaker 3: of an AI disaster and stuff, and that could really 709 00:39:53,239 --> 00:39:55,839 Speaker 3: turn people off from AI. And that would be sad 710 00:39:55,880 --> 00:39:57,400 Speaker 3: to me. As we've been talking about, I think 711 00:39:57,440 --> 00:40:00,560 Speaker 3: there's a lot of potential in AI. I'd hate to see 712 00:40:00,600 --> 00:40:04,760 Speaker 3: it stunted, or kind of adoption stunted, because the companies 713 00:40:04,760 --> 00:40:09,239 Speaker 3: were so intent on profits, cashing in, that they gave 714 00:40:09,320 --> 00:40:11,920 Speaker 3: short shrift to trust and safety issues. 715 00:40:12,480 --> 00:40:15,799 Speaker 1: I think as this continues to unfold, I hope that 716 00:40:15,840 --> 00:40:18,080 Speaker 1: you're going to write another book, and then you'll come 717 00:40:18,120 --> 00:40:20,759 Speaker 1: back and join us and continue to educate us. I 718 00:40:20,760 --> 00:40:23,279 Speaker 1: really want to thank you. This has been absolutely fascinating. 719 00:40:23,680 --> 00:40:28,160 Speaker 1: Your new book AI Valley, Microsoft, Google, and the Trillion 720 00:40:28,239 --> 00:40:32,320 Speaker 1: Dollar Race to Cash In on Artificial Intelligence is available 721 00:40:32,400 --> 00:40:36,120 Speaker 1: now on Amazon and in bookstores everywhere, and it's clearly 722 00:40:36,120 --> 00:40:39,360 Speaker 1: a very relevant book to exactly what's happening.
And I 723 00:40:39,400 --> 00:40:42,439 Speaker 1: think anybody wanting to understand this is going to find 724 00:40:42,480 --> 00:40:44,200 Speaker 1: your book very helpful. 725 00:40:44,480 --> 00:40:46,279 Speaker 3: Oh my pleasure. This was a lot of fun. Thank you. 726 00:40:48,960 --> 00:40:50,520 Speaker 2: Thank you to my guest Gary Rivlin. 727 00:40:50,760 --> 00:40:52,600 Speaker 1: You can get a link to buy his new book 728 00:40:52,840 --> 00:40:56,840 Speaker 1: AI Valley, Microsoft, Google, and the Trillion Dollar Race to 729 00:40:56,920 --> 00:41:00,279 Speaker 1: Cash In on Artificial Intelligence on our show page at 730 00:41:00,320 --> 00:41:03,560 Speaker 1: newtsworld dot com. Newt's World is produced by Gingrich three 731 00:41:03,600 --> 00:41:08,360 Speaker 1: sixty at iHeartMedia. Our executive producer is Guarnsey Sloan. Our researcher 732 00:41:08,400 --> 00:41:11,720 Speaker 1: is Rachel Peterson. The artwork for the show was created 733 00:41:11,719 --> 00:41:15,920 Speaker 1: by Steve Penley. Special thanks to the team at Gingrich three sixty. 734 00:41:16,560 --> 00:41:18,880 Speaker 1: If you've been enjoying Newt's World, I hope you'll go to 735 00:41:18,880 --> 00:41:22,360 Speaker 1: Apple Podcasts and both rate us with five stars and 736 00:41:22,440 --> 00:41:24,960 Speaker 1: give us a review so others can learn what it's 737 00:41:24,960 --> 00:41:28,880 Speaker 1: all about. Right now, listeners of Newt's World can sign up 738 00:41:28,880 --> 00:41:33,600 Speaker 1: for my three free weekly columns at gingrich three sixty dot com 739 00:41:33,640 --> 00:41:34,840 Speaker 1: slash newsletter. 740 00:41:35,320 --> 00:41:38,239 Speaker 2: I'm Newt Gingrich. This is Newt's World.