Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

Speaker 2: Let's stick with tech. The impact of generative AI is under scrutiny as some investors question massive capex spending in the space. Reid Hoffman, the co-founder of Manas AI and author of "Superagency: What Could Possibly Go Right with Our AI Future?", joins us now. Reid, it's good to see you. I said as you walked into the room, there's a book I need to read, "What Could Possibly Go Right?", because I'm very worried about what things could go wrong. Let's put it that way. In the book you approach something really important: as we've made these technological revolutions, these advancements over human history, it has actually benefited human agency, and I'm worried that maybe with AI it goes in a different direction. Help me out and tell me why it could go right.

Speaker 3: Well, the short answer is it gives you superpowers, right, and it gives you superpowers across a large number of things. Everyone should go try ChatGPT, Inflection's Pi, and Anthropic's Claude. Give it a try. You can find out: what should I make for dinner? How do I get medical advice that's available to me? How am I to have a difficult conversation with a colleague? How might I actually do things in parenting that might be different? You know, all of those things are superpowers that we get.

Speaker 2: And Reid, I'm going to throw Lisa under the bus. Do you want to share with the audience what you do?

Speaker 4: You can manipulate what your kids look at if you send them links to things that then change their algorithms, which then give them more news in their feed. I mean, there are ways that you can kind of play the system a little bit.

Speaker 1: And I think that that's proper.

Speaker 3: Yeah, exactly. And by the way, it helps them learn, it gets them news about the environment around them. I mean, this is one of the things: if you're not actually, in fact, using these AI agents to learn right now, you should.
And, for example, one of the ways I approach a difficult subject, say quantum computing: I stick a technical paper in and I say, explain this to me like I'm twelve. Superpowers, superpowers.

Speaker 2: Yes, okay. So my next question is going to be: how should we participate and how do we shape this intelligence? Lisa has given us one idea; could you give us another?

Speaker 3: Well, I think part of the key thing is, my very first chapter is about ChatGPT, where humanity enters the chat. You've got hundreds of millions of people who are interacting with it, and the company watches what works and what doesn't work and then improves it. So simply going and experimenting, and figuring out which kinds of things help you elevate your agency, actually in fact helps the company know: oh, these things really work; oh, these things need to be improved.

Speaker 1: At the same time, you talk about technical papers, and one thing that AI does really well is work with technical details; the social, not so much. When you have certain biases that are baked into some of these algorithms, what are the limitations to this in sort of broad-based application, at a time when people are saying this could do qualitative jobs and there could potentially be a lot of biases baked in?

Speaker 3: So every major group is working intensely on biases. And actually, one of the things that we did with Inflection and Pi, personal intelligence, pun intended, was to teach it to be kind and empathetic, to have EQ be as important as IQ. And you actually see that now in Anthropic's Claude and other things; it's kind of spreading. And so you can actually have these kinds of interactions be trained into them, and this is kind of reinforcement learning by human feedback, to actually be much better than your average human.
Speaker 1: The reason why I find this so fascinating is because I think about my children and what careers they potentially could have, and I'm like, no, not programmer, because that's going to be dead. You know, all of these other ones, probably, if you can be an empathetic human, those human skills are probably going to be more needed. That's what people say. Are you saying those, too, are going to go out the window? I mean, what does this do to employment?

Speaker 3: So first, I actually think programming-skill jobs aren't going out the door. I think there's human application, but I think it's also human application of, like, EQ as well. So the fact that you have, for example, say, a medical assistant that can help you with something, or a coach that might be able to talk with you about something, doesn't mean you don't want a human coach. The more coaches the better, right, and being able to make it work.

Speaker 5: You're talking about AI for the good, but what about AI in the hands of adversaries?

Speaker 3: Well, so, superpowers: superpowers for terrorists, superpowers for criminals, superpowers for rogue states. That's one of the things that we need to be careful about how we navigate. Once again, all of the major groups actually have security teams that are working on this, and that's, I think, part of having so-called red-teaming plans and kind of trying to make sure that that's minimized as much as possible.

Speaker 5: How do you think the US government should be approaching this? The market's freaked out because of DeepSeek and how much, potentially, the CCP is putting money into their version of AI. But when you're in China and you want to look at things on DeepSeek that are particularly critical of the CCP, it will not show you. So how does the US government go about this?
Speaker 3: Well, so I think it's important that the US government kind of continues some of the threads that the previous executive order did, which is: you have red-teaming things that you actually do, world leadership about what kinds of things are happening. I mean, there are connections between the UK AI Safety Institute and the US one and the French one, and I think these are important to do. But I think this is one of the things where there will be multiple AIs, and part of the reason why I want AI to be not just artificial intelligence or amplification intelligence, but also American intelligence, as part of what we're doing.

Speaker 5: What do you make, then, of the tech industry being very close to this administration? I'm not just talking about your former friend Elon Musk. I was at the White House yesterday and Bill Gates was walking into the West Wing for meetings. Do you think they can help the administration basically get there faster when it comes to safe AI?

Speaker 3: The short answer: absolutely. I think if you asked of any kind of Western democratic government in the world, most especially, of course, the American presidency, the American government, should they be connected with the tech industry, talking about how tech is creating the future, what are the things that we can do to make better American industry, better American society, better lives for American citizens? The short answer is absolutely.

Speaker 2: What do you make of the backlash against Elon Musk more recently, about his involvement in the government?

Speaker 3: Well, in government, the move-fast-and-break-things approach maybe should be, you know, a little bit more compassionate and judicious.

Speaker 2: Is that what you think the approach should be?

Speaker 3: I think that the question should be to say, hey, look, there's a reason government doesn't work like companies, even though I think there's efficiency that's really good, and we need to respect those reasons too.
Speaker 5: Do you feel that you're going to basically get some sort of repudiation because of where your politics were ahead of this election, and given how Trump has basically cozied up to a lot of individuals in the tech world? I have not seen you yet at Mar-a-Lago, at the inauguration, or walking into the West Wing.

Speaker 3: Look, I think the thing that we most want as American citizens is for our government to succeed, to succeed for our industry, to succeed for our citizens. And so that's what I most want, and I think we all do.

Speaker 1: You have been an incredible investor. You were an early investor in OpenAI, you invested in LinkedIn, just so many of the behemoths currently. Do you see a lot of upstarts right now that look like they could become the next big thing in some of this machine learning technology?

Speaker 3: So the short answer is yes, although picking them is one of the difficult things about being a venture capitalist, and there are a lot of different options. Back a few years ago, I was arguing that we were at five large tech companies heading to ten, but I wouldn't necessarily have predicted that Nvidia would have so quickly become one of those, you know, into the seven and then going to ten. And so I think we are in that progression. And I think there are a number of startups that are creating amazing technology, and then the question is product-market fit at scale.

Speaker 1: What do you think the next technological evolution is going to be that could potentially be a sort of ChatGPT moment, or a DeepSeek moment, or one of these sort of breakthrough ahas?

Speaker 3: One of the things I think we're going to see this year: all of the major AI companies are working on amplifying coding.
And one of the things, for example, you know, for your kids and other things, will be, yes, that actually all of us will have a software engineering assistant, and that will make all of our work a lot better. And that's one of the things I think hasn't yet entered the consciousness, so I think that's one of the things that's coming. And then of course, you know, with what I'm doing with Manas AI, I'm hopeful that we will have some really great early results in treating cancer.

Speaker 2: And how we can accelerate the scientific process. I'd say we're all with you on that page. You've been very successful. Thank you, sir. Reid, appreciate your time. Thank you, so good to see you. Reid Hoffman there, of Manas AI and a whole lot more.