Speaker 1: Bloomberg Audio Studios: podcasts, radio, news.
Speaker 2: Well, our next guest needs no introduction, but we're gonna give him one anyway. He's a Silicon Valley legend. He helped build PayPal. He co-founded LinkedIn. He was an early investor in Facebook and Airbnb. He was the first investor in OpenAI. He co-founded the AI company Inflection AI. He's on Microsoft's board. He's a partner at the VC firm Greylock Partners. He's a New York Times bestselling author, and he's also a podcaster. We have Reid Hoffman.
Speaker 4: Great to be here. Thank you for having me. As we were just talking, it's the loudest environment where we've ever done a radio show, so hopefully it'll work.
Speaker 5: It's working so far. So far, so good.
Speaker 2: But I have to ask you, are you real, or am I talking to some sort of AI digital twin, Reid?
Speaker 5: Well, explain why I have to ask that question, please.
Speaker 4: So, as you know, I did an interview with myself, where I was talking to my own digital twin, Reid AI. And I did it in order to show that, while there are obviously a whole bunch of concerns, political concerns and misinformation and deepfakes and a bunch of other stuff that are real concerns, even here there's optimism, looking at what we can shape this towards. And I wanted to show that message, not just say it. So I was like, okay, well, let me talk to my own Reid AI and see how it works. And that's the reason why you asked that question, of course. You said, well, wait a minute, that was pretty good. But we're here in person; you can tell I'm the real one.
Speaker 5: It sounded like you. It looked like you.
Speaker 4: The video, the audio, and the words were all AI-generated. There was zero from me, other than mirroring off my earlier videos, mirroring off my earlier podcasts.
Speaker 5: It used your book to train as well?
Speaker 4: Yes, yeah. It used everything I've written to give answers the way Reid would have answered.
Speaker 6: We're all going to have digital twins someday, and we're gonna use them.
Speaker 5: Mine can go to the DMV for me. So I think we must.
Speaker 4: It's certainly possible. Certainly, if you're a media person, you'll have a digital twin. Certainly, if you're a leader, you'll have a digital twin, because it'll be useful. If you're a lawyer, you'll have one, because then people can talk to your digital twin, maybe at a cheaper rate, to do their preparation for a meeting or that kind of thing. So I think there will be a stack of areas where digital twins will be very real. But I'm not sure all eight billion people will have one.
Speaker 2: Well, so why should this excite us and not scare us and make us all want to move to bunkers?
Speaker 4: Naturally, it should both excite and scare us. The scary part, the negative, is political misinformation, you know, things that are going to be happening this year in the US election and in other major democratic elections. It should scare us because there are questions like, wait a minute, what if someone created a deepfake of me doing something or saying something I really abhor? Like, you know how I think Trump is a corruption of democracy, against the rule of law; what happens when someone creates a deepfake of me saying "I love Trump"? That would obviously be a problem. And so there are all kinds of ways that there are real problems. On the other hand, we will, as we always do with technology, figure out how to navigate it, not perfectly but sufficiently, and there will be great things. For example, that's part of the reason I started by showing:
hey, look here, I'm making a deepfake of myself, and it's not that bad.
Speaker 6: You know, Reid, when you sat down, I said the conversations we have a lot at Bloomberg, and we've written about this, is that you have LinkedIn, kind of the social media site that seems to be clean, pure, where you can trust the conversations. So can we get it right? How is that kind of a guide for you in terms of gen AI, and how do you think about this? What are the guardrails? How do we regulate it?
Speaker 4: Yeah. So the good news is most of the frontier model companies, Microsoft, OpenAI, Google, are all putting a lot of energy into making these models healthy participants, not just within the US but within the global media ecosystem: don't generate fake information, be civil, you know, don't do hate speech, don't enable self-harm. They all put a lot of energy and a lot of work into that, and they can invest hundreds of millions of dollars and teams of hundreds of people in order to do that. Now, one of the challenges is that some of these models are open source, and as open source, someone else can take the open source model and do something that's not any of that, right? And that's part of the reason why we still have to navigate this.
Speaker 6: So you trust that we can navigate AI correctly and safely in the future?
Speaker 4: The ultimate answer is absolutely yes. That doesn't mean that there aren't potholes, you know, bumps in the road, undercarriage scrapes, all the rest, and I guarantee you there will be those too.
Speaker 2: The US can't get Vladimir Putin not to invade Ukraine. The world can't get him to stop fighting there. What convinces you that we could get a leader like him not to use AI for bad?
Speaker 4: Oh, zero percent.
Like, if you said that Putin will be using AI to disrupt the US elections, you know, probably to support Trump because that's his exit strategy from Ukraine, he will do that. There's no way of avoiding it. The defense is not getting him not to do it. The defense is integrating AI into our own tech platforms and using our stronger AI in defense.
Speaker 2: So essentially saying, okay, well, maybe you see something on Facebook, and Meta Platforms might have developed something that says, okay, well, this isn't necessarily something that's true.
Speaker 4: Yes.
Speaker 5: But do you trust, do you trust Facebook to do that?
Speaker 4: Actually, yes.
Speaker 5: And you have a deep history with the company.
Speaker 4: Yes, I have a deep history with the company. I actually have talked to Zuckerberg about it, and they're investing in it. I think they're making a real effort. So I think they're really trying. I'm more worried about Twitter than I am about Facebook on this matter.
Speaker 5: Okay, why are you more worried about Twitter?
Speaker 4: Well, because, look, Elon...
Speaker 5: Musk is someone you also go way back with.
Speaker 4: Yeah. And look, Elon is literally one of the heroes of our generation. You know, SpaceX and kind of Starlink and Tesla; he's just amazing, amazing, amazing. You wouldn't have any of this without him. It is so spectacular. On the other hand, before he buys Twitter, he's like, oh, we have this problem with bots. Then while he's trying to get out of the deal, and then after he bought it, it's, oh, what bots? And you're like, oh, the bots are still there. There are still Internet Research Agency-style bot farms doing shit. Are you doing anything about them? I'm not hearing anything.
Speaker 2: To be fair, I did see, I did see an ad on
Twitter recently with a deepfake of Jeff Bezos pushing a cryptocurrency. It's a paid ad, yes, that has shown up repeatedly in my feed, obviously not approved.
Speaker 4: Yes, yes, yeah. And so, therefore, what are you doing about AI-generated stuff? Well, you know, Elon is so complex.
Speaker 6: Back at Milken, he did a conversation with Michael Milken. He said it's very important to have maximally truth-seeking AI and maximally curious AI, and that it's taught not to lie nor do things that are not true.
Speaker 4: Well, walk the walk, don't just talk the talk.
Speaker 6: Can I ask, I'm just curious, as we have all these conversations about gen AI: how is it going to impact my life? How's it really going to impact his life? How's it really going to impact your life?
Speaker 4: So here's a simple way of looking at it: all of us are going to have a personal AI assistant that's going to be helping us, with its focus on us, right? So it'll be like, hey, what do you need help with? Do you need help figuring out, say, something like where to go, what's entertaining to do, you know, in the city? Or, I'm trying to figure out this thing with my kids; oh, we can help you with that. Oh, I'm trying to figure out this medical thing; oh, I can help you with that. Oh, I'm trying to figure out, I had this difficult conversation with a coworker; oh, I'll talk to you about that. It will help us with a whole range of things. And, for example, part of what we found at Inflection that was really surprising to us and very positive, which is Pi, its personal intelligence, is people will say, oh, I've got these eight ingredients and my friends are coming over, what should I make? Or, my toaster broke, how do I fix it? And actually, in fact, it helps with all of that. So that's the human amplification that's helping us navigate our lives, which is the thing I guarantee will be there.
Speaker 5: I love the ingredients-in-the-fridge part of it.
Speaker 2: Hey, if you're just joining us, we're speaking with Reid Hoffman, of PayPal, of LinkedIn, of OpenAI, of Inflection AI, of the podcast Possible, and the list certainly goes on. Reid, I'm wondering about winners and losers when it comes to AI, in terms of companies. You're on the board of Microsoft; I mentioned a couple of the big companies that you've invested in, that you founded. Is there a concern that we're just going to see the big companies, like Microsoft, like Nvidia, like Amazon, like Meta Platforms, be the only winners when it comes to AI?
Speaker 4: What I guarantee you is they're not going to be the only ones. They will be winners, right. Satya has just done a masterful job. Like, remember, it started with a billion-dollar deal with a 501(c)(3); I think it's the first time in history something like that has happened. Genius strategy. And I think that will hold across all of these companies; I think they will have strong winds at their backs going into the future. On the other hand, you know, at Greylock we're investing in a whole bunch of AI companies. We do it reasonably well. We have a whole portfolio that we're excited about. And I think that it's just, you don't play the same game. It's a little bit like if you said, hey, I've got a startup, I'm gonna try to make a new desktop search company. You're like, well, one of the difficulties, right, is there is this company called Google; it's a little challenging. Okay, you know, I'm gonna make a new handset, a mobile phone device. You're like, oh, there's this company called Apple; it's a little challenging. So you know, you don't do it that way. But there's gonna be a whole range of other AI companies
doing everything from pursuing opportunities that the large tech companies just can't focus on, to taking interesting risks, making that risk look good, and then all of a sudden building out something big. So I think there's going to be a range of startups we're excited about. And ideas: we have podcasts. We have a great podcast.
Speaker 5: You have a great new podcast.
Speaker 6: Tell us about Possible. Tell us about it: what are the conversations you want to have?
Speaker 4: So for me, it's Possible because too much of the dialogue is about technology being dangerous to us. How we evolve as humanity, how we become better and more human, it's through technology. It's clothing, it's glasses, it's food, it's cooking, it's fire, it's buildings. All of this is technology that's helped make us the humans we are. AI will be the same. The whole point of the Possible podcast is to say, this is how technology can help us become more human, not to go, oh my god, what's coming in technology. It can be great for us, and it's not that hard to shape. We just need to shape it, whether it's AI, whether it's just the use of phones. And you know, obviously people like to talk about social networks and they go, wow, that's going to be a problem. It's like, well, look at LinkedIn. LinkedIn's worked out pretty well. It's doable, so let's do it.
Speaker 2: We do want to end with something that everybody loves to talk about: politics.
Speaker 6: We all agree.
Speaker 2: Look, you've been open about your feelings about Donald Trump. You helped fund E. Jean Carroll's defamation lawsuits against the former president.
There are a lot of business leaders out there, some of them people you work with very closely, who say that business under a Trump administration was better than business under a Biden administration, and that's why we're in the corner of Donald Trump when it comes to twenty twenty four.
Speaker 5: What do you say to them?
Speaker 4: So I understand the argument that Trump's administration was more regulatorily lightweight than Biden's. And by the way, generally speaking, being more regulatorily lightweight is a good thing; I try to encourage the Biden administration to do that. On the other hand, what's more fundamental is the rule of law. Business works much, much better with a healthy, strong rule of law. You do not want to have leaders who have literally been in court, like, you were found by a jury to have defamed someone over a sexual assault. Right? We should be a rule-of-law country first. And by the way, this is what business leaders should remember: rule of law is what makes great business. That's the reason why, actually, Biden is a better president for business, even though the regulatory side won't be exactly what you want, and we'll have to work on it. But rule of law is fundamental.
Speaker 6: And to your perspective, your point of view?
Speaker 4: Yes, to my mind, yes, for business.
Speaker 6: But I do wonder, as you look at the November election and that outcome, if it is not another Biden White House but former President Trump back in the White House, what does it mean for the tech industry, or in terms of the rule of law?
Speaker 4: Well, look, the tech industry can thrive under either administration. But I do think that
if Trump is elected, history will look back on it as the beginning of the end of the American world order, and that will have a massive effect on global business.
Speaker 5: How far are you willing to go to re-elect Biden?
Speaker 4: Well, I'm willing to invest, I'm willing to speak, I'm willing to campaign, I'm willing to try to... Like, when a prominent business leader speaks up and says "I'm pro-Trump," I call them and I say, let's talk about it.
Speaker 5: Do you ever change their minds?
Speaker 4: Of course. Look, for example, one of the things you have to do is think about blind spots. Like, I just said something that was positive about how Trump's administration was running; like, I'm inclined to say, you want a new regulation, remove an old one. That's exactly the right thing.
Speaker 6: But I think what you said about, globally, the perception from the rest of the world, potentially, what the US means and what it is in terms of business.
Speaker 4: People will not trust us. They will say, we now no longer trust that you're in the rule of law and a legal system and all the rest.
Speaker 6: All right, if we weren't in a world where gen AI was everything, twenty seconds: what's the other technology we should have on our radars? Quickly.
Speaker 4: Well, the other thing that's following behind it is synthetic biology. And when you put them together, by the way, like the invention of new pharmaceuticals, new medicines, everything else, it's like, let's just hold on to get to the future, because the future could be so amazing.
Speaker 6: Do you think about what AI could do and just play around with it?
Speaker 2: Thank you so much. Thank you for enduring it.
Speaker 4: My pleasure.
Speaker 5: Appreciate it, Reid.
Speaker 6: Thank you so much.
Speaker 5: That's Reid Hoffman, of PayPal, of LinkedIn,
Speaker 2: and a New York Times bestselling author. His podcast is Possible, the Possible podcast.
Speaker 5: Season two started. It's out now.