Speaker 1: Bloomberg Audio Studios. Podcasts. Radio. News.

Speaker 2: Today on Bloomberg Technology, we're going to speak with Reid Hoffman. You perhaps know him best as the co-founder of LinkedIn, but the billionaire investor is also part of a group of Silicon Valley investors immersed in this presidential election, rallying behind their favorite candidate in the race, Kamala Harris. Reid Hoffman joins us now. Reid, thank you for your time in coming on Bloomberg Technology. You know, the pitch and the pledge was to keep this focused on tech, so I would like to ask you: why is Kamala Harris the best candidate for the technology industry and venture capitalists around the US?

Speaker 3: Well, what the tech industry needs is a stable environment in order to follow the rules and prosper, because one of the things the tech industry brings to the US is that it's one of our most successful global industries. The large tech companies, for example, get the majority of their revenues overseas, depending on the business. So good global relations, in addition to stability and the ability to know that you're operating under a rule of law versus what I think of as grifter capitalism, all of that leads to the reason why I and others like me support Vice President Harris for president. I think those are more fundamental things than whether a corporate tax rate goes up or down a little. Obviously those can be important to business at other times. But that's the reason why, within the technology industry, I think it's a fairly clear choice.
Speaker 2: Reid, we recently had SEC Chair Gensler on the program, and part of the conversation for both candidates is continuity at the FTC and the SEC. You've actually spoken publicly about both already, but what is your latest position on those two roles, Lina Khan and Gary Gensler, if Kamala Harris were to continue in the White House, given their impact on the technology industry?

Speaker 3: Well, let's see. First, since I get asked a lot about Lina Khan, I've spoken more about Lina Khan. One misconception to clear up: I've never, ever talked to Kamala Harris or anyone in the White House about Lina Khan. This is entirely a press set of stories where I'm asked about my point of view. And I think she's done a very good job on a number of things: the non-compete stuff, the click-to-cancel rule for subscriptions, a number of things on M&A. But speaking as a partner at Greylock and as a Silicon Valley person who invests in companies trying to create the next generation of large tech companies, her waging war on M&A is actually unhelpful to the tech industry. It accomplishes something different from what she thinks she's accomplishing. She thinks she's limiting the growth of the large tech companies, and actually, in fact, she's causing a lack of funding for any of the companies that might potentially compete with them. And so that's a different thing. That's the Lina Khan question, and I still have the same point of view whenever anyone asks me about it from an expertise point of view.
I will continue to say that. On Gary Gensler, I think the question, and I know Gary, I've talked to him since the MIT days, is how do you shift the approach to crypto so that it's less hit-them-with-a-stick and more here is the set of rules we could set out that would create beneficial technological investment that helps us position globally. And I think it's been uneven. So I would hope that Vice President Harris, who grew up within the California tech community, understands the importance of this kind of rule-of-law, equal-basis innovation economy. She has talked about innovation opportunities. She's the only presidential candidate in US history who's spoken about the importance of founders. And so I hope that she would bring a more innovation-forward point of view here, and I would anticipate that.

Speaker 4: An innovation-forward perspective: is that the key difference you anticipate if, hypothetically, we had a Harris administration, a difference from the current Biden administration? Or where else might she differ?

Speaker 3: Well, having talked to her over the years, she has always brought a certain intellectual curiosity to what's going on in the tech industry: how it can help everyday Americans, how it can help the rest of American industry. It's a recognition that the tech industry is about innovation and building new things, new products and services, but also that there are ways those can help all of America. And those are the questions that she asks. Now, Biden has done a number of great things in his presidency.
He's controlled post-COVID inflation better than any other first-world country, and I think he has done a number of very good things, but he has not been as engaged with the tech industry, whereas Vice President Harris, including in her very acceptance speech, has been. So I think it comes down to how we engage in the conversation. Look, for example, at what she did in the artificial intelligence regulation and executive order process, which she ran. The first thing she did was bring the companies into the White House to say, hey, we really need you to put your best foot forward on safety and regulation; what kinds of things can you voluntarily commit to? I talked to the folks who were there, and she pushed them very hard. Then she looked at: okay, this is what they can commit to, here's what we can put into an executive order, and here's what we can do to monitor what happens on an ongoing basis and leave room for innovation. That's what I expect of her approach across the entire technology industry, with AI as the instance of it.

Speaker 4: Let's go to the other hypothetical, because she's engaging at the moment with the tech community, but many would say that Trump is very much engaging with the technology community too, given the amount that he's having Elon Musk on stage. If we see Musk taking a bigger role within a future administration, what does that mean for you, and ultimately for the relationship that Silicon Valley has with Elon Musk?

Speaker 3: Well, I think it depends on what the Trump administration, and what Elon, does. Trump's first presidency showed a certain amount of grifter capitalism: trying to penalize his opponents using the instruments of state.
Obviously we saw that happen with Amazon, which, no inside knowledge, probably correlated with Bezos forbidding the Washington Post from issuing a statement of support. And so if you're going to be retaliatory toward individual companies, which is part of what might underlie a tariff regime, where I can choose whom I'm handing out favors to, that does not create a good, stable environment for investing in an overall industry and ecosystem. So then the question would come down to Elon. Obviously, Elon's businesses have great ties to government: the whole space business is deeply tied to regulation and government contracts, and automobiles are also tied to a bunch of regulatory frameworks. Obviously what's going on with media and information has ties there too, and I think, unfortunately, X.com is the primary supporter of conspiracy theories and other kinds of things, which is a serious problem. Given all of that, the question is: all right, does it continue to be grifter capitalism, which is calling out specific companies, or is it rule of law?

Speaker 2: Reid, on X and conspiracy theories: Bloomberg did a deep report, a data-based piece of investigative journalism, on that recently, and in that segment on the show, which you can find on YouTube, we gave X's point of view, which is that they disagree. And if you'll allow me: I've invited Elon Musk onto this program many times, and you are here and he's not. Something's different in this election. You're a well-known person in the world of technology, but there are others on both sides who are speaking out, in my experience, more prominently than previously.
Have there been any negative consequences of that? For example, if you are a venture capitalist or a CEO, has publicly speaking about your political orientation and your support of a candidate resulted in a loss of opportunity, in deals or whatever it may be?

Speaker 3: Well, I've seen some, and in something this fraught, I don't think it's just retaliation in business circumstances from folks who are supporting Trump. When VCs were speaking out, I saw some LPs who were very supportive of Trump, not at Greylock, but at other firms, withdraw their support from those venture firms. I've seen other cases where business relationships have soured because of it, which I think is very unfortunate. I actually think part of what we should aspire to as Americans and as businesspeople is to say, hey, look, what's the thing we do to make American business and American industry better? And I've heard of Republican senators trash-talking me internationally, when you'd think they'd be supportive of American businesspeople. So I think there are consequences, which is unfortunate. But I think it's more important to be patriotic about what is good for America, American industry, and American society, even when you face that kind of pushback.

Speaker 4: Let's talk about the souring of certain relationships. The first one that comes to my mind is the souring of the relationship between Elon Musk, who was a co-founder of OpenAI, and OpenAI, now more explicitly in its new form.
Have you spoken to Sam Altman, and indeed OpenAI, about the worries around Elon Musk, who seems to have a dislike for the business? If he took a more prominent role within the administration, what would it look like for AI more broadly to have Elon Musk in a position of power?

Speaker 3: Well, one of the things I would worry about if Elon took a more prominent role is that he would view the only real success for the AI industry as the part he is doing himself. This is part of the question around the fact that he's bringing these baseless lawsuits against OpenAI, which amount to: well, I gave money philanthropically, and I should get a cut of what then turns into a commercial venture, which is, by the way, illegal. So that kind of issue might then rear its head in this kind of tradition of Trump's grifter capitalism. I would hope that if Elon were to accept a role, he would say, look, what I care about is all of the tech sector of American industry, not just my own efforts. And I think we'd have to see how it would play out then.

Speaker 2: The latest on the litigation between the two, I believe, was also on the eighth, when OpenAI accused Elon Musk of harassment in that legal fight, and again, we've covered it on this program, Reid, and we'll go back to it. Is AI, and the regulation of AI, being discussed enough in this election by either candidate?

Speaker 3: Well, I'm not sure that it's the most central thing for general American society at the moment. Look, I think how we get the benefits of AI for American industry, American society, and American consumers is really important.
And one of the things about regulation is that it tends to slow down innovation; it tends to delay what I think are our tools for getting a medical assistant on every smartphone and a tutor on every smartphone, for every American and maybe even more globally. I think those are the important things to be trying to get to. And that's less a discussion about AI in the election and more a matter of: we should be building toward the future.

Speaker 2: Reid, if former President Trump were to get a second term in office, how are you preparing for that, and how would you approach it were that to be the case?

Speaker 3: Well, if that's the case, then I would hope that the folks who have argued to me that you shouldn't actually take President Trump's words seriously, that declaring something like a tariff wall, where you can do individual grifter capitalism, is not actually what he's going to do, that he's going to try to be more business-forward, more technology-forward, and that you should not be paying attention to what his words are, I would hope that they would be correct. If it ends up being what I fear, which is, in fact, individually squeezing various businesses for his own political favor, I think that will affect American industry in a way that will affect American prosperity. So you have to build in more resilience to chaos and uncertainty, and that's what you would do. I'm obviously very hopeful that Vice President Harris will win the election, and so I haven't done enormous contingency planning yet.

Speaker 4: I want to thank you so much for your perspective today, as you do indeed back Vice President Harris for the future administration. Greylock partner Reid Hoffman, I appreciate it.