Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio.

Speaker 2: The prosecution and defense have rested their cases in the trial of Sam Bankman-Fried, who's accused of masterminding a multibillion-dollar fraud at FTX. Bankman-Fried spent two and a half days on the witness stand, testimony that's crucial to his hopes of avoiding a conviction and decades behind bars. But during his last hours on the stand, he struggled through a withering cross-examination by the prosecutor. Joining me is Bloomberg legal reporter Bob Van Voris, who was in the courtroom for his testimony. Was SBF able to establish any part of his case for the jury?

Speaker 3: Well, he did get up on the stand. I think he was able to humanize himself in front of the jury. I think he was able to give his perspective on a lot of transactions that prosecutors had tried to make look very sinister. And I think he was successful in giving his own perspective to some of the things that went on at FTX and Alameda Research. But I do think that on cross-examination yesterday, he gave back some of the gains he had made on direct testimony. Just inevitably, the prosecution asked him the difficult questions, and the questions where they had evidence that maybe countered or undercut some of the points that he was trying to make on direct. It's difficult to know whether the decision to go on the stand and submit himself to the hours of questioning was a good one, whether he benefited more than he suffered. But the question of whether he provided enough testimony to get over the hump of countering the story that was told by the three cooperating witnesses, the former executives at FTX and Alameda, it seems like he probably wasn't able to do that.

Speaker 2: So there was this picture of him in the opening statements, where the prosecution painted him as a criminal mastermind and the defense painted him as a hapless executive. Which picture do you think fits him better at the end of the day?
Speaker 3: It's difficult to say, because there was a balance between the two. Certainly, he presented himself as somebody who was too busy running FTX to be able to pay attention to what was going on at Alameda Research. Prosecutors claimed that Alameda Research borrowed billions of dollars of FTX customer money and used it for all kinds of things, including expensive real estate and venture capital investments. However, that was countered by his very sort of precise style of speaking, his understanding of many, many aspects of his business in great detail. And so he had to sort of walk a tightrope of, on the one hand, not knowing what was going on, but on the other hand seeming like somebody who generally did know everything that was going on.

Speaker 2: Did he make any really damaging admissions during cross?

Speaker 3: He did admit to elements that the prosecutors already knew, that the prosecutors were already able to prove with documents and through previous testimony. There were some things that would have been very hard for him to deny, and so he did in those instances. But I think perhaps one of the most damaging aspects of his testimony was when he was asked the very difficult questions. He would become evasive, he would quarrel with the questioning. I don't think he was able to credibly answer questions that were directed towards the most damaging evidence against him in other people's testimonies.

Speaker 2: And when he testified about the people who flipped, did he appear antagonistic?

Speaker 3: No, not at all. He testified fairly gently about his former friends and co-executives at FTX and Alameda, particularly Caroline Ellison, his former girlfriend, who he put in her job as CEO of Alameda. He was very understanding, complimentary in his direct testimony, but he certainly had to come around to blaming her for not hedging Alameda's risk at the time when he wanted her to do that.
Also for not being able to sort of keep tabs on the amount of money that Alameda was borrowing from FTX. So again, for him, it was kind of a tightrope. He needed to blame her. But she was somebody who I think presented a very sympathetic face to the jury, and I think they believed her testimony. So it was very difficult, I think, for SBF to do both of those things at the same time.

Speaker 2: Were the jurors paying close attention to him at all times? Did they get lost in this cross-examination that at times seemed painstaking?

Speaker 3: There were certainly times in the testimony, when he was covering technical aspects and when he was talking about financial transactions that were maybe not quite as accessible to the jury, that they seemed a little more disengaged. But when he was talking about the most important parts of his testimony, and particularly on cross-examination by Assistant US Attorney Danielle Sassoon, she was very focused in her questions. She was just relentless in coming back at him and asking questions, following up on logical implications of what his testimony was. She had him, particularly this morning, really kind of had him on the ropes with her, you know, questioning over and over, and the jury was paying attention to that.

Speaker 2: Did he come across as sympathetic or friendly, such that one juror or two might think, I don't want to send this guy to jail?

Speaker 3: I think for sure it made sense for him to take the stand, you know, being able to connect to the jury at least on some level. Before he took the stand on Friday, they had not heard a word from him. They'd just seen him sitting at the defense table, watching, just like they were, everything that was going on. So I think he was able to sort of present himself sympathetically to some extent. But I'm not sure he was able to break through and counter the prosecution case, which is what he needs to do to be able to be acquitted here.
Speaker 2: That question will be in the hands of the jury probably by Thursday, with closing arguments tomorrow. Thanks so much, Bob, for that look from inside the courtroom. That's Bloomberg legal reporter Bob Van Voris. It's the biggest US antitrust case since the Justice Department went after Microsoft twenty-five years ago. The government has spent six weeks presenting evidence in its case, contending that Google pays off tech companies to lock out rival search engines to smother competition and innovation. But now it's Google's turn, and the defense put on its star witness on Monday, Google CEO Sundar Pichai, who defended his company's practice of paying Apple and other tech companies to make Google the default search engine on their devices, saying the intent was to make the user experience seamless and easy. Joining me is antitrust expert Harry First, a professor at NYU Law School. The government spent so much time setting out its case. Do you think they drew a strong enough line between Google's actions and measurable harm to consumers?

Speaker 4: Well, that's a great question, because it isn't clear, as you were saying, what that through line is to consumers. And one of the problems in the tech cases is that, by and large, a lot of consumers like these companies, you know, become reliant on Google for searches and, you know, just use it all the time. It's become a verb. So that's a hard task, and legally not necessarily one the government has to carry. What the government has to carry is that Google has improperly excluded competition, that it has a monopoly and has excluded competition and thereby maintained it, and excluded it by improper means rather than by competition on the merits. So that's really the task. The task isn't to show that we get bad searches or that our search results are littered with ads or any of those things. It's really to show that Google has made a strong effort to sort of cement itself in as the leader.
You know, all paths are going to lead to Google, and that it's done that through anticompetitive means, rather than by competing with simply the best product.

Speaker 2: Do you think the government has gotten close to that?

Speaker 4: Well, looming large in all of this is the amount of money that Google paid for these defaults. And, you know, sort of the easy argument, or the direct argument is, if you're so good, why did you pay twenty-six billion dollars to ensure that consumers use your product? Why not just put it in front of them, and consumers will say, hey, yeah, I like that one. And, you know, what are you getting for your twenty-six billion dollars? And I think that's a big hurdle for Google and a big plus, I think, for the government. You know, companies do rational things. They don't throw money away for nothing. So what do they think they're getting? Well, something they can't get through competition on the merits, which is cementing in that loyalty. Because defaults are really important in consumer behavior.

Speaker 2: Does it seem as if Google has learned from the lessons of the Microsoft case?

Speaker 4: That is a question tinged with irony. So Google started out by saying, you know, we're going to be the good company, we're not going to be Microsoft. And in fact, they took advantage of the remedy decree in the Microsoft case to make sure that, as Windows was being updated and new versions were coming in, the Microsoft search engine, which was not called Bing at the time, would not be cemented in as the default in Windows, so that consumers would be able to get to Google and choose it. So they were concerned, to be sure, and at one point there was some interaction with the District Court that was, you know, supervising the decree, to make sure that Microsoft didn't use or build in a default that would exclude Google. So they understand the value, and they understand competition, and they wanted to get their product in front of consumers to use.
But as time went on, as happens, you know, monopoly is a heady thing. So having cemented their position, they wanted to keep it, and keep it by whatever means.

Speaker 2: So here we are. Something that they might have learned from the Microsoft case: people are pointing to the importance of not having a paper trail. The Justice Department has implied that Google automatically deleted messages for that reason. How important do you think that is to the judge who's going to decide this case?

Speaker 4: You know, I think in the end this isn't going to sway the judge particularly one way or the other. I think, you know, the government's tried to say that, you know, particularly early on, Google was very concerned about the language used in emails, and, you know, not saying things like we have to cut off their air supply, like one of the Microsoft people allegedly did, you know, with regard to Netscape, its nascent competitor at the time. So they were aware. And, you know, maybe this was done intentionally, maybe it wasn't, who knows, but this doesn't look great. I think in the end this is not going to be determinative. I think there are important legal issues that are going to be more important for this judge.

Speaker 2: One disagreement in the case has been over a search engine's quote scale, the amount of data it collects from websites and users. So explain the scale argument.

Speaker 4: Well, this is a very interesting argument. I think it's one of the important arguments in the case. And in some ways it cuts two ways. So the government says, you know, why is it important for Google to keep it as a default and to, you know, have all paths lead to Google? Well, obviously it's important for revenue and, you know, for their advertising revenue.
But more important, the more searches they get, the more they build up their base of searches, of information about people, about things, about all sorts of things. And this enables them to constantly update the search function and makes the search engine in some ways better and better. And what they're trying to do, basically, is make sure that other search engines never get that scale, never get to be of a size where they can have the base of data that will enable them to provide searches of a quality that equals Google's. So it explains Google's motivation for wanting to keep Bing small, or other competitors small. On the other hand, it does play into the notion that bigger may be better. And, you know, how far does the scale go? Would Google be even better if it had all the data in the world and there were no other search engines? So the government doesn't want to quite say that. Now, Google, on the other hand, doesn't want to quite say that scale is the reason why they're so good. So they want to downplay the scale argument a little bit and say, no, no, no, we're not trying to deprive them of scale. The reason why we're so good is what we do with the data. It's not just having it. It's all the money we put into the engineering, of figuring out the best algorithms, the best way to use the data. So it isn't just a question of scale; that's not the thing that keeps us better. So it's sort of an argument that's fun in a way. Google could use the scale argument to say, aha, we're better off with monopoly. But that's a hard legal argument to make. There's room in the law for saying that, by the way, but it's not the best argument to put in front of a judge. And the government doesn't want to say scale is so important that we're better off with a monopoly. So it's an argument. And here's where I'm not quite sure the testimony shows us what's true.
It's not clear where the scale economies end, and so I'm not sure. Maybe Google doesn't know. And this has been true in other cases. Facebook, for example, you know, how big does Facebook have to be? Amazon? For any of these companies that rely on data, are there points where, you know, economies of scale just diminish and getting bigger doesn't necessarily make you better? In manufacturing, you know, that's a commonplace. But for data, not so clear.

Speaker 2: Some internal Google emails show that executives were mindful of avoiding keywords like market share in their records. Is this case about market share, or does scale take the place of market share?

Speaker 4: Well, the answer is not either/or.

Speaker 2: So I'm wrong on both counts.

Speaker 4: No, you're right on both counts. It is about market share. So for the law, you have to show two things to violate Section two of the Sherman Act, which is the monopolization section. One is that you are a monopolist, that you have monopoly power in a relevant market. That's the language. So that's market share, not only market share, but that's a key determinant of whether you're a monopolist. So government exhibits show Google with eighty-nine to ninety-some percent of the market, Bing with three percent. So market share is the first indication that you have monopoly power. You have most of the market. So market share is important. And I can understand, if you were counseling, you know, Google, you would say, look, don't say, great, our share is as high as it can get. You know, it just doesn't sound good. But of course, in the end, this doesn't depend on, you know, some emails claiming it. It depends on the data. And you can find, you know, the exhibits that the government has put together, they're online, which show very high market shares based on data put together by, you know, other companies like StatCounter. So that is the first thing, market share.
Scale goes to this strategy question of, you know, why they're trying to, what they're going to gain by keeping others small, you know, other than a lot of money, which is important, but, you know, this competitive edge that is going to be hard to overcome.

Speaker 2: Also interesting is the focus on artificial intelligence. The Department of Justice says that Google was way ahead in generative AI and chose not to release the technology sooner because of fear of losing its monopoly on search.

Speaker 4: I haven't seen that as a dominant argument. That's an interesting argument, hard to, you know, it is an argument about monopolists, that they control the pace of innovation, and they don't want their current products cannibalized by new products that will only, you know, take away market share and not help them. So in one sense, that could very well be AI. And, you know, major economists have argued this point in terms of the ability and willingness of monopolists to innovate. Now, at this point, you know, there may be other explanations for why they would be cautious with AI. And everybody seems to be jumping into the AI race, and they may want to turn it the other way. You know, don't worry so much about Google, because, you know, Bing is featuring ChatGPT, and they've made this big investment in artificial intelligence, and, you know, their searches are going to be fantastic. You know, that competition is just coming tomorrow.

Speaker 2: A contrast between this trial and the Microsoft trial seems to be the testimony of the CEOs. Apparently, Sundar Pichai took the stand and was cool, calm, and collected, unlike what we saw from Bill Gates so many years ago. He talked about that search deal with Apple and said, we fiercely compete on so many products, and that the meeting when they decided that was tense at times, that we continue to have moments of tension between the companies. Is that enough? I don't know.
Speaker 4: Well, nice try. Yeah, I'm tempted to, you know, the old point: when the elephants dance, the fleas get crushed. So, I don't know. Yeah, I'm sure that they are in some way frenemies, as the word might be. These major platforms have points of competition and points of cooperation. But in some sense that doesn't matter all that much. What matters is this point of cooperation on search, and the payments, which, you know, also acted to dissuade Apple from developing its own search engine. So, you know, they could cooperate on a lot of things, which isn't necessarily the greatest thing in the world, or compete on a lot of things, but the focus here is on search.

Speaker 2: I saw this analysis, and I want to pass it by you: that much of the Justice Department's case is based on documents, emails, and other records from Google itself, while Google's case so far seems to be executives testifying and contesting the conclusions from those records.

Speaker 4: So they'll have their experts, you know, their economists. They've got a computer scientist on now, you know, they'll have that. But sort of a standard way these trials seem to be going in recent years is the government has experts talking about the industry, how persuasive or not, they're experts talking about the industry, and sometimes industry experts. But that's often counterweighted by people from the companies themselves. And this is often the case in mergers, for example, you know, high-level executives who are very persuasive and come in and do often convince judges that what they've done is, you know, rational business behavior, you know, done not to exclude competition but to advance competition. And the people who rise in these companies are often very skilled, you know, communicators. So in the Microsoft-Activision case, that merger case, you know, there were high-level Microsoft executives testifying. Now, there are also executives testifying for the government here, so it makes a bit of an interesting contrast to most merger cases.
So you did have Microsoft testifying on behalf of the government.

Speaker 2: And Pichai seemed to be one of those good communicators. He started his testimony talking about his childhood in India, then studying at Stanford University and joining Google in two thousand and four as a product manager for Google Toolbar.

Speaker 4: Yeah, well, it would have been better for the government if they had a good villain. You know, these stories are always better with villains. So Bill Gates was a great villain, and Mark Zuckerberg would be a great villain. But, you know, for Google, no villains really in that sense. So, yeah, the government doesn't have that part of the narrative.

Speaker 2: Is it too soon to tell which side has the better case so far?

Speaker 4: Hard to say. I think there were issues that the judge wrote about. There was a motion made by Google, and I think by the government, to end the case before trial, a motion for summary judgment, and the judge addressed some of the issues that he felt were in contention at trial. And these are legal issues. So, you know, I think it remains to be seen how the judge is going to decide that. And he's pushed, as I read the reports of the trial, you know, pushed some of the witnesses on these issues, particularly whether it really makes a difference, would have made a difference in market shares, in Google's position in the market, or in Bing's, if these defaults weren't there. And depending on the legal standard that he applies, this could be a very important issue. And it's not a behavioral issue so much as a, well, but for these defaults, what would the world look like?

Speaker 2: I believe the judge is going to make his decision in December, so we'll find out all this soon enough. Thanks so much, Harry. That's Professor Harry First of NYU Law School. Coming up next, the Executive Order on artificial intelligence. This is Bloomberg.
Speaker 5: We're going to see more technological change in the next ten, maybe next five years than we've seen in the last fifty years, and that's a fact. As the most consequential technology of our time, artificial intelligence is accelerating that change. It's going to accelerate at warp speed.

Speaker 2: On Monday, President Joe Biden signed an Executive Order on artificial intelligence that he says will make the development of AI safer for Americans. Among other things, it establishes standards for security and privacy protections and requires developers to safety test new models. Joining me is Reggie Babin, senior counsel at Akin Gump. How important is this executive order in light of the skyrocketing use of AI in recent months?

Speaker 1: That's a good opening question. I would say it is significant for two reasons. At a minimum, it serves as a clear signal to the world of what the President and this administration's priorities are as it relates to artificial intelligence. There's been a bit of a vacuum of sorts on substantive US leadership on the issue as it relates to AI governance, and now there is a clear, digestible set of principles and set of mandates that reflect where this administration would like to see the technology go. And then beyond that, there are those aforementioned mandates, particularly as it relates to companies training foundational models and cloud service companies that are providing services for foreign customers. There are new, previously unreported requirements in the EO that are likely going to have some impact on, particularly, those companies advancing the most advanced AI models, and how they go about training and reporting on the development of that technology. So it's both a huge political signal that will likely shape the global AI governance conversation, but it also has some fairly significant implications for the domestic AI development process and the US's desire to maintain its global lead in that space.
Speaker 2: Is it difficult to make rules or, you know, suggestions, even in an area that's still developing?

Speaker 1: Yeah, it is. It's difficult, though not impossible. The challenge is, one, how do you balance concerns about safety and reducing the risk of whatever harms may be perceived or feared, while maintaining the flexibility and the dynamic space needed to lead to the types of innovations that we all want to see. And then on the second piece, if you take as a given that the federal government by design moves much more slowly than industry normally, and certainly much more slowly than this exponentially advancing technology, how do you write rules that are both effective but also flexible enough that they are still relevant in eighteen months, when the technology has advanced beyond where it is now? So it's a constant challenge, trying to regulate in a way that allows for continued innovation without undue risk, but also trying to do so in a way that takes into account the rapidly changing nature of this particular space, which is unique insofar as we're moving at an exponential pace while trying to regulate through a system that's designed to move at something closer to glacial.

Speaker 2: So let's go through a couple of areas of concern, and you tell me how the Executive Order addresses each. A big concern has been privacy.

Speaker 1: So this is one of the areas where it's a bit more limited. There are calls for more stringent practices as it relates to the federal government's handling of data of American citizens. But the EO is also accompanied by a call from the President for Congress to pass comprehensive privacy legislation, which I think is indicative of how much more congressional action is needed to see the type of significant movement in this space that some have called for.
And so they're attempting to balance the ability to move forward with existing powers with the acknowledgement that more authority is needed from Congress if we're actually going to see comprehensive privacy movement in a way that has not necessarily been experienced to date.

Speaker 2: Something that I think people can relate to, because they've seen it: Biden said that he'd watched deepfakes of himself speaking and marveled at it, saying, when the hell did I say that? They're asking the Commerce Department to develop standards here?

Speaker 1: Yeah, and content provenance is one of the big areas that is being hotly debated in Washington now. It's how do we allow for the type of creative freedom that these technologies are going to provide, while also acknowledging that the potential for widespread dissemination of potentially misleading information, whether it be images, voice, or text, has the power to be significantly disorienting in a democracy that relies on the transfer of reliable information from person to person. So there's a request for the Department of Commerce to come up with standards, or guidance on how to develop standards, as it relates to managing quote unquote deepfakes or AI-produced content. But this is an area where we're probably at the beginning stages of getting the federal government's arms around how to address the concerns that have been expressed and some that are anticipated.

Speaker 2: Yeah, guidance on guidance, it sounds like. So now there are also concerns about workers. I don't know how many workers have been displaced by AI already. They directed the Department of Labor to try to do something with that.
Speaker 1: Yeah, and to your point about not knowing how many have been displaced already, this is an area where I think there's anticipated displacement and an attempt to get out ahead of it, to avoid the kind of disorientation you can see in the market, and as a result society, with people potentially being pushed out of jobs that are automated very quickly. There's also a need to ensure that we have the type of skilled labor force in America that we need to actually maximize our ability to lead on these technologies. So there's both a request for the Department of Labor to work with the private sector to publish best practices on how to mitigate AI's harms to employees, but also to solicit information on how and where we can increase the flow of immigrants with advanced skills, to ensure that we have the workforce on hand to continue to innovate. So it's a bit of a double-edged sword, where we need more workers to advance technology, but we also need protections in place to protect from undue harm to the existing workforce that we have.

Speaker 2: Tell me about how discrimination occurs in hiring systems driven by AI. That seems to be a concern.

Speaker 1: Yeah, it's a concern essentially rooted in the nature of the data we have, right? Like, AI systems operate on repositories of existing data, and to the extent that data flows from systems wherein discrimination has occurred previously, there is a risk that by automating on that arguably discriminatory data, we increase the risk of future discrimination, by basically training the systems to act in the way that previous human actors or previous automated systems have. And so there's a challenge wherein you're trying to train on the best available data, but also trying to acknowledge where that data may have built-in biases, and then trying to figure out technologically how you can innovate around those existing, potentially discriminatory outcomes.
And it's a challenge in lending, it's a challenge in law enforcement and criminal justice. It's one of the bigger, I would say, social challenges that particularly this administration, both through its AI Bill of Rights and now through the order, is trying to get its hands around.

Speaker 2: Tell me about concerns about national security. And, I don't know, do national security and cybersecurity go hand in hand or not?

Speaker 1: Yeah, I would say national security and cybersecurity, and national security and economic security, go hand in hand. This is an issue, like many in Washington at the moment, that is colored in large part by questions and concerns around the US's relationship, particularly with China, and the desire to ensure we maintain an edge in innovation, development, and deployment of AI technology as it could relate to numerous uses, including national security specific uses, both in terms of offensive capacity but also the ability to defend against potential malicious cyber activity that could be supercharged using automated systems. And so there's a cybersecurity component, and the need to ensure that we have technological capacity to defend against beefed-up cyberattacks. And there's also the need to stay out ahead as the global leader in this technology, to ensure that we're able to incorporate it in whatever ways are deemed necessary to maintain advantages across a number of fields related to security.

Speaker 2: Is there anything else in this order, which I know is pretty long for an executive order?

Speaker 1: Yeah, no, I would say as far as executive orders go, it's massive. There's a lot in there. It reads as equal parts a document of principles and, again, a document of mandates. And I think there are particularly two important mandates included that are going to be discussed and deliberated ad nauseam over the next few months as the Commerce Department gets its stated rules in place.
And so one is the requirement for companies training foundational models, or frontier models rather, to report to the Commerce Department on the development of those models. And then the second would be a know-your-customer-style requirement for cloud service providers who are providing services for foreign customers training similarly powerful frontier models. And it's not exactly clear what the contours of those rules and requirements are going to be. There's a pretty tight turnaround that's required by the EO. There are standards that still have to be developed in order to give those requirements the kind of clarity, and teeth frankly, that they're going to need to be effective. And there's the ongoing conversation as to whether and to what extent that information will be used to dictate the development and deployment of these types of systems. And so I think those two are particularly interesting components, given that they are requirements and mandates, rather than asks, for reporting and potentially collecting information for future actions.

Speaker 2: Is congressional action really needed in this area, and how close are we to that?

Speaker 1: Well, yeah. I think at a minimum, you've seen the President say yesterday that congressional action would be needed on privacy, as the administration sees fit. But also, federal mandates typically require federal funding, and Congress still has the power of the purse, and so at a minimum you would expect the need for some type of congressional appropriations in order to ensure that some or all of these policies are able to be achieved successfully. There's also the need for potentially federal investment to ensure that we are able to lead the world in cutting-edge R and D. There have been some calls for as much as thirty-two billion dollars in annual federal investment by twenty twenty-six, I believe it is, from the National Security Commission on AI.
575 00:35:21,400 --> 00:35:24,080 Speaker 1: So the President, as I understand it, is meeting with 576 00:35:24,239 --> 00:35:26,839 Speaker 1: Majority Leader Schumer and the bipartisan group of Senators who 577 00:35:26,840 --> 00:35:29,880 Speaker 1: are working on this issue today to discuss their continued 578 00:35:29,880 --> 00:35:32,440 Speaker 1: push for legislation, and in all of its messaging yesterday, 579 00:35:32,480 --> 00:35:35,640 Speaker 1: the administration signaled that it wants to continue working with 580 00:35:35,680 --> 00:35:38,280 Speaker 1: Congress to try to move legislation. And so I think again, 581 00:35:38,360 --> 00:35:41,880 Speaker 1: this is an important document insofar as it includes some critical mandates. 582 00:35:42,239 --> 00:35:44,759 Speaker 1: It leverages the federal government's purchasing power to try to 583 00:35:44,800 --> 00:35:49,279 Speaker 1: shape the domestic market. But it also necessarily signals that 584 00:35:49,320 --> 00:35:51,759 Speaker 1: there are some limitations in existing authorities and a need 585 00:35:51,800 --> 00:35:54,719 Speaker 1: for additional congressional action in order to continue to move 586 00:35:54,760 --> 00:35:57,560 Speaker 1: the American AI governance infrastructure forward. 587 00:35:58,120 --> 00:36:00,040 Speaker 2: So I read that the US has set aside, I 588 00:36:00,040 --> 00:36:03,359 Speaker 2: believe, one point six billion in fiscal twenty twenty three 589 00:36:03,440 --> 00:36:05,400 Speaker 2: for AI. What does that go toward? 590 00:36:06,080 --> 00:36:09,000 Speaker 1: Mostly R and D through DoD and NSF, as I 591 00:36:09,080 --> 00:36:12,160 Speaker 1: understand it. There are probably other pots, but those are 592 00:36:12,200 --> 00:36:14,759 Speaker 1: the two at the top of my mind. Most of 593 00:36:14,800 --> 00:36:18,200 Speaker 1: what we do at the federal level is research and development, 594 00:36:18,360 --> 00:36:23,200 Speaker 1: advancing standards, and basically setting the stage for private 595 00:36:23,200 --> 00:36:26,200 Speaker 1: sector innovation. But again, there are experts who believe that 596 00:36:26,360 --> 00:36:29,719 Speaker 1: considerably more investment is needed at the federal level to 597 00:36:29,840 --> 00:36:32,000 Speaker 1: augment what the private sector is able to do and 598 00:36:32,080 --> 00:36:35,160 Speaker 1: also to invest in the types of public resources that are 599 00:36:35,160 --> 00:36:37,720 Speaker 1: going to be necessary to make sure that a wide 600 00:36:37,800 --> 00:36:41,880 Speaker 1: swath of researchers, academics, and others are able to access 601 00:36:41,880 --> 00:36:44,799 Speaker 1: the best technology, which now is very capital intensive and 602 00:36:45,239 --> 00:36:48,480 Speaker 1: certainly not widely available to all who would want to 603 00:36:48,960 --> 00:36:51,720 Speaker 1: conduct research or pursue different types of innovation. 604 00:36:52,719 --> 00:36:54,760 Speaker 2: And is this issue gaining a lot of traction 605 00:36:55,120 --> 00:36:55,760 Speaker 2: in DC? 606 00:36:56,560 --> 00:37:00,920 Speaker 1: Yeah, I think it's important to state explicitly how significantly 607 00:37:01,600 --> 00:37:04,279 Speaker 1: this issue has taken over Washington in my time here.
608 00:37:04,360 --> 00:37:06,080 Speaker 1: The only thing that I could frankly think of as 609 00:37:06,120 --> 00:37:10,279 Speaker 1: even close to it is the COVID response, and that was, yeah, 610 00:37:10,280 --> 00:37:12,799 Speaker 1: it's really, I mean, you've had, I think, over two 611 00:37:12,800 --> 00:37:15,160 Speaker 1: dozen hearings in Congress at this point across a number 612 00:37:15,160 --> 00:37:18,000 Speaker 1: of committees that do not typically share jurisdiction. You have 613 00:37:18,520 --> 00:37:20,960 Speaker 1: this executive order that was released yesterday. You have the 614 00:37:21,000 --> 00:37:24,600 Speaker 1: Senate effort that's ongoing. It's captured the imagination in a 615 00:37:24,600 --> 00:37:30,000 Speaker 1: way you don't see, again, absent a significant, potentially world-historic emergency that 616 00:37:30,120 --> 00:37:33,880 Speaker 1: is ongoing. It is not normal for Congress, Congress in particular, 617 00:37:33,920 --> 00:37:35,960 Speaker 1: but the federal government in general, to spend this much 618 00:37:35,960 --> 00:37:39,000 Speaker 1: time and this much attention and to dive this deep 619 00:37:39,160 --> 00:37:41,640 Speaker 1: on a specific policy issue. And I think it just 620 00:37:41,680 --> 00:37:44,640 Speaker 1: speaks to, one, the breadth of the issue and the 621 00:37:44,640 --> 00:37:47,680 Speaker 1: fact that, frankly, advances in AI have the power to 622 00:37:48,080 --> 00:37:51,120 Speaker 1: impact nearly every aspect of our lives and our work, 623 00:37:51,239 --> 00:37:55,080 Speaker 1: but also the rapid expansion and the rapid deployment of 624 00:37:55,120 --> 00:37:58,000 Speaker 1: the technology, at least the perceived rapid expansion and deployment, 625 00:37:58,040 --> 00:38:00,759 Speaker 1: and how that has the ability to shock the system in 626 00:38:00,800 --> 00:38:02,960 Speaker 1: a way that can trigger political action in a way 627 00:38:03,000 --> 00:38:05,759 Speaker 1: that steadier progress may not. So it's just, I 628 00:38:05,800 --> 00:38:08,960 Speaker 1: don't want folks to take for granted that there was 629 00:38:09,000 --> 00:38:11,120 Speaker 1: a signing ceremony at the White House yesterday and it's 630 00:38:11,160 --> 00:38:14,560 Speaker 1: business as usual. This is a wholly unique issue, and 631 00:38:14,600 --> 00:38:17,600 Speaker 1: I have a very difficult time coming up with a 632 00:38:17,640 --> 00:38:20,919 Speaker 1: proper historical analogy, because it's not just rushing to try 633 00:38:20,960 --> 00:38:24,360 Speaker 1: to avert disaster, it's also rushing to try to ensure 634 00:38:24,360 --> 00:38:27,800 Speaker 1: we're able to maximize the upside effect. And that's a really, 635 00:38:27,880 --> 00:38:31,640 Speaker 1: really interesting dichotomy that you don't often see with something, 636 00:38:31,719 --> 00:38:34,719 Speaker 1: again, that covers nearly every sector of the economy and 637 00:38:34,800 --> 00:38:36,280 Speaker 1: every aspect of American life. 638 00:38:36,800 --> 00:38:40,200 Speaker 2: You think the lawmakers are afraid they'll be replaced by bots? 639 00:38:40,520 --> 00:38:45,919 Speaker 1: No, no, there's something so human about our system of democracy. 640 00:38:46,200 --> 00:38:48,560 Speaker 1: And frankly, I think that the AI systems may be 641 00:38:48,640 --> 00:38:50,759 Speaker 1: smart enough to not want to subject themselves to the 642 00:38:50,840 --> 00:38:54,239 Speaker 1: rigors of public service.
It takes a special kind of 643 00:38:54,320 --> 00:38:58,080 Speaker 1: person in particular, but I'm sure they are trying to 644 00:38:58,120 --> 00:39:01,520 Speaker 1: figure out how to best use the bots to ensure 645 00:39:01,560 --> 00:39:04,160 Speaker 1: they're able to better serve their constituents and, frankly, to 646 00:39:04,440 --> 00:39:06,600 Speaker 1: maintain office, which is the way that it's 647 00:39:06,600 --> 00:39:07,160 Speaker 1: designed to work. 648 00:39:07,200 --> 00:39:09,399 Speaker 2: Thanks so much for coming on the show, Reggie. That's 649 00:39:09,400 --> 00:39:12,799 Speaker 2: Reggie Babin, senior counsel at Akin Gump. And that's 650 00:39:12,840 --> 00:39:15,480 Speaker 2: it for this edition of The Bloomberg Law Show. Remember, 651 00:39:15,480 --> 00:39:17,600 Speaker 2: you can always get the latest legal news on our 652 00:39:17,600 --> 00:39:21,759 Speaker 2: Bloomberg Law podcasts. You can find them on Apple Podcasts, Spotify, 653 00:39:21,960 --> 00:39:27,000 Speaker 2: and at www dot bloomberg dot com slash podcast slash law, 654 00:39:27,400 --> 00:39:29,960 Speaker 2: and remember to tune into The Bloomberg Law Show every 655 00:39:30,040 --> 00:39:33,920 Speaker 2: weeknight at ten pm Wall Street Time. I'm June Grosso, 656 00:39:34,080 --> 00:39:35,680 Speaker 2: and you're listening to Bloomberg