1 00:00:03,200 --> 00:00:08,000 Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio. 2 00:00:10,480 --> 00:00:13,960 Speaker 1: Sam Bankman-Fried has taken a big gamble, taking a 3 00:00:14,080 --> 00:00:17,320 Speaker 1: stand in his own defense to try to convince jurors 4 00:00:17,360 --> 00:00:19,960 Speaker 1: that he didn't orchestrate one of the biggest frauds of 5 00:00:20,040 --> 00:00:24,079 Speaker 1: all time. During direct examination, the thirty-one-year-old 6 00:00:24,079 --> 00:00:28,280 Speaker 1: painted himself as a disengaged CEO who didn't code, didn't 7 00:00:28,360 --> 00:00:32,080 Speaker 1: oversee Alameda Research, and did little more than skim the 8 00:00:32,120 --> 00:00:35,879 Speaker 1: firm's terms of service. Prosecutors say that he directed the 9 00:00:35,960 --> 00:00:41,640 Speaker 1: transfer of FTX customer money into Alameda for investments, political donations, 10 00:00:41,680 --> 00:00:45,880 Speaker 1: and expensive real estate before both companies filed for bankruptcy, 11 00:00:46,400 --> 00:00:50,280 Speaker 1: and on cross-examination, the prosecutor confronted him with many 12 00:00:50,360 --> 00:00:53,959 Speaker 1: of his public and private statements that contrasted with his 13 00:00:54,000 --> 00:00:57,400 Speaker 1: story about his role in the collapse of FTX. Joining 14 00:00:57,400 --> 00:01:00,520 Speaker 1: me is Bloomberg legal reporter Ava Benny-Morrison, who is covering 15 00:01:00,560 --> 00:01:03,800 Speaker 1: the trial. How did he seem today during the direct 16 00:01:04,040 --> 00:01:05,840 Speaker 1: part of his testimony? 17 00:01:05,760 --> 00:01:10,199 Speaker 2: He seemed relaxed. He was using very simple explanations 18 00:01:10,240 --> 00:01:14,040 Speaker 2: to go through the final days at FTX, the relationship 19 00:01:14,080 --> 00:01:17,319 Speaker 2: between FTX and Alameda.
He'd gotten a warning from the 20 00:01:17,400 --> 00:01:21,000 Speaker 2: judge earlier that he shouldn't use sort of vague generalities 21 00:01:21,800 --> 00:01:25,400 Speaker 2: to explain his version of events. It wouldn't go well 22 00:01:25,440 --> 00:01:27,280 Speaker 2: with the jury. So it seems like he took that 23 00:01:27,360 --> 00:01:30,480 Speaker 2: on board. So he seemed pretty relaxed when he 24 00:01:30,600 --> 00:01:33,240 Speaker 2: was giving answers. He was talking directly to the jury, 25 00:01:33,480 --> 00:01:36,600 Speaker 2: and he seemed pretty comfortable. That demeanor seemed to change 26 00:01:36,600 --> 00:01:38,479 Speaker 2: a little bit under cross-examination. 27 00:01:38,959 --> 00:01:41,720 Speaker 1: Did the prosecutor come at him right out of the gate? 28 00:01:42,240 --> 00:01:45,679 Speaker 2: She did. And she went through dozens of comments he 29 00:01:45,760 --> 00:01:50,320 Speaker 2: had made before and just after FTX collapsed, comments on Twitter, 30 00:01:50,720 --> 00:01:54,920 Speaker 2: comments during press interviews, before Congress, even in internal Slack 31 00:01:55,000 --> 00:02:00,120 Speaker 2: messages with other FTX executives, and she asked him, did 32 00:02:00,160 --> 00:02:02,440 Speaker 2: you say this? Did you say words to this effect? 33 00:02:02,960 --> 00:02:05,920 Speaker 2: And at some point contrasted it with other comments he 34 00:02:05,960 --> 00:02:08,800 Speaker 2: had made, so trying to show the jury that there 35 00:02:09,120 --> 00:02:12,640 Speaker 2: may be holes in the versions of events that he has 36 00:02:12,680 --> 00:02:16,280 Speaker 2: provided so far. He seemed to be a little bit 37 00:02:16,720 --> 00:02:19,960 Speaker 2: combative and a little bit vague. He repeated again and again, 38 00:02:20,360 --> 00:02:23,600 Speaker 2: I'm not sure.
I can't recall. Or he would quibble 39 00:02:23,680 --> 00:02:27,000 Speaker 2: with the exact wording or the nuance of a comment 40 00:02:27,200 --> 00:02:30,519 Speaker 2: in a Financial Times article, or a New York Times 41 00:02:30,639 --> 00:02:34,040 Speaker 2: article, or even a Bloomberg article. So it was painstaking 42 00:02:34,080 --> 00:02:36,800 Speaker 2: at times, because she would ask him if he said something, 43 00:02:36,840 --> 00:02:38,680 Speaker 2: then he would dispute it, and then she would pull 44 00:02:38,720 --> 00:02:40,880 Speaker 2: up the article and then put it to him again, 45 00:02:40,960 --> 00:02:42,840 Speaker 2: and then he would say, oh, I don't recall, I 46 00:02:42,840 --> 00:02:44,760 Speaker 2: don't necessarily agree with that context. 47 00:02:45,040 --> 00:02:48,520 Speaker 1: So what was the main thrust of her questions? What 48 00:02:48,639 --> 00:02:52,680 Speaker 1: areas was she trying to get into or disprove? 49 00:02:52,520 --> 00:02:56,200 Speaker 2: From the get-go, she definitely used his words against him, 50 00:02:56,280 --> 00:03:00,560 Speaker 2: really seizing on public statements he had made about what 51 00:03:00,639 --> 00:03:03,800 Speaker 2: the problems were at FTX and his reaction to it. 52 00:03:04,160 --> 00:03:07,720 Speaker 2: She also tried to paint him as a good storyteller 53 00:03:07,919 --> 00:03:12,640 Speaker 2: who was tailoring his testimony after months of reviewing the 54 00:03:12,680 --> 00:03:16,720 Speaker 2: government's evidence against him, also suggesting that he was tailoring 55 00:03:16,760 --> 00:03:20,120 Speaker 2: his earlier media interviews to serve his 56 00:03:20,240 --> 00:03:23,600 Speaker 2: case under direct examination.
He was asked why did you 57 00:03:24,160 --> 00:03:28,560 Speaker 2: go on this so-called apology press tour after FTX collapsed, 58 00:03:28,639 --> 00:03:30,720 Speaker 2: and he said he felt like it was the right 59 00:03:30,800 --> 00:03:33,040 Speaker 2: thing to do, and he felt like he had to 60 00:03:33,120 --> 00:03:35,080 Speaker 2: explain himself to the world. 61 00:03:35,600 --> 00:03:38,920 Speaker 1: You call the shots as CEO, the prosecutor asked. I 62 00:03:38,960 --> 00:03:41,600 Speaker 1: call some of them, he said. You think of yourself 63 00:03:41,600 --> 00:03:45,280 Speaker 1: as a smart guy? In many ways, not always, he said. 64 00:03:45,640 --> 00:03:48,360 Speaker 1: You thought highly of yourself? And he said, I did. 65 00:03:48,720 --> 00:03:50,360 Speaker 1: You can see what it's like reading it. What was 66 00:03:50,400 --> 00:03:53,800 Speaker 1: that like in the courtroom? Did it seem very antagonistic? 67 00:03:54,880 --> 00:03:58,440 Speaker 2: Not really. Well, most of her cross-examination was focused 68 00:03:58,440 --> 00:04:02,120 Speaker 2: on his public statements and pointing out any holes in 69 00:04:02,360 --> 00:04:06,520 Speaker 2: everything that he has said since FTX collapsed. She seemed 70 00:04:06,560 --> 00:04:11,160 Speaker 2: to thread through some characterizations of him, you know, asking 71 00:04:11,200 --> 00:04:13,960 Speaker 2: if he thought he was a good storyteller, if he 72 00:04:14,200 --> 00:04:16,920 Speaker 2: was a CEO who called all the shots, and if 73 00:04:16,960 --> 00:04:19,880 Speaker 2: he thought he was smart. He didn't seem to be 74 00:04:19,960 --> 00:04:23,400 Speaker 2: personally affronted by that or offended at all. He was 75 00:04:23,480 --> 00:04:26,880 Speaker 2: very deadpan when he was replying. In many ways, 76 00:04:26,920 --> 00:04:29,120 Speaker 2: yes, he thought he was smart. In other ways he didn't.
Yes, 77 00:04:29,160 --> 00:04:31,880 Speaker 2: he did think he was a good CEO. So it 78 00:04:32,000 --> 00:04:36,560 Speaker 2: was interesting to get those little characterizations into his testimony. 79 00:04:37,080 --> 00:04:39,800 Speaker 1: Where was she able to, you know, crack some holes 80 00:04:39,839 --> 00:04:40,880 Speaker 1: in his testimony? 81 00:04:41,360 --> 00:04:45,799 Speaker 2: So she scored a couple of concessions, namely that Bankman-Fried 82 00:04:45,960 --> 00:04:49,080 Speaker 2: made the decision on a number of very costly 83 00:04:49,160 --> 00:04:53,000 Speaker 2: venture investments at Alameda. She was really trying to home 84 00:04:53,040 --> 00:04:58,320 Speaker 2: in on how involved he was with Alameda day to 85 00:04:58,400 --> 00:05:02,280 Speaker 2: day, because previously he has tried to distance himself from it publicly and in 86 00:05:02,279 --> 00:05:06,000 Speaker 2: this trial as well. He definitely did that during direct examination, 87 00:05:06,880 --> 00:05:11,320 Speaker 2: trying to put himself at arm's length from other cooperating witnesses, 88 00:05:11,400 --> 00:05:15,240 Speaker 2: including Nishad Singh and Gary Wang, who, we'll remember, changed 89 00:05:15,480 --> 00:05:19,160 Speaker 2: FTX's code to give Alameda special privileges on the exchange. 90 00:05:19,360 --> 00:05:23,480 Speaker 2: He also sought to downplay Alameda's ten-billion-dollar liability to FTX, 91 00:05:23,680 --> 00:05:27,680 Speaker 2: saying at the time, yes, it warranted further analysis, but 92 00:05:28,120 --> 00:05:30,160 Speaker 2: if it was more, he would have been in crisis mode, 93 00:05:30,320 --> 00:05:33,920 Speaker 2: but he wasn't at the time. So during cross-examination, 94 00:05:34,800 --> 00:05:39,800 Speaker 2: Sassoon really tried to challenge that arm's-length characterization that 95 00:05:39,880 --> 00:05:42,200 Speaker 2: he had provided, and she did get a couple of 96 00:05:42,240 --> 00:05:42,640 Speaker 2: wins there.
97 00:05:43,400 --> 00:05:47,760 Speaker 1: How much did he blame Caroline Ellison or one of 98 00:05:47,800 --> 00:05:50,480 Speaker 1: the other two main witnesses for what happened? 99 00:05:50,560 --> 00:05:54,680 Speaker 2: It was very subtle, to be honest. Today, he 100 00:05:54,839 --> 00:05:58,320 Speaker 2: mentioned a meeting that he had with Caroline in twenty 101 00:05:58,360 --> 00:06:01,560 Speaker 2: twenty-two, and he had discussed the need for Alameda 102 00:06:01,920 --> 00:06:06,359 Speaker 2: to hedge to protect against market volatility at the time. 103 00:06:06,839 --> 00:06:08,760 Speaker 2: He said that when he brought this up to her, 104 00:06:09,040 --> 00:06:10,880 Speaker 2: she didn't do it initially, and then he brought it 105 00:06:10,960 --> 00:06:14,080 Speaker 2: up again during this meeting, and she started crying and 106 00:06:14,120 --> 00:06:18,000 Speaker 2: then offered to resign. So he didn't explicitly say this 107 00:06:18,200 --> 00:06:21,320 Speaker 2: was her fault and she's the one to blame, but 108 00:06:21,560 --> 00:06:24,440 Speaker 2: that was the tone of what he was saying. As well, 109 00:06:24,440 --> 00:06:29,200 Speaker 2: with Nishad and Gary, he mentioned during his direct examination 110 00:06:29,680 --> 00:06:33,000 Speaker 2: that yes, he directed them to change the code, but 111 00:06:33,080 --> 00:06:36,960 Speaker 2: he was not a software developer or a coder, and 112 00:06:37,000 --> 00:06:41,000 Speaker 2: he wasn't fully across exactly what they were doing, 113 00:06:41,040 --> 00:06:42,520 Speaker 2: and he trusted them to do the work. 114 00:06:43,279 --> 00:06:47,320 Speaker 1: There seemed to be a lot about backdating of documents. 115 00:06:47,360 --> 00:06:48,880 Speaker 1: Did he do a lot of backdating?
116 00:06:49,400 --> 00:06:55,240 Speaker 2: Allegedly. They asked him about a document that he had 117 00:06:55,279 --> 00:06:59,160 Speaker 2: signed apparently a year after it came into effect, and 118 00:06:59,200 --> 00:07:01,640 Speaker 2: Sassoon asked him whether this was something that he did regularly, 119 00:07:01,760 --> 00:07:04,960 Speaker 2: which he denied. We've heard earlier in the trial that 120 00:07:05,360 --> 00:07:10,160 Speaker 2: these documents were backdated, and quite frequently. This sort of 121 00:07:10,200 --> 00:07:15,400 Speaker 2: ties into Bankman-Fried's defense that he was relying on 122 00:07:15,760 --> 00:07:18,640 Speaker 2: the presence and the advice and the guidance of lawyers 123 00:07:18,640 --> 00:07:22,119 Speaker 2: within FTX and outside of FTX as well on matters 124 00:07:22,200 --> 00:07:22,480 Speaker 2: like this. 125 00:07:22,840 --> 00:07:25,760 Speaker 1: I mean, would you say he held up well? 126 00:07:25,520 --> 00:07:29,239 Speaker 2: During cross, Bankman-Fried didn't seem to lose his cool 127 00:07:29,320 --> 00:07:34,960 Speaker 2: during cross-examination, but the sheer volume of his public 128 00:07:35,000 --> 00:07:40,520 Speaker 2: statements was quite compelling, just to hear Sassoon lay them 129 00:07:40,560 --> 00:07:44,400 Speaker 2: out one by one, asking him firstly, did you say this? 130 00:07:44,680 --> 00:07:46,480 Speaker 2: And then he would quibble with it, or say he 131 00:07:46,520 --> 00:07:49,000 Speaker 2: didn't remember, or sort of give some vague answer, and 132 00:07:49,000 --> 00:07:51,080 Speaker 2: then she would pull it up. But the jury could 133 00:07:51,120 --> 00:07:54,600 Speaker 2: see it there, clear as day, in writing. I think 134 00:07:54,640 --> 00:07:56,040 Speaker 2: that was quite compelling. 135 00:07:56,560 --> 00:07:59,119 Speaker 1: So he's back on the stand tomorrow. Is the trial 136 00:07:59,160 --> 00:07:59,560 Speaker 1: coming to
137 00:07:59,520 --> 00:08:02,400 Speaker 2: an end? We're certainly getting to the pointy end of the trial. 138 00:08:02,560 --> 00:08:07,080 Speaker 2: We're expecting cross-examination to last for a couple 139 00:08:07,120 --> 00:08:11,120 Speaker 2: more hours on Tuesday, and then there will probably be 140 00:08:11,520 --> 00:08:16,080 Speaker 2: some redirect from Bankman-Fried's attorneys, and then we may 141 00:08:16,160 --> 00:08:20,160 Speaker 2: hear from some rebuttal witnesses. So we're thinking that the 142 00:08:20,200 --> 00:08:24,800 Speaker 2: evidence might be all done and dusted by Tuesday afternoon or 143 00:08:24,800 --> 00:08:25,520 Speaker 2: Wednesday morning. 144 00:08:25,800 --> 00:08:28,480 Speaker 1: And who are the rebuttal witnesses that the prosecution is 145 00:08:28,520 --> 00:08:29,160 Speaker 1: going to call? 146 00:08:29,600 --> 00:08:34,000 Speaker 2: We could hear from potentially two rebuttal witnesses. One is 147 00:08:34,000 --> 00:08:39,240 Speaker 2: an FBI agent, and another was an investor in FTX. 148 00:08:39,920 --> 00:08:40,719 Speaker 1: And what was 149 00:08:40,679 --> 00:08:43,080 Speaker 1: the courtroom like today? I mean, was it packed again? 150 00:08:43,640 --> 00:08:47,360 Speaker 2: The courtroom was absolutely packed. Reporters have been lining up 151 00:08:47,400 --> 00:08:51,280 Speaker 2: outside the courthouse since last night to try and get 152 00:08:51,600 --> 00:08:55,200 Speaker 2: one of the twenty-three press seats inside the courtroom. 153 00:08:55,200 --> 00:08:57,400 Speaker 2: If they couldn't get a seat there, they had to 154 00:08:57,440 --> 00:09:00,880 Speaker 2: go to a number of overflow rooms in other areas 155 00:09:00,880 --> 00:09:04,040 Speaker 2: of the courthouse. So there was a lot of competition 156 00:09:04,240 --> 00:09:07,160 Speaker 2: to get inside, to get a front-row seat to 157 00:09:07,240 --> 00:09:10,080 Speaker 2: his testimony.
Obviously, this is a major point in the trial, 158 00:09:10,120 --> 00:09:11,400 Speaker 2: and the 159 00:09:11,440 --> 00:09:13,480 Speaker 1: US Attorney was there for the Southern District? 160 00:09:14,520 --> 00:09:18,240 Speaker 2: Yes, the US Attorney Damian Williams was seated in the 161 00:09:18,240 --> 00:09:22,880 Speaker 2: front row with a few other senior officials from his office. 162 00:09:23,000 --> 00:09:27,359 Speaker 2: He watched mainly the cross-examination of Bankman-Fried this afternoon. 163 00:09:27,720 --> 00:09:31,560 Speaker 2: He's also popped in for other important parts in the trial, 164 00:09:32,160 --> 00:09:36,120 Speaker 2: the evidence of Caroline Ellison and also the opening statements. 165 00:09:36,679 --> 00:09:39,080 Speaker 1: So coming down to the wire, we'll check in with 166 00:09:39,120 --> 00:09:41,600 Speaker 1: you again tomorrow, Ava, to hear about the conclusion of 167 00:09:41,640 --> 00:09:45,800 Speaker 1: Sam Bankman-Fried's testimony. That's Bloomberg legal reporter Ava 168 00:09:45,920 --> 00:09:48,760 Speaker 1: Benny-Morrison. Coming up next on the Bloomberg Law Show, we'll 169 00:09:48,800 --> 00:09:52,920 Speaker 1: talk to former federal prosecutor Michael Weinstein about the tactics 170 00:09:52,960 --> 00:09:57,520 Speaker 1: and strategy at play during SBF's testimony. I'm June Grasso, 171 00:09:57,600 --> 00:10:00,280 Speaker 1: and you're listening to Bloomberg. It would seem like 172 00:10:00,320 --> 00:10:02,640 Speaker 1: this is the time for a judge to give some 173 00:10:02,880 --> 00:10:06,120 Speaker 1: leeway to the defense when the defendant's on the stand.
174 00:10:06,400 --> 00:10:09,120 Speaker 3: And I'm sure the judge did that. And I'm sure 175 00:10:09,160 --> 00:10:12,199 Speaker 3: the judge gave him great latitude, because the judge does 176 00:10:12,240 --> 00:10:14,960 Speaker 3: not want to be seen as tipping the scales one 177 00:10:15,000 --> 00:10:17,760 Speaker 3: way or the other, or have it become an appeal 178 00:10:17,800 --> 00:10:20,319 Speaker 3: issue if he's too heavy-handed in what he does 179 00:10:20,760 --> 00:10:23,520 Speaker 3: and the way he handles the witnesses and the evidence. 180 00:10:23,559 --> 00:10:27,920 Speaker 3: So I'm sure, you know, seeing the judge react 181 00:10:28,080 --> 00:10:30,360 Speaker 3: in that way, telling him to just answer yes or no, 182 00:10:31,200 --> 00:10:33,040 Speaker 3: was not out of the blue. There was a build- 183 00:10:33,120 --> 00:10:35,760 Speaker 3: up to that. There were questions that were asked. There 184 00:10:35,840 --> 00:10:39,000 Speaker 3: was probably a lot of language that the defendant used 185 00:10:39,000 --> 00:10:41,760 Speaker 3: which was not really clear and concise in order to 186 00:10:41,840 --> 00:10:44,880 Speaker 3: answer specifically the question. And so it got to the 187 00:10:44,880 --> 00:10:46,720 Speaker 3: point that the judge felt he had to rein 188 00:10:46,800 --> 00:10:48,880 Speaker 3: him in a little bit and tell him, please answer 189 00:10:48,960 --> 00:10:51,280 Speaker 3: yes or no. Obviously the judge has to also be 190 00:10:51,360 --> 00:10:54,360 Speaker 3: mindful that, you know, jurors look at the judge in 191 00:10:54,520 --> 00:10:57,760 Speaker 3: somewhat of a high light.
And so if a judge 192 00:10:57,800 --> 00:11:00,440 Speaker 3: is being heavy-handed with an attorney or heavy-handed 193 00:11:00,440 --> 00:11:03,880 Speaker 3: with the defendant, or calls out a defendant and acts 194 00:11:03,920 --> 00:11:06,800 Speaker 3: as though he doesn't even believe him, that may influence, 195 00:11:06,880 --> 00:11:09,640 Speaker 3: directly or indirectly, the jury. So the judge has to 196 00:11:09,679 --> 00:11:11,839 Speaker 3: be mindful of that and be careful what he says, 197 00:11:12,160 --> 00:11:15,320 Speaker 3: how he says it, so that he always is impartial. 198 00:11:15,800 --> 00:11:18,960 Speaker 1: So what's the defense's goal? To make him likable to 199 00:11:19,040 --> 00:11:22,080 Speaker 1: the jury, or, you know, to try to contradict some 200 00:11:22,200 --> 00:11:26,280 Speaker 1: of the testimony of the three main witnesses against him? 201 00:11:26,320 --> 00:11:28,000 Speaker 1: I mean, if you were the defense attorney, what would 202 00:11:28,040 --> 00:11:29,640 Speaker 1: you be going there to accomplish? 203 00:11:29,920 --> 00:11:33,240 Speaker 3: To try to provide a justification for the collapse of 204 00:11:33,280 --> 00:11:35,560 Speaker 3: the business, and that it did not come back to 205 00:11:36,000 --> 00:11:38,720 Speaker 3: the defendant, and that there are other people that he 206 00:11:38,960 --> 00:11:43,400 Speaker 3: entrusted to make decisions, right or wrong, which did not 207 00:11:43,800 --> 00:11:47,439 Speaker 3: end up positively for investors. I mean, look, the bottom 208 00:11:47,480 --> 00:11:50,559 Speaker 3: line here is that the defendant is trying to make 209 00:11:50,600 --> 00:11:54,160 Speaker 3: the evidence out as good as possible for him. It's like 210 00:11:54,200 --> 00:11:56,240 Speaker 3: putting lipstick on a pig. He can only 211 00:11:56,280 --> 00:11:59,199 Speaker 3: make it look that good, but in the end it's 212 00:11:59,200 --> 00:11:59,760 Speaker 3: still bacon.
213 00:12:00,720 --> 00:12:04,920 Speaker 1: The defense also spent time asking him about the massive 214 00:12:05,000 --> 00:12:09,120 Speaker 1: spending of FTX on marketing, the way he dressed, 215 00:12:09,200 --> 00:12:12,480 Speaker 1: and his hair, which Ellison said was to create an 216 00:12:12,520 --> 00:12:16,960 Speaker 1: image for himself. How important are those little things, what 217 00:12:16,960 --> 00:12:18,760 Speaker 1: I would call little things, to the jury? 218 00:12:19,360 --> 00:12:21,679 Speaker 3: So I think in the long run, it's not going 219 00:12:21,760 --> 00:12:24,959 Speaker 3: to be material to the jury, but it adds little 220 00:12:25,040 --> 00:12:28,640 Speaker 3: elements to the government's case, in that he was doing 221 00:12:28,679 --> 00:12:32,280 Speaker 3: it not for genuine purposes, not because that was his personality, 222 00:12:32,320 --> 00:12:35,280 Speaker 3: but he was doing it to create an aura of himself, 223 00:12:35,880 --> 00:12:40,199 Speaker 3: and they tried to show that he had some ulterior 224 00:12:40,400 --> 00:12:42,920 Speaker 3: motivation when he did these types of things, dressed in 225 00:12:42,960 --> 00:12:45,560 Speaker 3: a certain way, kept his hair in a certain way, 226 00:12:45,960 --> 00:12:48,840 Speaker 3: and it wasn't just, you know, that was him. Because look, 227 00:12:48,880 --> 00:12:52,320 Speaker 3: there are business people who are very eccentric and do 228 00:12:52,520 --> 00:12:55,960 Speaker 3: very well and that's just them, and they're genuine in 229 00:12:55,960 --> 00:12:58,720 Speaker 3: that regard.
But I think that can come back to 230 00:12:58,800 --> 00:13:01,680 Speaker 3: hurt people, and hurt a defendant, when they look like 231 00:13:01,720 --> 00:13:03,760 Speaker 3: they're being a little bit conniving and they're being a 232 00:13:03,800 --> 00:13:07,000 Speaker 3: little calculating in the way they act, the way they speak, 233 00:13:07,240 --> 00:13:09,520 Speaker 3: the way they dress. And I think that's what the 234 00:13:09,559 --> 00:13:12,320 Speaker 3: government's trying to get across, is that he had some 235 00:13:12,400 --> 00:13:15,520 Speaker 3: type of ulterior motive to act that way and dress 236 00:13:15,559 --> 00:13:18,800 Speaker 3: that way, which reflects, you know, him being a little 237 00:13:18,840 --> 00:13:21,319 Speaker 3: shady or a little sinister in the way he ran 238 00:13:21,400 --> 00:13:24,800 Speaker 3: the business. So I don't think it's material insofar as 239 00:13:24,840 --> 00:13:28,760 Speaker 3: the charges, but I think it reflects the government's push 240 00:13:29,160 --> 00:13:33,280 Speaker 3: to have him seen as not just eccentric, but, you know, 241 00:13:33,440 --> 00:13:36,720 Speaker 3: trying to push a narrative about himself in a certain 242 00:13:36,760 --> 00:13:39,560 Speaker 3: way as a mastermind of this crypto space. 243 00:13:39,760 --> 00:13:44,200 Speaker 1: And the defense also spent some time on his relationship 244 00:13:44,280 --> 00:13:47,800 Speaker 1: with Caroline Ellison. He said he wasn't able to give 245 00:13:47,920 --> 00:13:52,280 Speaker 1: her the time and attention she wanted in their personal relationship. 246 00:13:52,600 --> 00:13:55,840 Speaker 1: Why do you think there's time being spent on why 247 00:13:55,880 --> 00:13:56,560 Speaker 1: they broke up? 248 00:13:57,200 --> 00:14:02,800 Speaker 3: Every juror, every person, is thinking, is she testifying because 249 00:14:02,800 --> 00:14:05,480 Speaker 3: she was a jilted lover, although they may not express it.
250 00:14:06,080 --> 00:14:09,240 Speaker 3: So when someone testifies against someone who they had a 251 00:14:09,280 --> 00:14:13,679 Speaker 3: relationship with, and they're giving testimony which is harmful, which, 252 00:14:13,720 --> 00:14:17,199 Speaker 3: you know, really goes at the heart of them and criticizes them, 253 00:14:17,559 --> 00:14:20,600 Speaker 3: if you're a human being and you're watching this, you're going, wow, 254 00:14:20,640 --> 00:14:23,080 Speaker 3: they must have had a really bad breakup. I wonder 255 00:14:23,120 --> 00:14:26,120 Speaker 3: if he wronged her, and as a result of that, 256 00:14:26,520 --> 00:14:30,040 Speaker 3: whether she's now getting back at him by testifying. And 257 00:14:30,120 --> 00:14:31,880 Speaker 3: so that's why it came up. 258 00:14:32,280 --> 00:14:35,760 Speaker 1: I mean, how can the defense possibly attack all the 259 00:14:35,800 --> 00:14:36,960 Speaker 1: evidence against him? 260 00:14:37,400 --> 00:14:39,160 Speaker 3: I think what the defense is trying to do is 261 00:14:39,680 --> 00:14:42,960 Speaker 3: give the jury pause and have the jury get back 262 00:14:42,960 --> 00:14:46,440 Speaker 3: into the jury room and not immediately say he's dead 263 00:14:46,440 --> 00:14:48,760 Speaker 3: in the water, he's guilty, you know, the evidence was 264 00:14:48,800 --> 00:14:51,320 Speaker 3: so overwhelming.
I think what the defense wants the jury 265 00:14:51,320 --> 00:14:53,320 Speaker 3: to do is get back in that jury room and 266 00:14:53,440 --> 00:14:56,080 Speaker 3: be thoughtful in their deliberations, which I'm sure they'll try 267 00:14:56,120 --> 00:14:58,520 Speaker 3: to be, but also maybe give him the benefit of 268 00:14:58,520 --> 00:15:02,240 Speaker 3: the doubt and not just convict him, because, you know, 269 00:15:02,280 --> 00:15:05,240 Speaker 3: you had fifteen people testify against him, and you had, 270 00:15:05,280 --> 00:15:07,560 Speaker 3: you know, ten thousand pages of materials about what he 271 00:15:07,600 --> 00:15:10,800 Speaker 3: did wrong. Maybe the defense is trying to have the 272 00:15:10,920 --> 00:15:13,880 Speaker 3: jury give him the benefit of the doubt so that 273 00:15:13,920 --> 00:15:18,360 Speaker 3: they pause in their deliberations and look at him as 274 00:15:18,360 --> 00:15:21,760 Speaker 3: a human being and not just as this sinister, you know, 275 00:15:21,880 --> 00:15:25,320 Speaker 3: crypto mogul who was trying to one-up people and 276 00:15:25,680 --> 00:15:28,120 Speaker 3: just take money and abuse money and so on. 277 00:15:28,160 --> 00:15:31,000 Speaker 1: On cross, the defendant has given the prosecution a lot to 278 00:15:31,080 --> 00:15:34,680 Speaker 1: work with, not only his direct testimony, but the commentary 279 00:15:34,720 --> 00:15:39,360 Speaker 1: that he offered as FTX rose and crashed. So she 280 00:15:39,400 --> 00:15:40,440 Speaker 1: had a lot to work with. 281 00:15:42,360 --> 00:15:45,480 Speaker 3: Yes, a tremendous amount. And that's from his own mouth. 282 00:15:45,960 --> 00:15:53,880 Speaker 3: And that's both internal messages, external messages, text messages, interviews, posts, 283 00:15:54,000 --> 00:15:57,320 Speaker 3: congressional testimony.
I mean, I think it's actually hard for 284 00:15:57,400 --> 00:16:00,400 Speaker 3: the prosecutor to decide what to use, because there is 285 00:16:00,440 --> 00:16:03,200 Speaker 3: so much. And I think that's the problem he's going 286 00:16:03,280 --> 00:16:06,720 Speaker 3: to run into, is that everything he testified to on 287 00:16:06,800 --> 00:16:11,880 Speaker 3: direct can be contrasted with things he said previously which 288 00:16:11,880 --> 00:16:15,920 Speaker 3: are different from that. And I think that's when the 289 00:16:16,040 --> 00:16:19,479 Speaker 3: jury is really going to look at him and say, okay, 290 00:16:19,720 --> 00:16:22,400 Speaker 3: maybe he was okay on direct, but when he wasn't 291 00:16:22,520 --> 00:16:25,400 Speaker 3: under the hot lights and he was just free-flowing 292 00:16:25,440 --> 00:16:28,520 Speaker 3: in his opinion about his business and about the money flow, 293 00:16:28,960 --> 00:16:31,920 Speaker 3: you know, during congressional testimony, or during text messages, or 294 00:16:32,000 --> 00:16:35,120 Speaker 3: during interviews six, eight, ten months ago, look what he said. 295 00:16:35,400 --> 00:16:38,360 Speaker 3: And so I think the jury's really gonna have a 296 00:16:38,400 --> 00:16:42,160 Speaker 3: problem with the testimony on direct and things he said 297 00:16:42,200 --> 00:16:45,120 Speaker 3: six, eight, ten months ago, which is what the government 298 00:16:45,160 --> 00:16:46,800 Speaker 3: is alleging is where the truth lies. 299 00:16:47,440 --> 00:16:50,360 Speaker 1: I mean, it seems like, though, he almost had no 300 00:16:50,520 --> 00:16:53,440 Speaker 1: choice, because there was so much evidence against him. 301 00:16:54,000 --> 00:16:56,600 Speaker 3: Yes, it appeared so. I agree with you.
I think his 302 00:16:57,760 --> 00:17:02,520 Speaker 3: only chance at this point, after three insiders testified against him, 303 00:17:03,000 --> 00:17:06,159 Speaker 3: after all the testimony and direct evidence came in about 304 00:17:06,160 --> 00:17:11,480 Speaker 3: his actions, was for him to try and convince 305 00:17:11,520 --> 00:17:14,160 Speaker 3: the jury, out of his own mouth, that what they're 306 00:17:14,200 --> 00:17:17,800 Speaker 3: suggesting he did was not illegal or criminal. He made 307 00:17:17,800 --> 00:17:21,159 Speaker 3: bad decisions, but it wasn't unlawful. So I agree with you. 308 00:17:21,200 --> 00:17:23,200 Speaker 3: I think that, you know, this is one of those 309 00:17:23,240 --> 00:17:27,359 Speaker 3: situations where, if he didn't testify, it's almost a slam 310 00:17:27,440 --> 00:17:30,119 Speaker 3: dunk that he would have been convicted. But by putting 311 00:17:30,200 --> 00:17:33,760 Speaker 3: him up to testify, maybe there's a shot that you'll 312 00:17:33,760 --> 00:17:36,320 Speaker 3: get a juror or two who believe his story. 313 00:17:36,800 --> 00:17:39,480 Speaker 1: How harmful is it to his case that the judge, 314 00:17:39,560 --> 00:17:44,080 Speaker 1: after listening to him testify for hours outside the presence 315 00:17:44,119 --> 00:17:47,879 Speaker 1: of the jury on Thursday, decided that he could not 316 00:17:48,240 --> 00:17:50,760 Speaker 1: use an advice-of-counsel defense?
317 00:17:51,840 --> 00:17:54,840 Speaker 3: I think that was a real problem that could potentially 318 00:17:54,840 --> 00:17:57,439 Speaker 3: be another nail in the coffin for him, because that 319 00:17:57,600 --> 00:18:01,600 Speaker 3: was one very significant thing that they were trying to 320 00:18:02,240 --> 00:18:06,120 Speaker 3: suggest and use, you know, on his behalf. And the judge, 321 00:18:06,160 --> 00:18:08,800 Speaker 3: you know, made a ruling, which is what judges 322 00:18:08,840 --> 00:18:11,520 Speaker 3: have to do. But the consequence of that is that, 323 00:18:11,680 --> 00:18:13,960 Speaker 3: you know, the defense is fighting with one hand tied 324 00:18:14,000 --> 00:18:17,399 Speaker 3: behind their back, because they can't point to lawyers who 325 00:18:17,440 --> 00:18:19,800 Speaker 3: were advising the company as being the ones that made 326 00:18:19,840 --> 00:18:20,320 Speaker 3: the mistake. 327 00:18:20,680 --> 00:18:23,880 Speaker 1: Do you think that's a reversible error on appeal? 328 00:18:24,240 --> 00:18:26,439 Speaker 3: It's certainly going to be something that goes up on appeal. 329 00:18:26,840 --> 00:18:29,480 Speaker 3: But the judge, you know, had a solid grounding to 330 00:18:29,520 --> 00:18:33,119 Speaker 3: make that decision. There was even, obviously, testimony as to 331 00:18:33,200 --> 00:18:36,040 Speaker 3: that outside the presence of the jury. So the judge 332 00:18:36,080 --> 00:18:38,879 Speaker 3: took every step possible to avoid this being an 333 00:18:38,960 --> 00:18:41,119 Speaker 3: appellate issue. Although of course, if there's a conviction, it 334 00:18:41,160 --> 00:18:43,640 Speaker 3: will probably go up on appeal, but I think that's 335 00:18:43,800 --> 00:18:45,920 Speaker 3: likely not to be a material issue. 336 00:18:46,480 --> 00:18:49,800 Speaker 1: It's a real uphill battle for Bankman-Fried. 337 00:18:50,200 --> 00:18:51,719 Speaker 3: I don't think it's going to be enough,
June. I 338 00:18:51,760 --> 00:18:54,440 Speaker 3: think that, you know, the defense did the best they could, 339 00:18:54,760 --> 00:18:58,480 Speaker 3: but the evidence was so overwhelming, and you had 340 00:18:58,560 --> 00:19:01,480 Speaker 3: really firsthand testimony from the people in the room, 341 00:19:02,119 --> 00:19:04,560 Speaker 3: and even though he might have come off on direct 342 00:19:05,040 --> 00:19:08,679 Speaker 3: as best as possible, the cross is going to undermine 343 00:19:08,680 --> 00:19:11,760 Speaker 3: that and mitigate that, and the cross is going to 344 00:19:11,800 --> 00:19:14,600 Speaker 3: be able to show how, you know, he was doing 345 00:19:14,640 --> 00:19:17,760 Speaker 3: all of this with full knowledge and had directed people 346 00:19:17,800 --> 00:19:21,560 Speaker 3: to do this and was in control, and you know, 347 00:19:21,600 --> 00:19:23,479 Speaker 3: there's no way for him to get around that. And 348 00:19:24,080 --> 00:19:29,119 Speaker 3: he's laid out so many statements previously that 349 00:19:29,160 --> 00:19:32,399 Speaker 3: there's really nothing now he can say which can't be 350 00:19:32,520 --> 00:19:35,919 Speaker 3: contrasted with things that he's previously said. And that's a 351 00:19:36,000 --> 00:19:37,400 Speaker 3: real problem for the defense. 352 00:19:37,960 --> 00:19:41,119 Speaker 1: And he's the last defense witness, so the case will 353 00:19:41,160 --> 00:19:44,879 Speaker 1: probably go to the jury this week. Thanks for your insights, Michael. 354 00:19:45,320 --> 00:19:50,120 Speaker 1: That's former federal prosecutor Michael Weinstein of Cole Schotz. Coming 355 00:19:50,240 --> 00:19:54,280 Speaker 1: up next, Supreme Court arguments over the phrase Trump Too Small. 356 00:19:54,520 --> 00:19:57,600 Speaker 1: This is Bloomberg.
On Wednesday, the Supreme Court will hear 357 00:19:57,720 --> 00:20:01,040 Speaker 1: arguments in a free speech showdown over a man trying 358 00:20:01,040 --> 00:20:05,760 Speaker 1: to get federal trademark protection for the phrase Trump Too Small. 359 00:20:06,280 --> 00:20:09,119 Speaker 1: Attorney Steve Elster says he wants to use the phrase 360 00:20:09,160 --> 00:20:12,320 Speaker 1: on T shirts, but he was refused a trademark by 361 00:20:12,359 --> 00:20:15,920 Speaker 1: the US Patent and Trademark Office because of a provision 362 00:20:15,920 --> 00:20:20,040 Speaker 1: in federal law barring registration of marks that identify a 363 00:20:20,080 --> 00:20:24,640 Speaker 1: living person without that person's consent. However, the US Court 364 00:20:24,640 --> 00:20:28,639 Speaker 1: of Appeals for the Federal Circuit said that provision violates 365 00:20:28,680 --> 00:20:32,280 Speaker 1: the First Amendment when the trademark includes criticism of a 366 00:20:32,320 --> 00:20:36,560 Speaker 1: government official or public figure. Now the justices will decide. 367 00:20:37,119 --> 00:20:39,639 Speaker 1: Joining me is Fara Sunderji, a partner at 368 00:20:39,640 --> 00:20:43,399 Speaker 1: Dorsey and Whitney. In twenty eighteen, the Patent and Trademark 369 00:20:43,560 --> 00:20:49,359 Speaker 1: Office rejected the application for Trump Too Small. Tell us why? 370 00:20:50,160 --> 00:20:50,200 Speaker 4: So, 371 00:20:50,600 --> 00:20:55,520 Speaker 4: the rejection was issued on two grounds. Actually, the first 372 00:20:55,680 --> 00:20:58,439 Speaker 4: is the one that the Supreme Court is going to 373 00:20:58,480 --> 00:21:03,560 Speaker 4: take up. So initially the Trademark Office rejected the trademark 374 00:21:03,560 --> 00:21:07,919 Speaker 4: application based on the lack of consent of a living individual, and 375 00:21:08,000 --> 00:21:12,640 Speaker 4: that living individual is obviously Trump.
There's actually another rejection 376 00:21:12,800 --> 00:21:16,240 Speaker 4: that is not being considered on the appeal, which will 377 00:21:16,240 --> 00:21:20,199 Speaker 4: still stand if the case gets remanded, which is false association. 378 00:21:20,720 --> 00:21:23,800 Speaker 1: So explain what that means. 379 00:21:23,840 --> 00:21:29,840 Speaker 4: So when you apply for a trademark and you include a person's name in it, 380 00:21:30,200 --> 00:21:33,840 Speaker 4: the Trademark Office will say, hey, we need you to 381 00:21:34,240 --> 00:21:37,440 Speaker 4: get the consent of that person, so we're not violating 382 00:21:37,840 --> 00:21:41,119 Speaker 4: their rights of privacy and their rights of publicity. And 383 00:21:41,200 --> 00:21:44,280 Speaker 4: so when you're applying for it in the name of, 384 00:21:44,680 --> 00:21:49,520 Speaker 4: let's say, in the case of Bloomberg, Bloomberg Radio, Bloomberg News, 385 00:21:49,840 --> 00:21:53,199 Speaker 4: you'd go and get the consent of mister Bloomberg, and 386 00:21:53,240 --> 00:21:57,640 Speaker 4: you get that consent because Bloomberg is associated with the company, 387 00:21:57,680 --> 00:22:00,119 Speaker 4: and of course he knows that it's being done. 388 00:22:00,440 --> 00:22:03,480 Speaker 1: Is this based on a trademark law or is it 389 00:22:03,560 --> 00:22:05,000 Speaker 1: based on a court interpretation? 390 00:22:05,640 --> 00:22:08,080 Speaker 4: It's based on a statute. It is based on a 391 00:22:08,119 --> 00:22:12,400 Speaker 4: section of the Lanham Act. So it's a congressionally enacted statute. 392 00:22:13,200 --> 00:22:17,120 Speaker 1: Okay. So now the Federal Circuit, which handles these appellate 393 00:22:17,200 --> 00:22:21,480 Speaker 1: cases on trademarks, reversed. Why did the Federal Circuit reverse? 394 00:22:22,200 --> 00:22:26,640 Speaker 4: So the Federal Circuit reversed on First Amendment grounds.
They 395 00:22:26,840 --> 00:22:32,119 Speaker 4: actually approached the case from the following angle. 396 00:22:32,160 --> 00:22:35,680 Speaker 4: They said the question here is whether the government has 397 00:22:35,680 --> 00:22:40,480 Speaker 4: an interest in limiting speech on privacy or publicity grounds 398 00:22:40,560 --> 00:22:45,679 Speaker 4: if that speech involves criticism of government officials, speech that 399 00:22:45,800 --> 00:22:50,359 Speaker 4: is otherwise at the heart of the First Amendment. And interestingly, 400 00:22:50,440 --> 00:22:54,280 Speaker 4: there's actually an intermediate step that we didn't discuss initially. 401 00:22:54,320 --> 00:22:59,359 Speaker 4: What happens is that a trademark examiner refuses an application, 402 00:22:59,840 --> 00:23:03,119 Speaker 4: and then there's an initial refusal, there's a final refusal, 403 00:23:03,240 --> 00:23:06,240 Speaker 4: and then the Trademark Trial and Appeal Board is the 404 00:23:06,240 --> 00:23:11,320 Speaker 4: one who takes it up first. And the trademark examiner 405 00:23:11,560 --> 00:23:14,800 Speaker 4: and the Trademark Trial and Appeal Board, they actually don't 406 00:23:14,840 --> 00:23:19,880 Speaker 4: have the ability to say that something is unconstitutional. The 407 00:23:19,920 --> 00:23:23,560 Speaker 4: applicant did make all of the First Amendment arguments to 408 00:23:23,960 --> 00:23:27,040 Speaker 4: the Trademark Office, both to the examiner and to the 409 00:23:27,080 --> 00:23:31,399 Speaker 4: Trademark Trial and Appeal Board, but being an administrative agency, 410 00:23:31,520 --> 00:23:35,640 Speaker 4: they don't have the ability to say that something is unconstitutional. 411 00:23:35,680 --> 00:23:39,320 Speaker 4: That's obviously left to the courts. And so that's exactly 412 00:23:39,359 --> 00:23:41,359 Speaker 4: what the Federal Circuit did. 413 00:23:41,359 --> 00:23:44,000 Speaker 1: This appeal went to the Supreme Court.
Which side is the 414 00:23:44,000 --> 00:23:45,159 Speaker 1: Biden administration on? 415 00:23:45,680 --> 00:23:49,640 Speaker 4: So the government is the one who actually filed the appeal, 416 00:23:51,600 --> 00:23:57,160 Speaker 4: and the government believes that the refusal 417 00:23:57,320 --> 00:23:58,080 Speaker 4: should stand. 418 00:23:58,880 --> 00:24:02,600 Speaker 1: Let's go into a little more detail about what Elster 419 00:24:02,760 --> 00:24:04,919 Speaker 1: is arguing to the Supreme Court. 420 00:24:05,920 --> 00:24:11,480 Speaker 4: So Elster is arguing that his First Amendment rights are 421 00:24:11,560 --> 00:24:19,080 Speaker 4: being abridged by this refusal of the trademark application. He's 422 00:24:19,280 --> 00:24:23,760 Speaker 4: making these arguments along with 423 00:24:23,760 --> 00:24:27,280 Speaker 4: a couple of third parties who filed amicus briefs 424 00:24:27,440 --> 00:24:31,400 Speaker 4: here too. And in general, what they're arguing is that 425 00:24:32,080 --> 00:24:36,920 Speaker 4: a rejection of a trademark application ends up chilling speech. 426 00:24:37,680 --> 00:24:40,840 Speaker 4: When the Trademark Office says that you cannot have a 427 00:24:40,880 --> 00:24:45,399 Speaker 4: trademark registration, that disfavors speech, and so it chills it, 428 00:24:45,440 --> 00:24:51,360 Speaker 4: and so First Amendment grounds are important here. And Elster 429 00:24:51,680 --> 00:24:57,639 Speaker 4: actually made his case a little bit narrower: 430 00:24:57,720 --> 00:25:01,359 Speaker 4: he is making an argument that this doesn't apply to 431 00:25:01,920 --> 00:25:07,280 Speaker 4: all cases everywhere that involve any trademark that includes any 432 00:25:07,720 --> 00:25:12,960 Speaker 4: person's name. He's really narrowing it and thinks it's almost 433 00:25:13,080 --> 00:25:17,280 Speaker 4: really just a one off case.
When he submitted his 434 00:25:17,440 --> 00:25:20,560 Speaker 4: materials to the Supreme Court, he actually said the question 435 00:25:21,000 --> 00:25:25,600 Speaker 4: presented is whether the Trademark Office violated the First Amendment 436 00:25:25,680 --> 00:25:29,960 Speaker 4: when it applied this refusal of registration to a political 437 00:25:30,040 --> 00:25:34,320 Speaker 4: slogan on a T shirt that criticized former President Trump, 438 00:25:34,359 --> 00:25:38,359 Speaker 4: without his consent. So you can see that he's not 439 00:25:38,440 --> 00:25:42,879 Speaker 4: even arguing that it's for political commentary in general; 440 00:25:43,000 --> 00:25:46,359 Speaker 4: he's really just arguing political slogan on a T shirt 441 00:25:46,480 --> 00:25:51,600 Speaker 4: that criticizes President Trump without his consent. The government is 442 00:25:51,640 --> 00:25:55,720 Speaker 4: taking a much broader view of it, and they're talking 443 00:25:56,720 --> 00:25:59,359 Speaker 4: about it in a way that makes more sense and 444 00:25:59,440 --> 00:26:02,639 Speaker 4: has more impact, and I think the Supreme Court will 445 00:26:02,680 --> 00:26:05,840 Speaker 4: address it in that way, where it has farther reaching 446 00:26:05,920 --> 00:26:09,840 Speaker 4: concerns for various parties, not just people who want to 447 00:26:09,880 --> 00:26:12,960 Speaker 4: put a trademark slogan on a T shirt that criticizes 448 00:26:13,640 --> 00:26:17,240 Speaker 4: President Trump or any other political figure. 449 00:26:17,560 --> 00:26:21,119 Speaker 1: So go into a little more depth about the government's 450 00:26:21,160 --> 00:26:22,240 Speaker 1: broader argument.
451 00:26:23,240 --> 00:26:27,480 Speaker 4: So for the government's broader argument, they want the Court to 452 00:26:27,560 --> 00:26:31,680 Speaker 4: look at this section, it's called Section 1052(c), 453 00:26:31,840 --> 00:26:35,520 Speaker 4: and to see whether it violates the free speech 454 00:26:35,640 --> 00:26:39,639 Speaker 4: clause of the First Amendment when a trademark contains a 455 00:26:39,640 --> 00:26:42,960 Speaker 4: criticism of a government figure or a public figure. And 456 00:26:43,480 --> 00:26:48,439 Speaker 4: the interesting thing, I think, about the two sides of 457 00:26:48,480 --> 00:26:52,080 Speaker 4: this case, the government and the applicant, is that they're starting from 458 00:26:52,200 --> 00:26:57,159 Speaker 4: different premises. So, as I said before, the applicant's 459 00:26:57,320 --> 00:27:02,560 Speaker 4: position is that denying a trademark registration practically suppresses speech, 460 00:27:03,040 --> 00:27:06,359 Speaker 4: because when the Trademark Office says you can't have a registration, 461 00:27:07,400 --> 00:27:11,360 Speaker 4: that's chilling the speech that's in the registration. The government 462 00:27:12,000 --> 00:27:15,639 Speaker 4: is coming at it from a completely opposite angle, where 463 00:27:15,800 --> 00:27:20,840 Speaker 4: they're arguing that denial of a trademark registration doesn't restrict speech. 464 00:27:21,560 --> 00:27:24,040 Speaker 4: A registration is a government benefit, and denying it is not a 465 00:27:24,080 --> 00:27:30,040 Speaker 4: restriction of speech. Whether you have a trademark registration or 466 00:27:30,080 --> 00:27:33,359 Speaker 4: you get a trademark registration denied, it doesn't mean you 467 00:27:33,400 --> 00:27:35,800 Speaker 4: can't put those words on a T shirt. It doesn't 468 00:27:35,840 --> 00:27:37,919 Speaker 4: mean you can't use those words in a slogan or 469 00:27:37,960 --> 00:27:40,359 Speaker 4: say them.
And so they're coming at it from two 470 00:27:40,440 --> 00:27:44,920 Speaker 4: different angles. And I think that is the most important 471 00:27:45,160 --> 00:27:48,560 Speaker 4: and interesting thing that the Court hopefully will end 472 00:27:48,640 --> 00:27:53,520 Speaker 4: up deciding: whether the denial of a trademark registration is 473 00:27:53,560 --> 00:27:55,199 Speaker 4: a restriction of speech or not. 474 00:27:56,160 --> 00:28:01,280 Speaker 1: So the Supreme Court in recent years has struck down 475 00:28:01,480 --> 00:28:04,840 Speaker 1: two trademark laws based on free speech concerns. Tell us 476 00:28:04,840 --> 00:28:05,840 Speaker 1: about those cases. 477 00:28:07,080 --> 00:28:11,800 Speaker 4: So the other two cases are from recent history, 478 00:28:12,400 --> 00:28:16,919 Speaker 4: twenty seventeen and twenty nineteen. In twenty seventeen 479 00:28:17,400 --> 00:28:22,000 Speaker 4: the Supreme Court struck down the disparaging part of the 480 00:28:22,040 --> 00:28:25,080 Speaker 4: refusal in the Lanham Act, and in twenty nineteen the 481 00:28:25,119 --> 00:28:30,240 Speaker 4: follow on case dealt with immoral and scandalous marks. Now, 482 00:28:30,560 --> 00:28:33,400 Speaker 4: both of these parts of the statute that got 483 00:28:33,440 --> 00:28:38,520 Speaker 4: struck down as unconstitutional are different, the government argues, than 484 00:28:38,880 --> 00:28:44,360 Speaker 4: the current statutory provision that we're dealing with, because disparaging, immoral, 485 00:28:44,400 --> 00:28:51,040 Speaker 4: and scandalous were all situations where the Trademark Office needed 486 00:28:51,200 --> 00:28:57,600 Speaker 4: to decide what their viewpoint was on these specific marks, 487 00:28:57,600 --> 00:29:00,320 Speaker 4: and so whether something is disparaging or not was a 488 00:29:00,400 --> 00:29:05,600 Speaker 4: judgment call that had to be made by the Trademark Office.
Here, 489 00:29:06,040 --> 00:29:09,760 Speaker 4: the government is arguing that the statute itself is viewpoint 490 00:29:09,840 --> 00:29:13,280 Speaker 4: neutral because it's just a simple consent requirement. You can say 491 00:29:13,800 --> 00:29:16,600 Speaker 4: Trump Too Small, or you can say Trump is great, 492 00:29:16,720 --> 00:29:20,880 Speaker 4: but you're still going to need Trump's consent. And in that way, 493 00:29:20,960 --> 00:29:25,400 Speaker 4: the government is arguing that it's viewpoint neutral. Now on 494 00:29:25,440 --> 00:29:29,000 Speaker 4: the other side of it, the applicant is arguing that 495 00:29:29,160 --> 00:29:32,880 Speaker 4: it's not viewpoint neutral. It's speaker based. And some of 496 00:29:32,920 --> 00:29:36,680 Speaker 4: the amicus briefs filed actually talked about this, and there 497 00:29:36,720 --> 00:29:40,200 Speaker 4: was one in particular that was really interesting that was 498 00:29:40,760 --> 00:29:44,080 Speaker 4: filed by the Foundation for Individual Rights and Expression and 499 00:29:44,120 --> 00:29:48,680 Speaker 4: the Manhattan Institute, and they characterized this clause as 500 00:29:48,720 --> 00:29:54,040 Speaker 4: a happy talk clause. And the reason they characterize it 501 00:29:54,080 --> 00:29:58,160 Speaker 4: that way is that, of course, any trademark application that 502 00:29:58,320 --> 00:30:01,920 Speaker 4: includes mention of a person's name, especially a political 503 00:30:02,000 --> 00:30:05,480 Speaker 4: figure, in a positive way is more likely to get a sign 504 00:30:05,520 --> 00:30:09,040 Speaker 4: off. If it's negative, the chances of you 505 00:30:09,080 --> 00:30:11,640 Speaker 4: getting a sign off from that person are really low.
506 00:30:12,000 --> 00:30:15,760 Speaker 4: And so it is not viewpoint neutral, because it favors 507 00:30:16,280 --> 00:30:21,240 Speaker 4: the positive, happy talk speech as opposed to anything that 508 00:30:21,320 --> 00:30:24,680 Speaker 4: criticizes the person who's referenced in the trademark application. 509 00:30:25,080 --> 00:30:28,560 Speaker 1: So how do you think the Supreme Court will rule? 510 00:30:29,240 --> 00:30:31,800 Speaker 4: It's hard to say how they're going to rule. There are 511 00:30:31,920 --> 00:30:36,680 Speaker 4: actually really good arguments on both sides. One thing that 512 00:30:36,760 --> 00:30:39,400 Speaker 4: I haven't talked about with you that I did want 513 00:30:39,400 --> 00:30:42,720 Speaker 4: to mention is that the International Trademark Association filed a 514 00:30:42,800 --> 00:30:49,600 Speaker 4: great brief in this case, and they're actually 515 00:30:49,800 --> 00:30:52,840 Speaker 4: in support of the government, and so in support 516 00:30:52,880 --> 00:30:56,240 Speaker 4: of the refusal. They made this really good argument that 517 00:30:56,760 --> 00:31:01,239 Speaker 4: the refusal of a trademark application is not a 518 00:31:01,280 --> 00:31:05,840 Speaker 4: denial of speech. It doesn't actually limit speech. It does 519 00:31:05,960 --> 00:31:10,840 Speaker 4: quite the opposite. They argue that refusals under this 520 00:31:11,040 --> 00:31:15,000 Speaker 4: Section 2(c), as it's called, actually permit more speech, 521 00:31:15,200 --> 00:31:21,040 Speaker 4: not less. And so their argument goes, when the USPTO 522 00:31:21,240 --> 00:31:24,840 Speaker 4: grants somebody a trademark registration, so say they granted Elster 523 00:31:24,960 --> 00:31:28,480 Speaker 4: a trademark registration for Trump Too Small.
Mister Elster then 524 00:31:28,560 --> 00:31:33,800 Speaker 4: has the ability to stop other people from using 525 00:31:33,880 --> 00:31:39,800 Speaker 4: that phrase on T shirts and possibly in other ways, 526 00:31:40,600 --> 00:31:45,000 Speaker 4: and so that's actually limiting speech. And so it's an 527 00:31:45,680 --> 00:31:50,520 Speaker 4: interesting argument. I think, in terms of how 528 00:31:50,600 --> 00:31:52,840 Speaker 4: the Supreme Court is going to come out, like I said, 529 00:31:52,880 --> 00:31:56,320 Speaker 4: I think there are really good arguments on both sides. 530 00:31:56,400 --> 00:32:01,320 Speaker 4: It's hard to tell, especially ahead of the arguments, really 531 00:32:01,320 --> 00:32:04,000 Speaker 4: what's going to happen. I think that a key thing 532 00:32:04,120 --> 00:32:09,560 Speaker 4: here is that hopefully the Supreme Court will realize that 533 00:32:11,080 --> 00:32:14,000 Speaker 4: parsing this in a small way is going to be 534 00:32:14,080 --> 00:32:17,440 Speaker 4: difficult to enforce. And what I mean by that is 535 00:32:18,520 --> 00:32:24,400 Speaker 4: it would be very difficult for the Trademark Office to 536 00:32:24,680 --> 00:32:29,920 Speaker 4: be the arbiter of: does this particular trademark involve a 537 00:32:29,960 --> 00:32:35,000 Speaker 4: political criticism or a parody or political speech that we 538 00:32:35,080 --> 00:32:39,440 Speaker 4: want to protect with free speech, whereas some other version 539 00:32:39,480 --> 00:32:42,640 Speaker 4: of a trademark doesn't fall into that category and can 540 00:32:43,320 --> 00:32:47,040 Speaker 4: continue to be refused under this Section 2(c) because, 541 00:32:47,080 --> 00:32:51,040 Speaker 4: for example, it's not about a political figure.
I think 542 00:32:51,080 --> 00:32:54,120 Speaker 4: that's going to be a dangerous road to go down, 543 00:32:54,200 --> 00:32:57,239 Speaker 4: and I hope that the Supreme Court doesn't end up 544 00:32:57,320 --> 00:33:00,280 Speaker 4: giving that ability to the Trademark Office, because it's going 545 00:33:00,320 --> 00:33:03,840 Speaker 4: to be very hard for them to make those determinations 546 00:33:03,880 --> 00:33:05,440 Speaker 4: and for them to enforce it. 547 00:33:05,880 --> 00:33:08,760 Speaker 1: We'll learn more during the oral arguments on Wednesday. Thanks 548 00:33:08,800 --> 00:33:12,360 Speaker 1: so much, Fara. That's Fara Sunderji of Dorsey and Whitney. 549 00:33:12,800 --> 00:33:15,120 Speaker 1: And that's it for this edition of The Bloomberg Law Show. 550 00:33:15,480 --> 00:33:17,800 Speaker 1: Remember you can always get the latest legal news on 551 00:33:17,880 --> 00:33:22,160 Speaker 1: our Bloomberg Law Podcast. You can find them on Apple Podcasts, Spotify, 552 00:33:22,320 --> 00:33:27,360 Speaker 1: and at www dot Bloomberg dot com slash podcast slash Law, 553 00:33:27,760 --> 00:33:30,360 Speaker 1: and remember to tune into The Bloomberg Law Show every 554 00:33:30,400 --> 00:33:34,320 Speaker 1: weeknight at ten pm Wall Street Time. I'm June Grasso 555 00:33:34,440 --> 00:33:36,040 Speaker 1: and you're listening to Bloomberg