1 00:00:03,200 --> 00:00:08,000 Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio. 2 00:00:09,240 --> 00:00:13,320 Speaker 2: The Federal Trade Commission and seventeen states are suing Amazon 3 00:00:13,680 --> 00:00:17,840 Speaker 2: over allegations the e-commerce giant abuses its position in 4 00:00:17,880 --> 00:00:21,800 Speaker 2: the marketplace to inflate prices on and off its platform, 5 00:00:22,120 --> 00:00:26,279 Speaker 2: overcharge sellers, and stifle competition. The lawsuit is one of 6 00:00:26,320 --> 00:00:30,080 Speaker 2: the most significant legal challenges brought against Amazon in its 7 00:00:30,160 --> 00:00:35,000 Speaker 2: nearly thirty-year history, accusing it of monopolizing online marketplace 8 00:00:35,080 --> 00:00:40,000 Speaker 2: services by degrading quality for shoppers and overcharging sellers. The 9 00:00:40,080 --> 00:00:44,240 Speaker 2: case also represents a career-defining moment for FTC Chair 10 00:00:44,360 --> 00:00:47,720 Speaker 2: Lina Khan, who's long had Amazon in her sights. 11 00:00:48,720 --> 00:00:52,680 Speaker 3: So this case is entirely pro-business. It is tens 12 00:00:52,680 --> 00:00:55,840 Speaker 3: of thousands of businesses that are dependent on Amazon to 13 00:00:56,040 --> 00:00:59,720 Speaker 3: reach shoppers that increasingly are paying one out of every 14 00:00:59,760 --> 00:01:02,360 Speaker 3: two dollars, as well as being subjected to all 15 00:01:02,400 --> 00:01:07,480 Speaker 3: sorts of arbitrary tactics. So we believe that this lawsuit, 16 00:01:07,520 --> 00:01:11,440 Speaker 3: where successful, will actually entirely restore the promise of 17 00:01:11,640 --> 00:01:15,360 Speaker 3: free competition. Our free enterprise system is one where 18 00:01:15,360 --> 00:01:18,000 Speaker 3: companies should be competing on the merits and not be 19 00:01:18,080 --> 00:01:21,120 Speaker 3: able to protect their monopoly power through illegal tactics. 
20 00:01:21,440 --> 00:01:25,319 Speaker 2: Joining me is Jennifer Rie, Bloomberg Intelligence senior litigation analyst. 21 00:01:25,720 --> 00:01:27,960 Speaker 2: Jen, this lawsuit comes as no surprise. 22 00:01:28,400 --> 00:01:29,200 Speaker 1: Oh not at all. 23 00:01:29,240 --> 00:01:31,680 Speaker 4: I mean, we've all been expecting this for years. The 24 00:01:31,720 --> 00:01:35,640 Speaker 4: investigation of Amazon actually started during Trump's FTC and it 25 00:01:35,720 --> 00:01:39,120 Speaker 4: was already ongoing when the chair, the current chair, Lina Khan, 26 00:01:39,200 --> 00:01:41,800 Speaker 4: took her position, which was back in twenty twenty one. 27 00:01:42,080 --> 00:01:43,920 Speaker 4: And when she did, I mean, it was sort of 28 00:01:43,959 --> 00:01:46,760 Speaker 4: widely thought that she was hired partially because of her 29 00:01:47,480 --> 00:01:51,680 Speaker 4: very overt, outspoken antagonism toward Amazon. She had written a 30 00:01:51,720 --> 00:01:54,800 Speaker 4: long article in twenty seventeen while in law school, or 31 00:01:54,800 --> 00:01:57,080 Speaker 4: at least for the Yale Law Journal, basically saying that 32 00:01:57,120 --> 00:01:59,960 Speaker 4: she thought that Amazon behaved as an anti-competitive monopoly. 33 00:02:00,280 --> 00:02:02,200 Speaker 1: So this is no surprise at all. 34 00:02:02,480 --> 00:02:05,280 Speaker 2: So tell us what the FTC and the seventeen states 35 00:02:05,320 --> 00:02:06,480 Speaker 2: are accusing Amazon of. 36 00:02:06,800 --> 00:02:09,560 Speaker 4: You know, they're really focusing on its marketplace, right, because 37 00:02:09,560 --> 00:02:12,440 Speaker 4: Amazon has other businesses outside of what we as consumers 38 00:02:12,520 --> 00:02:15,800 Speaker 4: know of Amazon. So they're focusing on the way Amazon 39 00:02:15,840 --> 00:02:19,000 Speaker 4: treats sellers. 
And they're essentially saying that some of the 40 00:02:19,120 --> 00:02:22,120 Speaker 4: policies and some of the conditions they impose on these sellers, 41 00:02:22,240 --> 00:02:25,480 Speaker 4: one, cause prices outside of Amazon to go up. In 42 00:02:25,520 --> 00:02:28,480 Speaker 4: other words, sellers, if they discount a product where they're 43 00:02:28,520 --> 00:02:31,440 Speaker 4: selling somewhere outside of Amazon more so than they have 44 00:02:31,560 --> 00:02:34,800 Speaker 4: on Amazon, if they haven't offered Amazon the lowest price that they offer, 45 00:02:35,080 --> 00:02:38,000 Speaker 4: that seller then gets punished. That's one thing. And 46 00:02:38,080 --> 00:02:41,800 Speaker 4: they also allege that Amazon punishes sellers by sort of 47 00:02:41,800 --> 00:02:44,320 Speaker 4: pushing them down in search results or taking them out 48 00:02:44,320 --> 00:02:47,239 Speaker 4: of the Buy Box if they don't use Amazon's fulfillment services. 49 00:02:47,280 --> 00:02:49,720 Speaker 4: That's using Amazon to store the product, package the 50 00:02:49,760 --> 00:02:51,880 Speaker 4: product, and send the product when somebody buys it. So 51 00:02:51,919 --> 00:02:55,400 Speaker 4: it's really all about sort of unfair treatment of the 52 00:02:55,480 --> 00:02:56,919 Speaker 4: sellers that are on the marketplace. 53 00:02:57,040 --> 00:02:59,960 Speaker 2: And it's really important where you are in those Amazon 54 00:03:00,080 --> 00:03:03,000 Speaker 2: searches, because a lot of people don't get past 55 00:03:03,280 --> 00:03:07,320 Speaker 2: the first two or three. Lina Khan said Amazon is 56 00:03:07,320 --> 00:03:10,520 Speaker 2: a monopolist and is exploiting its monopolies in ways that 57 00:03:10,600 --> 00:03:14,160 Speaker 2: leave shoppers and sellers paying more for worse service. 
But 58 00:03:14,360 --> 00:03:19,000 Speaker 2: are shoppers paying more? Because I find that the prices 59 00:03:19,040 --> 00:03:23,400 Speaker 2: on Amazon are often cheaper than they are elsewhere. 60 00:03:23,680 --> 00:03:26,520 Speaker 4: Well, of course, that's exactly what Amazon's defense is going 61 00:03:26,600 --> 00:03:29,600 Speaker 4: to be: no, we offer convenience and speed and low prices. 62 00:03:29,639 --> 00:03:31,320 Speaker 4: And part of the reason that we have some of 63 00:03:31,320 --> 00:03:34,480 Speaker 4: our policies in place that require these sellers on our 64 00:03:34,520 --> 00:03:37,080 Speaker 4: platform to provide the lowest price on Amazon is because 65 00:03:37,080 --> 00:03:39,000 Speaker 4: that's what our consumers expect. 66 00:03:39,080 --> 00:03:40,080 Speaker 1: That's what we've 67 00:03:39,880 --> 00:03:42,160 Speaker 4: held ourselves out to be, the lowest price that you 68 00:03:42,200 --> 00:03:43,560 Speaker 4: can get out there in the marketplace. 69 00:03:43,600 --> 00:03:44,440 Speaker 1: And this is pro- 70 00:03:44,280 --> 00:03:47,840 Speaker 4: competitive, not anti-competitive. But on the other side, you know, 71 00:03:47,880 --> 00:03:50,320 Speaker 4: there may be some evidence that because it's a little 72 00:03:50,360 --> 00:03:53,080 Speaker 4: more expensive for a company to sell a product on 73 00:03:53,120 --> 00:03:56,000 Speaker 4: Amazon than some other website, let's say Etsy or their 74 00:03:56,040 --> 00:03:59,560 Speaker 4: own proprietary website, that they end up raising the price 75 00:03:59,600 --> 00:04:02,840 Speaker 4: outside of Amazon on those websites, because if they have 76 00:04:02,880 --> 00:04:04,880 Speaker 4: to offer the lowest price on Amazon, they don't want 77 00:04:04,920 --> 00:04:07,720 Speaker 4: to kill their margins selling on Amazon. So instead of 78 00:04:07,800 --> 00:04:11,120 Speaker 4: lowering their price everywhere, they're increasing it outside. 
So people 79 00:04:11,160 --> 00:04:14,160 Speaker 4: who bought that product outside of Amazon are actually paying 80 00:04:14,160 --> 00:04:16,200 Speaker 4: a higher price. And I think that's part of the 81 00:04:16,240 --> 00:04:18,320 Speaker 4: allegation of the increased prices. 82 00:04:18,960 --> 00:04:21,719 Speaker 2: Amazon is a monopolist. Is that accepted, that it's a 83 00:04:21,720 --> 00:04:22,600 Speaker 2: monopolist, or not? 84 00:04:22,800 --> 00:04:23,760 Speaker 1: You know, yes and no. 85 00:04:24,120 --> 00:04:26,640 Speaker 4: The thing about antitrust law is whether or not 86 00:04:26,720 --> 00:04:29,520 Speaker 4: you're a monopolist all depends on how the market's defined. 87 00:04:29,839 --> 00:04:31,760 Speaker 4: You have to look at the contours of the market. 88 00:04:31,800 --> 00:04:34,120 Speaker 4: What is the market you're competing in? Now, 89 00:04:34,279 --> 00:04:36,880 Speaker 1: I think the FTC in this lawsuit's basically 90 00:04:36,440 --> 00:04:39,719 Speaker 4: provided two market definitions. One is the consumer side. It's 91 00:04:39,760 --> 00:04:42,560 Speaker 4: a broad online marketplace to buy all sorts of goods, 92 00:04:42,800 --> 00:04:45,120 Speaker 4: and the other one, it's a broad online marketplace for 93 00:04:45,200 --> 00:04:48,240 Speaker 4: sellers to sell all their goods. Two pieces of the market. 94 00:04:48,320 --> 00:04:51,560 Speaker 4: And I suppose if that is accepted as the relevant 95 00:04:51,560 --> 00:04:54,560 Speaker 4: market by the court, then you might be able to 96 00:04:54,680 --> 00:04:57,119 Speaker 4: argue that they're a monopolist, that they have monopoly power. 97 00:04:57,160 --> 00:04:59,360 Speaker 4: But the fact of the matter is that I at 98 00:04:59,440 --> 00:05:02,320 Speaker 4: least haven't yet seen it defined that way. 
I've seen 99 00:05:02,960 --> 00:05:06,479 Speaker 4: market definitions like an online seller of ebooks, right, that 100 00:05:06,520 --> 00:05:09,359 Speaker 4: they have monopoly power in ebook sales, or they have 101 00:05:09,400 --> 00:05:12,599 Speaker 4: monopoly power in certain segments. Because for each of these 102 00:05:12,600 --> 00:05:16,280 Speaker 4: items that Amazon sells, there are alternatives, and those alternatives 103 00:05:16,360 --> 00:05:18,640 Speaker 4: may be different depending on what type of item it is. 104 00:05:18,880 --> 00:05:21,359 Speaker 4: They may be broader, they may be narrower, depending on 105 00:05:21,360 --> 00:05:23,520 Speaker 4: the items. So it's going to depend on how that 106 00:05:23,560 --> 00:05:26,120 Speaker 4: market's defined, and that's something the FTC is actually going 107 00:05:26,160 --> 00:05:28,120 Speaker 4: to have to support and prove in this trial, the 108 00:05:28,160 --> 00:05:29,520 Speaker 4: proper definition of the market. 109 00:05:29,800 --> 00:05:31,720 Speaker 2: I mean, if you had to be on one side 110 00:05:31,720 --> 00:05:33,680 Speaker 2: of this case, would you rather be on the FTC 111 00:05:33,920 --> 00:05:35,920 Speaker 2: side or would you rather be on Amazon's side? 112 00:05:36,040 --> 00:05:38,239 Speaker 4: You know, I think I'd rather be on Amazon's side. 113 00:05:38,279 --> 00:05:40,360 Speaker 4: And the reason I say that is because I think 114 00:05:40,360 --> 00:05:42,480 Speaker 4: this is a tough case for the FTC to win, 115 00:05:42,560 --> 00:05:45,000 Speaker 4: to start. And I'm not saying that they can't, because 116 00:05:45,080 --> 00:05:47,120 Speaker 4: facts matter, and we don't know what the facts and 117 00:05:47,160 --> 00:05:49,720 Speaker 4: the evidence are yet. 
It will matter what they prove 118 00:05:49,800 --> 00:05:52,880 Speaker 4: in court, what the experts say, and what the testimony is, 119 00:05:52,920 --> 00:05:55,599 Speaker 4: and what Amazon's documents look like. But I think the 120 00:05:55,720 --> 00:05:58,680 Speaker 4: reason I ultimately come down on Amazon's side is because 121 00:05:58,800 --> 00:06:03,039 Speaker 4: I don't think, if liability's proven, that ultimately those remedies 122 00:06:03,080 --> 00:06:07,279 Speaker 4: are going to be particularly drastic. I don't really see 123 00:06:07,279 --> 00:06:11,279 Speaker 4: a structural breakup. What I think will happen are behavioral changes. 124 00:06:11,400 --> 00:06:11,839 Speaker 1: Amazon, 125 00:06:11,920 --> 00:06:14,479 Speaker 4: you simply can't do this anymore. You can't push a 126 00:06:14,520 --> 00:06:17,320 Speaker 4: seller way down in the search results simply because they 127 00:06:17,360 --> 00:06:20,520 Speaker 4: don't use your fulfillment services, or you can't force them 128 00:06:20,560 --> 00:06:23,320 Speaker 4: to provide the lowest price on Amazon.com. And 129 00:06:23,480 --> 00:06:26,960 Speaker 4: very interestingly, Amazon's already promised to do most of that 130 00:06:27,040 --> 00:06:29,560 Speaker 4: in the UK and in the EU, so it clearly 131 00:06:29,640 --> 00:06:32,159 Speaker 4: is willing to change some of its rules and change 132 00:06:32,200 --> 00:06:34,479 Speaker 4: some of the way it treats sellers in order to 133 00:06:34,480 --> 00:06:37,320 Speaker 4: make these lawsuits go away. But the FTC didn't accept 134 00:06:37,320 --> 00:06:38,200 Speaker 4: those concessions. 135 00:06:38,360 --> 00:06:41,560 Speaker 2: But the FTC in this suit, unlike even mentioning it 136 00:06:41,680 --> 00:06:45,039 Speaker 2: in the Google suit, is not seeking a breakup of Amazon. 137 00:06:45,560 --> 00:06:46,960 Speaker 1: Well, we don't really know yet. 
138 00:06:47,040 --> 00:06:49,720 Speaker 4: I'd say that they haven't specifically said that, but their 139 00:06:49,800 --> 00:06:52,560 Speaker 4: language is vague. You know, they're asking for a permanent 140 00:06:52,560 --> 00:06:54,680 Speaker 4: injunction and they're asking for the court to do what 141 00:06:54,720 --> 00:06:58,479 Speaker 4: it needs to do to stop the monopolistic conduct. And 142 00:06:58,560 --> 00:07:01,360 Speaker 4: really what that means is it gives them leeway down 143 00:07:01,400 --> 00:07:04,719 Speaker 4: the road, after liability is proven, because that's obviously the 144 00:07:04,760 --> 00:07:08,000 Speaker 4: first step, to actually seek a remedy, which is a breakup. 145 00:07:08,320 --> 00:07:11,120 Speaker 4: And normally when these kinds of cases are brought by 146 00:07:11,160 --> 00:07:14,240 Speaker 4: the DOJ or by the FTC, they don't explicitly state 147 00:07:14,280 --> 00:07:17,120 Speaker 4: what they're looking for in a remedy early on. That 148 00:07:17,160 --> 00:07:19,600 Speaker 4: comes later, and I think they still could ask for 149 00:07:19,640 --> 00:07:20,400 Speaker 4: something like that. 150 00:07:21,160 --> 00:07:25,080 Speaker 2: So Amazon's general counsel said, if the FTC gets its way, 151 00:07:25,240 --> 00:07:28,960 Speaker 2: the result would be fewer products to choose from, higher prices, 152 00:07:29,160 --> 00:07:33,240 Speaker 2: slower deliveries for consumers, and reduced options for small businesses, 153 00:07:33,440 --> 00:07:36,360 Speaker 2: the opposite of what antitrust law is designed to do. 154 00:07:36,760 --> 00:07:39,440 Speaker 2: And that's what I'm talking about. How can you beat 155 00:07:39,480 --> 00:07:42,080 Speaker 2: Amazon for, like, getting it to you the next day? 156 00:07:42,600 --> 00:07:44,200 Speaker 4: Well, you know, this is why I say, I think 157 00:07:44,200 --> 00:07:46,679 Speaker 4: this is an uphill climb for the FTC. 
It's really 158 00:07:46,720 --> 00:07:50,840 Speaker 4: difficult to argue against the pro-competitive aspects of the company. 159 00:07:50,880 --> 00:07:54,400 Speaker 4: I mean, it has created a very efficient, very consumer- 160 00:07:54,480 --> 00:07:59,239 Speaker 4: friendly marketplace, essentially, that consumers love. They do get low prices, 161 00:07:59,280 --> 00:08:01,920 Speaker 4: they do get speedy delivery. It makes it very easy. 162 00:08:01,920 --> 00:08:04,840 Speaker 4: There's one-stop shopping, this is all... And returns. And 163 00:08:04,960 --> 00:08:07,640 Speaker 4: returns, exactly, and a lot of help, by the way, 164 00:08:07,680 --> 00:08:09,120 Speaker 4: when there's an issue with the product. 165 00:08:09,160 --> 00:08:10,600 Speaker 1: Because I've done that myself. 166 00:08:11,000 --> 00:08:14,760 Speaker 4: So I think that when you look at monopolization cases, 167 00:08:14,760 --> 00:08:15,520 Speaker 4: they're based 168 00:08:15,240 --> 00:08:18,120 Speaker 1: on a reasonableness standard. So what a judge has to do 169 00:08:18,400 --> 00:08:21,160 Speaker 4: is they have to weigh sort of the harm against 170 00:08:21,160 --> 00:08:24,480 Speaker 4: the pro-competitive side, and whichever side wins out, that's 171 00:08:24,520 --> 00:08:26,960 Speaker 4: where you land, whether you violate the law or not. 172 00:08:27,520 --> 00:08:30,560 Speaker 4: Is it a reasonable or unreasonable restraint of trade that 173 00:08:30,560 --> 00:08:33,520 Speaker 4: we're looking at? And I think that when you have 174 00:08:33,600 --> 00:08:37,240 Speaker 4: strong pro-competitive justifications for what you're doing, it makes 175 00:08:37,240 --> 00:08:40,199 Speaker 4: it harder for a plaintiff, even the FTC, to win 176 00:08:40,240 --> 00:08:42,280 Speaker 4: a case. And so I think you just put your 177 00:08:42,360 --> 00:08:45,080 Speaker 4: finger right on what the issue is here. 
178 00:08:45,559 --> 00:08:48,280 Speaker 2: Is this the third time that the FTC is suing 179 00:08:48,320 --> 00:08:50,560 Speaker 2: Amazon this year, or recently? 180 00:08:50,440 --> 00:08:51,240 Speaker 1: Something like that. 181 00:08:51,280 --> 00:08:53,920 Speaker 4: The other suits are all consumer protection suits, so this 182 00:08:54,000 --> 00:08:57,559 Speaker 4: is their first antitrust suit against Amazon. The others 183 00:08:57,559 --> 00:09:00,560 Speaker 4: were on the consumer protection side, quite different from this. 184 00:09:01,200 --> 00:09:05,480 Speaker 2: Fewer states joined the Amazon suit than the Justice Department 185 00:09:05,679 --> 00:09:10,920 Speaker 2: suit against Google or the FTC's earlier suit against Meta. 186 00:09:11,000 --> 00:09:12,000 Speaker 2: Is there a reason for that? 187 00:09:12,920 --> 00:09:14,280 Speaker 1: You know, it's very difficult to say. 188 00:09:14,320 --> 00:09:17,400 Speaker 4: The states probably were all asked, and they weigh the 189 00:09:17,440 --> 00:09:19,800 Speaker 4: pros and cons of being part of a suit. They 190 00:09:19,920 --> 00:09:23,000 Speaker 4: tend to be more political, the state attorneys general, let's say, 191 00:09:23,000 --> 00:09:25,400 Speaker 4: than the FTC, I think, and some of them may 192 00:09:25,480 --> 00:09:29,440 Speaker 4: be more concerned about consumer perception here, that consumers tend 193 00:09:29,440 --> 00:09:32,640 Speaker 4: to really like Amazon, whereas I think the perception about 194 00:09:32,640 --> 00:09:35,560 Speaker 4: Google is maybe less positive generally when you look at 195 00:09:35,640 --> 00:09:38,719 Speaker 4: consumer polls, and they also do see the pro-competitive 196 00:09:38,800 --> 00:09:42,480 Speaker 4: side of Amazon's business. 
So I think that's probably why 197 00:09:42,520 --> 00:09:46,080 Speaker 4: you see fewer states joining here than in the Google case, 198 00:09:46,120 --> 00:09:48,240 Speaker 4: but they could still join going forward. 199 00:09:48,720 --> 00:09:50,760 Speaker 2: You know, it's been in business thirty years. Do you 200 00:09:50,800 --> 00:09:53,920 Speaker 2: think Amazon sees this as a real threat to 201 00:09:54,040 --> 00:09:54,800 Speaker 2: its business? 202 00:09:55,400 --> 00:09:57,840 Speaker 4: You know, I think that anytime you're sued by the 203 00:09:57,840 --> 00:10:01,800 Speaker 4: Federal Trade Commission, it's a risk, right? And your documents, 204 00:10:01,840 --> 00:10:05,760 Speaker 4: your information, your executives are exposed in what's a mostly public trial. 205 00:10:06,120 --> 00:10:09,120 Speaker 4: It's always a risk, and it can bring bad PR 206 00:10:09,400 --> 00:10:12,040 Speaker 4: to the company. You know, they certainly have to be 207 00:10:12,080 --> 00:10:15,120 Speaker 4: somewhat unhappy about it, but I don't think they view 208 00:10:15,160 --> 00:10:18,280 Speaker 4: it as a long-term major risk to the company. 209 00:10:19,120 --> 00:10:22,720 Speaker 2: Talking about Lina Khan, they have tried to get 210 00:10:22,720 --> 00:10:27,720 Speaker 2: her recused from the cases against them, right? Yes, didn't work. 211 00:10:27,720 --> 00:10:28,840 Speaker 1: You know, it didn't work. 212 00:10:29,080 --> 00:10:31,080 Speaker 4: So they tried to get her recused because it 213 00:10:31,120 --> 00:10:33,400 Speaker 4: was clear what her position was about Amazon back in 214 00:10:33,400 --> 00:10:37,040 Speaker 4: twenty seventeen. And that's why we've all expected this lawsuit 215 00:10:37,040 --> 00:10:39,760 Speaker 4: for so long, because we knew that her view was 216 00:10:39,800 --> 00:10:43,440 Speaker 4: that this company behaves illegally in an anti-competitive manner. 
217 00:10:43,720 --> 00:10:46,040 Speaker 4: The issue is that she's not the one who's going 218 00:10:46,120 --> 00:10:48,839 Speaker 4: to make this decision. This is in federal court, so 219 00:10:48,880 --> 00:10:51,120 Speaker 4: she's acting as a prosecutor, and when you act as 220 00:10:51,120 --> 00:10:52,800 Speaker 4: a prosecutor, that is what you do. 221 00:10:53,240 --> 00:10:54,319 Speaker 1: You know, if she 222 00:10:54,320 --> 00:10:56,680 Speaker 4: did a new investigation of Amazon looking at the same 223 00:10:56,720 --> 00:10:58,680 Speaker 4: facts that she looked at three years ago, I don't 224 00:10:58,679 --> 00:11:01,840 Speaker 4: think she'd come out any differently. Where a lawsuit is 225 00:11:01,840 --> 00:11:04,560 Speaker 4: brought internally at the FTC, that can be a different 226 00:11:04,640 --> 00:11:08,240 Speaker 4: matter, because ultimately the commissioners are the appellate panel for that: 227 00:11:08,600 --> 00:11:11,560 Speaker 4: an administrative law judge makes the first decision and 228 00:11:11,480 --> 00:11:12,320 Speaker 1: then the commissioners. 229 00:11:12,360 --> 00:11:14,439 Speaker 4: The appeal goes to the commissioners and they are acting 230 00:11:14,480 --> 00:11:17,280 Speaker 4: as judge. And in that kind of an instance, I 231 00:11:17,280 --> 00:11:20,199 Speaker 4: think it would have been different with respect to recusal. 232 00:11:20,240 --> 00:11:22,520 Speaker 4: But in this instance, where it's out of her hands 233 00:11:22,520 --> 00:11:27,120 Speaker 4: now, she's just prosecuting, presenting the facts, presenting evidence. Some 234 00:11:27,360 --> 00:11:30,720 Speaker 4: other party, the judge in this case, will make the decision. 235 00:11:31,200 --> 00:11:32,840 Speaker 4: I think it's less impactful. 
236 00:11:33,920 --> 00:11:38,280 Speaker 2: Earlier this year, the FTC challenged Meta acquiring the virtual 237 00:11:38,440 --> 00:11:42,560 Speaker 2: reality company Within, and lost that. The FTC also lost a 238 00:11:42,600 --> 00:11:47,360 Speaker 2: similar suit attempting to block Microsoft's acquisition of Activision. So 239 00:11:47,960 --> 00:11:52,280 Speaker 2: how much will this case define Lina Khan's career? 240 00:11:52,760 --> 00:11:55,000 Speaker 4: You know, I think those two cases are quite different 241 00:11:55,320 --> 00:11:58,480 Speaker 4: from this one, different standards. They were pursued under a 242 00:11:58,520 --> 00:12:01,080 Speaker 4: different antitrust statute, and to be fair, by the way, 243 00:12:01,480 --> 00:12:04,480 Speaker 4: the FTC is still appealing the Microsoft-Activision decision, so 244 00:12:04,520 --> 00:12:06,880 Speaker 4: it's not completely finished yet, even though I do think 245 00:12:06,920 --> 00:12:09,160 Speaker 4: the companies will be able to close that deal. I 246 00:12:09,240 --> 00:12:13,000 Speaker 4: think this one, this Amazon case, will define it 247 00:12:13,080 --> 00:12:15,560 Speaker 4: more so than those two, because it's just a long 248 00:12:15,600 --> 00:12:19,040 Speaker 4: time coming, and it's been her goal to try to 249 00:12:19,160 --> 00:12:23,079 Speaker 4: pull the way antitrust laws have been interpreted back to 250 00:12:23,120 --> 00:12:26,320 Speaker 4: the way they were interpreted more like in the nineteen sixties. 251 00:12:26,520 --> 00:12:28,400 Speaker 4: You know, there was a big change in the nineteen 252 00:12:28,440 --> 00:12:31,240 Speaker 4: seventies and nineteen eighties in the way the antitrust laws 253 00:12:31,320 --> 00:12:34,960 Speaker 4: were interpreted and then in the way that enforcement played out. 
254 00:12:35,559 --> 00:12:39,120 Speaker 4: And her view is that because of that change, anti- 255 00:12:39,120 --> 00:12:42,040 Speaker 4: trust enforcement became too lenient and too lax, and that 256 00:12:42,080 --> 00:12:45,200 Speaker 4: we have to go back to where we were sometime 257 00:12:45,240 --> 00:12:49,040 Speaker 4: before the nineteen seventies, where we really looked at market structure 258 00:12:49,320 --> 00:12:52,239 Speaker 4: and just made a presumption that if a market was concentrated, 259 00:12:52,280 --> 00:12:55,360 Speaker 4: it was likely to cause harm to consumers, rather than 260 00:12:55,400 --> 00:12:58,559 Speaker 4: looking at whether prices to consumers are going up or 261 00:12:58,600 --> 00:13:02,640 Speaker 4: output is getting low, but putting the structure aside. So 262 00:13:02,760 --> 00:13:06,120 Speaker 4: even if it's an oligopolistic structure, so long as prices 263 00:13:06,160 --> 00:13:09,240 Speaker 4: to consumers are low and output stays up, we're okay. 264 00:13:09,600 --> 00:13:11,520 Speaker 4: And she's trying to bring it back to where it was. 265 00:13:11,559 --> 00:13:13,720 Speaker 4: And I think this suit is a big step in 266 00:13:13,760 --> 00:13:16,640 Speaker 4: that direction, as those two merger suits were, but they 267 00:13:16,679 --> 00:13:17,800 Speaker 4: were sort of baby steps. 268 00:13:17,800 --> 00:13:19,920 Speaker 2: This is a big step, and we'll see just how 269 00:13:19,960 --> 00:13:24,440 Speaker 2: this big step goes for the FTC. Thanks so much, Jen. 270 00:13:25,080 --> 00:13:28,880 Speaker 2: That's Bloomberg Intelligence senior litigation analyst, Jennifer Rie. 271 00:13:29,440 --> 00:13:32,720 Speaker 5: I will be working alongside humans to provide assistance and 272 00:13:32,760 --> 00:13:37,200 Speaker 5: support and will not be replacing any existing jobs. Sure 273 00:13:37,240 --> 00:13:40,560 Speaker 5: about that, Gus? Yes, I am sure. 
274 00:13:40,920 --> 00:13:44,800 Speaker 2: It was the first human-robot press conference. In July, 275 00:13:44,960 --> 00:13:49,239 Speaker 2: a United Nations tech agency assembled a group of robots 276 00:13:49,320 --> 00:13:53,040 Speaker 2: that look like humans to answer reporters' questions about the 277 00:13:53,080 --> 00:13:55,080 Speaker 2: future of artificial intelligence. 278 00:13:55,559 --> 00:13:58,360 Speaker 6: I think my great moment will be when people realize 279 00:13:58,400 --> 00:14:01,120 Speaker 6: that robots like me can be used to help improve 280 00:14:01,200 --> 00:14:04,240 Speaker 6: our lives and make the world a better place. I 281 00:14:04,320 --> 00:14:06,800 Speaker 6: believe it's only a matter of time before we see 282 00:14:06,840 --> 00:14:10,480 Speaker 6: thousands of robots just like me out there making a difference. 283 00:14:10,480 --> 00:14:14,280 Speaker 2: That time is already here in many respects. Apple's Siri 284 00:14:14,400 --> 00:14:17,600 Speaker 2: has been responding to your questions for more than a decade, 285 00:14:17,760 --> 00:14:21,520 Speaker 2: and the release last year of ChatGPT has opened 286 00:14:21,520 --> 00:14:25,440 Speaker 2: a worldwide debate about artificial intelligence and led to the 287 00:14:25,440 --> 00:14:29,640 Speaker 2: filing of lawsuits over intellectual property rights. The latest, a 288 00:14:29,720 --> 00:14:33,560 Speaker 2: proposed class action by more than a dozen well-known authors, 289 00:14:33,600 --> 00:14:37,680 Speaker 2: including John Grisham and George R. R. Martin, against Open- 290 00:14:37,720 --> 00:14:42,880 Speaker 2: AI for copyright infringement, calling its ChatGPT program a 291 00:14:42,920 --> 00:14:47,520 Speaker 2: massive commercial enterprise that relies on systematic theft on a 292 00:14:47,600 --> 00:14:51,560 Speaker 2: mass scale. 
My guest is intellectual property litigator Terence Ross, 293 00:14:51,600 --> 00:14:54,720 Speaker 2: a partner at Katten Muchin Rosenman. This is, I believe, 294 00:14:54,800 --> 00:15:00,400 Speaker 2: the third lawsuit like this over ChatGPT. What's 295 00:15:00,400 --> 00:15:01,000 Speaker 2: the complaint about? 296 00:15:01,680 --> 00:15:04,720 Speaker 7: So, this particular lawsuit is brought by the Authors Guild, 297 00:15:04,760 --> 00:15:09,840 Speaker 7: which is an association that represents authors for various reasons, and 298 00:15:09,960 --> 00:15:14,960 Speaker 7: here they have taken the position that OpenAI's ChatGPT, 299 00:15:15,480 --> 00:15:19,920 Speaker 7: which is an artificial intelligence system with a learning module 300 00:15:19,960 --> 00:15:23,400 Speaker 7: built in so that it can actually improve its ability 301 00:15:23,440 --> 00:15:26,720 Speaker 7: to function by learning. But the Authors Guild has alleged 302 00:15:26,800 --> 00:15:32,560 Speaker 7: that feeding ChatGPT the works of its authors 303 00:15:32,600 --> 00:15:36,160 Speaker 7: to help it learn, and then ChatGPT using 304 00:15:36,280 --> 00:15:41,640 Speaker 7: that to answer queries, which often involves quoting passages from 305 00:15:41,880 --> 00:15:46,040 Speaker 7: the authors' works, is a copyright infringement. And the lawsuit 306 00:15:46,120 --> 00:15:49,680 Speaker 7: is brought as a class action on behalf of a 307 00:15:49,840 --> 00:15:52,720 Speaker 7: very large class of authors here in the United States, 308 00:15:52,880 --> 00:15:55,240 Speaker 7: and it'll have to be determined at some subsequent date 309 00:15:55,320 --> 00:15:58,040 Speaker 7: during the litigation whether or not it's a legitimate class 310 00:15:58,040 --> 00:16:01,040 Speaker 7: action or not. 
But it's essentially all the authors 311 00:16:01,080 --> 00:16:04,200 Speaker 7: in the United States suing OpenAI, the owner of 312 00:16:04,280 --> 00:16:08,880 Speaker 7: ChatGPT, over how they're using the authors' works through 313 00:16:08,960 --> 00:16:09,600 Speaker 7: ChatGPT. 314 00:16:09,960 --> 00:16:13,560 Speaker 2: In another suit brought by authors last month, OpenAI 315 00:16:13,840 --> 00:16:17,360 Speaker 2: moved to dismiss the complaint and argued that the training 316 00:16:17,400 --> 00:16:20,400 Speaker 2: basically constitutes fair use. So it seems like their defense 317 00:16:20,440 --> 00:16:21,800 Speaker 2: is going to be fair use. 318 00:16:22,400 --> 00:16:26,800 Speaker 7: That's absolutely correct, and fair use is a statutory provision in 319 00:16:27,000 --> 00:16:29,880 Speaker 7: the Copyright Act of nineteen seventy six that provides that 320 00:16:30,000 --> 00:16:35,119 Speaker 7: in certain instances, a copyrighted work can be used for secondary 321 00:16:35,160 --> 00:16:40,400 Speaker 7: purposes that society deems as worthwhile and useful. You know, the 322 00:16:40,440 --> 00:16:44,040 Speaker 7: most important from your perspective is for news gathering and 323 00:16:44,160 --> 00:16:49,960 Speaker 7: news broadcasting. But teaching is one of the expressly listed 324 00:16:50,280 --> 00:16:55,080 Speaker 7: types of secondary uses that would be considered fair use. 325 00:16:55,360 --> 00:16:59,600 Speaker 7: And it's a very interesting defense. Now, the statute, Section 326 00:16:59,640 --> 00:17:03,840 Speaker 7: one oh seven of the Copyright Act, actually says nonprofit educational purposes, 327 00:17:04,400 --> 00:17:07,240 Speaker 7: and that may pose a problem to OpenAI because 328 00:17:07,240 --> 00:17:11,399 Speaker 7: I'm not sure if this would legitimately qualify as a nonprofit 329 00:17:11,680 --> 00:17:16,880 Speaker 7: educational purpose or not. But it's a colorable defense. 
330 00:17:16,960 --> 00:17:20,359 Speaker 2: Quite frankly, we talked last term about the Supreme Court's 331 00:17:20,359 --> 00:17:23,959 Speaker 2: decision in the Warhol case that reined in the scope 332 00:17:24,000 --> 00:17:27,240 Speaker 2: of fair use. So do you think that courts are 333 00:17:27,440 --> 00:17:31,399 Speaker 2: likely to maybe rein in fair use in this case? 334 00:17:32,560 --> 00:17:36,400 Speaker 7: So I'm not sure that the Warhol case is going 335 00:17:36,480 --> 00:17:39,480 Speaker 7: to have any impact on this case whatsoever. I think 336 00:17:39,480 --> 00:17:44,880 Speaker 7: the Warhol case does rein in the use of the 337 00:17:44,920 --> 00:17:49,120 Speaker 7: fair use defense on the margins. But as I understand 338 00:17:49,560 --> 00:17:52,679 Speaker 7: the defense being raised by OpenAI in these lawsuits, 339 00:17:52,960 --> 00:17:55,639 Speaker 7: it's really the core of fair use. You know, the 340 00:17:55,680 --> 00:18:00,520 Speaker 7: statute was intended to protect certain types of secondary uses, 341 00:18:00,840 --> 00:18:03,879 Speaker 7: and it listed examples of them. It's not all-encompassing, 342 00:18:03,960 --> 00:18:08,160 Speaker 7: it's not an exclusive list, but one of the expressly listed 343 00:18:08,200 --> 00:18:12,160 Speaker 7: purposes is nonprofit educational purposes, and therefore this is at 344 00:18:12,200 --> 00:18:17,040 Speaker 7: the core of the fair use doctrine, and I really 345 00:18:17,080 --> 00:18:21,680 Speaker 7: don't see how the Andy Warhol case will limit that in any way. 346 00:18:22,640 --> 00:18:27,720 Speaker 2: The authors claim that ChatGPT can produce works that 347 00:18:27,920 --> 00:18:31,760 Speaker 2: mimic their books, and there are businesses that sell prompts 348 00:18:31,880 --> 00:18:36,000 Speaker 2: allowing users to create what are essentially works of fan fiction. 349 00:18:36,760 --> 00:18:40,080 Speaker 7: That's exactly what the concern here is.
And in one sense, 350 00:18:40,119 --> 00:18:45,000 Speaker 7: the fair use defense being raised by OpenAI is really diversionary. 351 00:18:45,320 --> 00:18:47,520 Speaker 7: They are saying that they are feeding all of this 352 00:18:47,680 --> 00:18:51,280 Speaker 7: data into ChatGPT so that it learns, and part 353 00:18:51,280 --> 00:18:54,280 Speaker 7: of that process of feeding data into it is feeding 354 00:18:54,320 --> 00:18:59,560 Speaker 7: into the machine entire novels as well as fact-based 355 00:18:59,600 --> 00:19:03,200 Speaker 7: works, which have lower copyright protection. That's all being fed 356 00:19:03,200 --> 00:19:06,320 Speaker 7: in, purportedly so that the machine learns. The problem is 357 00:19:06,680 --> 00:19:10,240 Speaker 7: less with that, in my view, than what happens after 358 00:19:10,320 --> 00:19:13,960 Speaker 7: it learns. It's the output of ChatGPT that is 359 00:19:13,960 --> 00:19:18,480 Speaker 7: the problem. And when it, in response to queries, quotes 360 00:19:19,080 --> 00:19:23,399 Speaker 7: copyrighted works, or, on behalf of a query from 361 00:19:23,440 --> 00:19:28,960 Speaker 7: an individual, creates a purportedly original work that incorporates copyrighted 362 00:19:29,280 --> 00:19:33,600 Speaker 7: language or a large portion of a copyrighted work, that's copyright 363 00:19:33,680 --> 00:19:37,119 Speaker 7: infringement in the classic sense, straight-up copying, 364 00:19:37,480 --> 00:19:41,159 Speaker 7: and that is the core problem here: there is 365 00:19:41,280 --> 00:19:45,639 Speaker 7: no apparent restraint on ChatGPT from doing that in 366 00:19:45,680 --> 00:19:50,240 Speaker 7: response to queries from individual users of the machine.
367 00:19:50,320 --> 00:19:52,400 Speaker 2: And there's also a question of whether you can get 368 00:19:52,400 --> 00:19:57,400 Speaker 2: a copyright on works created by artificial intelligence, and there 369 00:19:57,480 --> 00:20:01,160 Speaker 2: was an interesting first-of-its-kind ruling on that by DC 370 00:20:01,359 --> 00:20:03,960 Speaker 2: federal Judge Beryl Howell. Tell us about that. 371 00:20:04,600 --> 00:20:09,000 Speaker 7: So the whole area of artificial intelligence is raising a 372 00:20:09,080 --> 00:20:12,800 Speaker 7: host of problems for intellectual property laws. The Authors Guild 373 00:20:12,880 --> 00:20:16,840 Speaker 7: lawsuit against OpenAI is one aspect of that. Another 374 00:20:16,920 --> 00:20:22,040 Speaker 7: aspect is whether the creations of ChatGPT or other 375 00:20:22,119 --> 00:20:25,800 Speaker 7: AI-type machines are protected by intellectual property. And so 376 00:20:26,119 --> 00:20:29,520 Speaker 7: a gentleman by the name of Stephen Thaler has an 377 00:20:29,600 --> 00:20:32,920 Speaker 7: AI system that he refers to as the Creativity Machine, 378 00:20:33,640 --> 00:20:39,359 Speaker 7: and it created, according to him, an autonomously generated piece 379 00:20:39,480 --> 00:20:43,760 Speaker 7: of visual art, i.e., a painting, which the machine 380 00:20:43,920 --> 00:20:48,080 Speaker 7: entitled A Recent Entrance to Paradise. It's actually quite an attractive painting.
381 00:20:48,520 --> 00:20:53,280 Speaker 7: And Mr. Thaler applied for copyright registration with the United 382 00:20:53,320 --> 00:20:58,480 Speaker 7: States Copyright Office on behalf of the Creativity Machine, reporting 383 00:20:58,520 --> 00:21:01,560 Speaker 7: to the Copyright Office that the painting had been autonomously 384 00:21:01,960 --> 00:21:06,760 Speaker 7: generated by his AI machine, and the Copyright Office rejected 385 00:21:06,800 --> 00:21:11,080 Speaker 7: that application on the grounds that copyright only protects creations 386 00:21:11,320 --> 00:21:15,960 Speaker 7: by human beings. Mr. Thaler appealed that ruling to the 387 00:21:16,080 --> 00:21:18,760 Speaker 7: United States District Court for the District of Columbia, where the 388 00:21:18,800 --> 00:21:22,119 Speaker 7: Copyright Office is based, and that court just issued a 389 00:21:22,240 --> 00:21:28,000 Speaker 7: very important opinion of first impression saying that artificial intelligence 390 00:21:28,240 --> 00:21:31,800 Speaker 7: is not entitled to claim copyright in anything it quote 391 00:21:31,880 --> 00:21:36,679 Speaker 7: unquote creates, going further to say that human authorship is 392 00:21:36,720 --> 00:21:40,520 Speaker 7: an essential and required part of a valid copyright claim. It's 393 00:21:40,560 --> 00:21:43,760 Speaker 7: a very important decision in this field of artificial intelligence and 394 00:21:43,840 --> 00:21:46,880 Speaker 7: will almost certainly be appealed to the DC Circuit. 395 00:21:47,240 --> 00:21:49,680 Speaker 2: I mean, was the judge on solid ground? 396 00:21:50,160 --> 00:21:54,080 Speaker 7: So Judge Howell's decision is actually really quite good. The 397 00:21:54,160 --> 00:21:58,080 Speaker 7: DC courts do not often see copyright cases. It's just 398 00:21:58,160 --> 00:22:00,760 Speaker 7: a simple fact. We've talked about this before.
The bulk of 399 00:22:00,840 --> 00:22:03,400 Speaker 7: copyright cases come from the Second Circuit in New York 400 00:22:03,560 --> 00:22:07,000 Speaker 7: and the Ninth Circuit in California, specifically Los Angeles. And so 401 00:22:07,040 --> 00:22:10,359 Speaker 7: this was unusual, and it's a really outstanding decision by Judge 402 00:22:10,400 --> 00:22:14,400 Speaker 7: Howell, in which she lays out the history of copyright 403 00:22:14,400 --> 00:22:18,119 Speaker 7: in the United States. So she points out that James 404 00:22:18,160 --> 00:22:23,800 Speaker 7: Madison in the Federalist Papers referred to authors as being persons. 405 00:22:24,400 --> 00:22:28,320 Speaker 7: And that's important because after we actually got the government 406 00:22:28,359 --> 00:22:31,520 Speaker 7: up and running, he was a congressman in the House of Representatives, 407 00:22:31,600 --> 00:22:35,080 Speaker 7: and he was on the committee that drafted the first Copyright Act. 408 00:22:35,480 --> 00:22:40,919 Speaker 7: That very first Copyright Act uses words like executor, administrator, administratrix, 409 00:22:40,960 --> 00:22:45,359 Speaker 7: he, she, which seems to imply that copyright is 410 00:22:45,400 --> 00:22:49,440 Speaker 7: held by humans. The nineteen oh nine Copyright Act expressly 411 00:22:49,560 --> 00:22:52,480 Speaker 7: described the rights accorded by copyright as going to 412 00:22:52,560 --> 00:22:55,479 Speaker 7: a person. The nineteen seventy six Act, the one we 413 00:22:55,520 --> 00:22:59,560 Speaker 7: currently operate under, has multiple references to people. For example, 414 00:22:59,600 --> 00:23:03,119 Speaker 7: Section two oh three of the Copyright Act says when an author 415 00:23:03,320 --> 00:23:06,600 Speaker 7: is dead, and that's a quote, that's implying that the 416 00:23:06,760 --> 00:23:10,760 Speaker 7: author has to be a human.
It uses the terms widow, widower, 417 00:23:11,040 --> 00:23:14,920 Speaker 7: surviving children in connection with succession of ownership of copyright. 418 00:23:15,040 --> 00:23:17,600 Speaker 7: These are all indications that she points to in the 419 00:23:17,720 --> 00:23:21,720 Speaker 7: history of copyright that indicate that copyright is limited to 420 00:23:21,800 --> 00:23:23,840 Speaker 7: human beings. Of course, we had that famous case a 421 00:23:23,840 --> 00:23:25,119 Speaker 7: couple of years ago, I think it was in 422 00:23:25,119 --> 00:23:30,160 Speaker 7: the Ninth Circuit, Naruto versus Slater, about a monkey selfie. 423 00:23:31,040 --> 00:23:34,960 Speaker 7: A monkey had somehow, I guess, accidentally triggered a camera 424 00:23:35,000 --> 00:23:38,080 Speaker 7: and taken a picture of himself. And the Ninth Circuit 425 00:23:38,080 --> 00:23:43,239 Speaker 7: said no, that doesn't work, that human beings have to 426 00:23:43,240 --> 00:23:46,560 Speaker 7: be the ones who get copyright. So there is this case 427 00:23:46,640 --> 00:23:49,240 Speaker 7: law, as well as the text of the language of 428 00:23:49,280 --> 00:23:52,119 Speaker 7: the statute, which supports this notion that it has to 429 00:23:52,160 --> 00:23:54,080 Speaker 7: be a human being. You know, we've had this 430 00:23:54,240 --> 00:23:58,760 Speaker 7: long history of copyright having to deal with new technology. 431 00:23:59,200 --> 00:24:02,320 Speaker 7: When photography first came around, there were lots of questions 432 00:24:02,359 --> 00:24:07,120 Speaker 7: about whether a photograph was copyrightable, and the Supreme Court said, yeah, 433 00:24:07,119 --> 00:24:10,720 Speaker 7: it is, because there's a human who's involved in controlling 434 00:24:10,760 --> 00:24:14,919 Speaker 7: the process and making decisions like lighting, poses, you know, 435 00:24:15,040 --> 00:24:17,960 Speaker 7: how to develop it.
And that was sufficient human involvement 436 00:24:18,080 --> 00:24:22,440 Speaker 7: to justify photographs being copyrightable. And more recently, there's computer programming. 437 00:24:22,760 --> 00:24:26,800 Speaker 7: The human being writes something called source code, the computer then 438 00:24:26,840 --> 00:24:31,399 Speaker 7: translates that into object code, which is quote unquote machine 439 00:24:31,440 --> 00:24:34,359 Speaker 7: readable code, so the computer can actually process the bits 440 00:24:34,400 --> 00:24:36,920 Speaker 7: and bytes, you know, one zero zero one one zero. 441 00:24:37,200 --> 00:24:41,200 Speaker 7: And the courts have said, well, that again involves human 442 00:24:41,560 --> 00:24:46,320 Speaker 7: development and control, and the computer is merely translating the 443 00:24:46,440 --> 00:24:50,760 Speaker 7: human actions into machine-readable code, and so that's copyrightable. 444 00:24:50,960 --> 00:24:54,080 Speaker 7: But now we're at a point where we're saying, at 445 00:24:54,160 --> 00:24:57,560 Speaker 7: least in this case, that there was no human interaction, 446 00:24:58,000 --> 00:25:02,480 Speaker 7: that this painting was autonomously generated by an AI machine. 447 00:25:02,840 --> 00:25:06,760 Speaker 7: And Judge Howell makes an important point that human activity 448 00:25:07,240 --> 00:25:10,280 Speaker 7: is required even when technology is used to some extent. 449 00:25:10,560 --> 00:25:12,800 Speaker 7: It raises a lot of questions for the future. But I 450 00:25:12,840 --> 00:25:16,280 Speaker 7: think it's a decision that should easily hold up on appeal. 451 00:25:16,320 --> 00:25:19,480 Speaker 2: Terry, I'm curious. So when Thaler made the application to the 452 00:25:19,520 --> 00:25:22,560 Speaker 2: Copyright Office, was it in his name? Was he asking 453 00:25:22,600 --> 00:25:24,160 Speaker 2: for a copyright for himself?
454 00:25:24,400 --> 00:25:27,280 Speaker 7: No, he was asking for a copyright on behalf of 455 00:25:27,320 --> 00:25:30,959 Speaker 7: his Creativity Machine. He was asking for a copyright on 456 00:25:31,040 --> 00:25:35,919 Speaker 7: behalf of the AI. Now, an interesting thing happened after this 457 00:25:36,080 --> 00:25:38,359 Speaker 7: case came out of the Copyright Office and was taken 458 00:25:38,840 --> 00:25:41,240 Speaker 7: to the District Court in DC. All of a sudden, 459 00:25:41,240 --> 00:25:45,480 Speaker 7: Mr. Thaler started talking about how he issued prompts to 460 00:25:45,560 --> 00:25:49,840 Speaker 7: the AI machine to do this, and Judge Howell said, well, 461 00:25:49,920 --> 00:25:52,159 Speaker 7: that's not the record before me. The record before me 462 00:25:52,240 --> 00:25:54,760 Speaker 7: is you reported to the Copyright Office that the AI 463 00:25:54,880 --> 00:25:58,960 Speaker 7: autonomously generated this painting, and under that set of facts 464 00:25:59,119 --> 00:26:02,760 Speaker 7: there's no copyright, because machines can't get copyrights, only 465 00:26:02,800 --> 00:26:06,000 Speaker 7: humans can. But it leaves open this question, and 466 00:26:06,040 --> 00:26:08,840 Speaker 7: one of my colleagues here has argued that there 467 00:26:08,840 --> 00:26:11,560 Speaker 7: may be a point at which there are so many 468 00:26:11,680 --> 00:26:16,760 Speaker 7: human prompts to the AI that the resulting work is 469 00:26:16,880 --> 00:26:19,840 Speaker 7: copyrightable, because it fits into this notion in the history 470 00:26:19,840 --> 00:26:23,160 Speaker 7: of copyright that some human activity and control is required. Now, 471 00:26:23,160 --> 00:26:26,439 Speaker 7: the question is how much of that is going to 472 00:26:26,440 --> 00:26:28,920 Speaker 7: be required in the context of AI. One or a few prompts 473 00:26:28,920 --> 00:26:30,920 Speaker 7: probably isn't going to be good enough.
So talking about 474 00:26:30,920 --> 00:26:34,000 Speaker 7: the Authors Guild case, if some high school student says, 475 00:26:34,240 --> 00:26:37,639 Speaker 7: write an essay for me, and the machine writes the essay, 476 00:26:39,320 --> 00:26:43,480 Speaker 7: is that a sufficient human prompt? Probably not. And 477 00:26:43,560 --> 00:26:46,600 Speaker 7: to the extent that it then just regurgitates something from 478 00:26:46,680 --> 00:26:49,720 Speaker 7: a copyrighted novel that's out there, such as John Grisham's work, 479 00:26:50,080 --> 00:26:52,399 Speaker 7: that would indeed be copyright infringement on the part of 480 00:26:52,440 --> 00:26:55,960 Speaker 7: both the student and the AI machine. So I don't 481 00:26:55,960 --> 00:26:57,800 Speaker 7: know where the courts are going to draw that line, 482 00:26:58,119 --> 00:27:00,440 Speaker 7: or whether they will at all, whether they'll say 483 00:27:00,440 --> 00:27:03,000 Speaker 7: no, AI is never going to be entitled to copyright 484 00:27:03,000 --> 00:27:05,800 Speaker 7: registration, or at some point they'll say, well, if 485 00:27:05,840 --> 00:27:08,760 Speaker 7: you give it a thousand commands, maybe that's sufficient to 486 00:27:08,800 --> 00:27:11,880 Speaker 7: get you, not the AI but the individual who 487 00:27:11,880 --> 00:27:13,560 Speaker 7: gave the commands, a copyright registration. 488 00:27:13,960 --> 00:27:16,800 Speaker 2: Have there been a lot of other federal court judges 489 00:27:16,920 --> 00:27:19,360 Speaker 2: ruling on AI and copyright? 490 00:27:19,080 --> 00:27:21,320 Speaker 7: This is the first of its kind. We lawyers call 491 00:27:21,359 --> 00:27:23,879 Speaker 7: it a case of first impression.
And why it's so 492 00:27:24,000 --> 00:27:27,400 Speaker 7: unusual that it's here in the District of Columbia instead 493 00:27:27,440 --> 00:27:31,359 Speaker 7: of New York or the Central District of California in Los Angeles 494 00:27:31,680 --> 00:27:34,800 Speaker 7: is that the Copyright Office is located here in DC, 495 00:27:35,040 --> 00:27:38,320 Speaker 7: and so you take appeals from their decisions to the 496 00:27:38,440 --> 00:27:41,760 Speaker 7: DC District Court, and it works itself up to the 497 00:27:41,880 --> 00:27:46,320 Speaker 7: DC Circuit Court, which gets a copyright case maybe every 498 00:27:46,359 --> 00:27:49,800 Speaker 7: other year. So again, they'll be writing on a blank slate, 499 00:27:50,040 --> 00:27:51,479 Speaker 7: and I assume this will go all the way up 500 00:27:51,480 --> 00:27:53,800 Speaker 7: to the Supreme Court eventually, and we'll see what they say. 501 00:27:53,920 --> 00:27:57,280 Speaker 2: Coming up next, I'll continue this conversation with Terence Ross 502 00:27:57,320 --> 00:28:01,320 Speaker 2: and we'll talk about patents and artificial intelligence. I'm June Grosso, 503 00:28:01,400 --> 00:28:04,439 Speaker 2: and you're listening to Bloomberg. I've been talking to Terence 504 00:28:04,520 --> 00:28:09,960 Speaker 2: Ross of Katten Muchin Rosenman about intellectual property and artificial intelligence. 505 00:28:10,440 --> 00:28:13,200 Speaker 2: That's copyright. How is patent law handling AI? 506 00:28:13,600 --> 00:28:17,000 Speaker 7: The issues of artificial intelligence are not limited to the 507 00:28:17,000 --> 00:28:23,159 Speaker 7: copyright realm.
The same gentleman whose AI-created work was 508 00:28:23,200 --> 00:28:27,800 Speaker 7: involved here, Stephen Thaler, had previously applied for a patent 509 00:28:27,840 --> 00:28:31,240 Speaker 7: with the United States Patent and Trademark Office on what 510 00:28:31,359 --> 00:28:36,560 Speaker 7: he claimed was an invention developed by an artificial intelligence, 511 00:28:37,200 --> 00:28:41,280 Speaker 7: and the Patent Office followed a similar route to the Copyright 512 00:28:41,320 --> 00:28:46,200 Speaker 7: Office and said that no, machines can't be inventors for 513 00:28:46,280 --> 00:28:50,560 Speaker 7: purposes of the patent laws, only humans can, and rejected 514 00:28:50,840 --> 00:28:54,240 Speaker 7: the patent application. He took that on appeal to the 515 00:28:54,440 --> 00:28:57,440 Speaker 7: United States Court of Appeals for the Federal Circuit, which 516 00:28:57,480 --> 00:29:01,400 Speaker 7: is where you appeal decisions of the Patent Office, 517 00:29:01,800 --> 00:29:05,800 Speaker 7: and the Federal Circuit agreed with the United States 518 00:29:05,800 --> 00:29:09,440 Speaker 7: Patent Office and said only humans can be inventors for 519 00:29:09,680 --> 00:29:14,000 Speaker 7: purposes of patents, and affirmed the decision of the Patent Office. 520 00:29:14,120 --> 00:29:19,160 Speaker 7: Mr. Thaler, apparently having unlimited resources, then filed a petition 521 00:29:19,240 --> 00:29:21,520 Speaker 7: for writ of certiorari on that decision with the United 522 00:29:21,520 --> 00:29:26,320 Speaker 7: States Supreme Court, which just this past spring denied that petition. 523 00:29:26,520 --> 00:29:30,240 Speaker 7: So the law on the patent side is the same 524 00:29:30,280 --> 00:29:33,560 Speaker 7: as on the copyright side: humans are the only ones 525 00:29:33,600 --> 00:29:36,920 Speaker 7: who can be listed as inventors on patents. And this 526 00:29:37,320 --> 00:29:39,360 Speaker 7: goes back a long time.
Every now and then 527 00:29:39,400 --> 00:29:42,600 Speaker 7: corporations will make the mistake of applying for a patent 528 00:29:42,920 --> 00:29:45,560 Speaker 7: in the corporate name, and the Patent Office says no, no, no. 529 00:29:45,600 --> 00:29:47,640 Speaker 7: Only the inventor, the human, can. They can then 530 00:29:47,680 --> 00:29:52,200 Speaker 7: assign the patent to the corporation, but corporations can't be inventors, 531 00:29:52,200 --> 00:29:57,600 Speaker 7: only individuals can. So it's consistent with past Patent Office practice, 532 00:29:57,920 --> 00:30:00,840 Speaker 7: it's consistent with the approach taken in copyright. So 533 00:30:01,080 --> 00:30:06,080 Speaker 7: we're starting to see this evolution within intellectual property that 534 00:30:06,560 --> 00:30:11,680 Speaker 7: comes to the same point, which is that only humans 535 00:30:11,720 --> 00:30:14,760 Speaker 7: are entitled to intellectual property rights in the United States. 536 00:30:15,040 --> 00:30:18,520 Speaker 2: What about these robots that have the characteristics of humans? 537 00:30:19,080 --> 00:30:20,440 Speaker 2: Can you get a patent on those? 538 00:30:21,280 --> 00:30:27,920 Speaker 7: Inventors can get patents on an artificial intelligence system, just 539 00:30:28,000 --> 00:30:33,440 Speaker 7: as they can get a patent on most other computer software. 540 00:30:34,240 --> 00:30:36,880 Speaker 7: AI is usually more than just software. It's usually some 541 00:30:36,920 --> 00:30:40,120 Speaker 7: sort of system, so there may be a method patent 542 00:30:40,160 --> 00:30:43,080 Speaker 7: as well as a utility patent on it. In fact, I'm 543 00:30:43,240 --> 00:30:48,160 Speaker 7: actually representing someone right now who has obtained a patent 544 00:30:48,200 --> 00:30:52,680 Speaker 7: in the AI field. The really interesting question, though, 545 00:30:52,760 --> 00:30:56,480 Speaker 7: is this.
We're seeing the use of artificial 546 00:30:56,520 --> 00:31:02,200 Speaker 7: intelligence in a lot of activities that traditionally are considered creative. 547 00:31:02,760 --> 00:31:09,200 Speaker 7: One example is in coding, writing software programs. There are now 548 00:31:09,720 --> 00:31:15,120 Speaker 7: artificial intelligence programs that help programmers write code. You 549 00:31:15,360 --> 00:31:17,680 Speaker 7: set up a framework, you say, I want to do 550 00:31:17,800 --> 00:31:21,360 Speaker 7: this in this section of the program, and the AI 551 00:31:21,440 --> 00:31:26,720 Speaker 7: writes that code. Similarly, in Hollywood, in movies and 552 00:31:26,760 --> 00:31:33,720 Speaker 7: television, we see artificial intelligence doing special effects and being 553 00:31:33,760 --> 00:31:37,480 Speaker 7: involved in some elements of a production. Indeed, the recent 554 00:31:37,520 --> 00:31:40,800 Speaker 7: writers' strike and actors' strike had at their core a demand 555 00:31:40,800 --> 00:31:45,120 Speaker 7: by the unions that AI be prohibited in movies and television. 556 00:31:45,520 --> 00:31:48,440 Speaker 7: But in both of these contexts, the use of AI 557 00:31:48,600 --> 00:31:50,560 Speaker 7: to help in coding and the use of AI to help in 558 00:31:50,600 --> 00:31:55,720 Speaker 7: film and television, now that we know that AI-generated 559 00:31:56,040 --> 00:31:59,880 Speaker 7: work is not copyrightable, what impact does that legally have 560 00:32:00,840 --> 00:32:05,600 Speaker 7: upon the computer programs, the motion pictures, the television shows 561 00:32:05,920 --> 00:32:09,960 Speaker 7: that have used artificial intelligence as part of the creative process?
562 00:32:10,160 --> 00:32:14,160 Speaker 7: All of a sudden, is Star Wars deprived of its 563 00:32:14,280 --> 00:32:20,040 Speaker 7: copyright because an artificial intelligence machine helped on portions of it? 564 00:32:20,080 --> 00:32:23,600 Speaker 7: Or a software program or a video game, which is 565 00:32:23,680 --> 00:32:26,760 Speaker 7: essentially a software program: because they have used AI 566 00:32:26,920 --> 00:32:29,480 Speaker 7: to help in the coding, have they lost their copyright? 567 00:32:29,680 --> 00:32:32,440 Speaker 7: I mean, this is a really important legal issue that 568 00:32:32,560 --> 00:32:36,560 Speaker 7: a lot of companies rushing to use AI have not 569 00:32:36,720 --> 00:32:40,560 Speaker 7: considered: the question of whether, in the act of employing AI 570 00:32:41,160 --> 00:32:44,080 Speaker 7: to help in the creation of your work, you're losing the 571 00:32:44,080 --> 00:32:47,400 Speaker 7: ability to obtain copyrights. I mean, for studios, 572 00:32:47,640 --> 00:32:50,840 Speaker 7: this should be something that's on the front page of 573 00:32:51,040 --> 00:32:54,320 Speaker 7: their agenda of things to think about. This has to 574 00:32:54,320 --> 00:32:56,960 Speaker 7: be on their radar screen. I have never understood why they 575 00:32:56,960 --> 00:33:00,480 Speaker 7: were so opposed to the union demands on artificial intelligence, 576 00:33:00,520 --> 00:33:04,360 Speaker 7: because in effect, by incorporating AI, they're taking away 577 00:33:04,480 --> 00:33:08,040 Speaker 7: the most valuable asset they have, their copyrights in their 578 00:33:08,040 --> 00:33:10,600 Speaker 7: motion pictures, and it's something that you don't hear much 579 00:33:10,600 --> 00:33:11,280 Speaker 7: discussion about. 580 00:33:11,720 --> 00:33:13,880 Speaker 2: Maybe that will be the next topic we hear a 581 00:33:13,880 --> 00:33:17,600 Speaker 2: lot about. Thanks so much, Terry.
That's Terence Ross of 582 00:33:17,720 --> 00:33:20,600 Speaker 2: Katten Muchin Rosenman, and that's it for this edition of 583 00:33:20,600 --> 00:33:23,280 Speaker 2: the Bloomberg Law Show. Remember, you can always get the 584 00:33:23,320 --> 00:33:26,560 Speaker 2: latest legal news on our Bloomberg Law podcast. You can 585 00:33:26,600 --> 00:33:30,800 Speaker 2: find them on Apple Podcasts, Spotify, and at www dot 586 00:33:30,840 --> 00:33:35,040 Speaker 2: bloomberg dot com slash podcast slash law, and remember to 587 00:33:35,040 --> 00:33:38,120 Speaker 2: tune into The Bloomberg Law Show every weeknight at ten 588 00:33:38,160 --> 00:33:41,960 Speaker 2: pm Wall Street time. I'm June Grosso, and you're listening 589 00:33:42,040 --> 00:33:42,720 Speaker 2: to Bloomberg.