1 00:00:02,440 --> 00:00:08,000 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. I'd like to welcome 2 00:00:08,000 --> 00:00:11,520 Speaker 1: our global Bloomberg television and radio audiences. I'm joined in 3 00:00:11,600 --> 00:00:15,800 Speaker 1: conversation by Democratic Senator Amy Klobuchar of Minnesota, live from 4 00:00:15,840 --> 00:00:19,079 Speaker 1: Capitol Hill. Senator, thank you very much for being here. 5 00:00:19,120 --> 00:00:22,159 Speaker 1: You are leading a subcommittee hearing today titled Continuing a 6 00:00:22,239 --> 00:00:27,159 Speaker 1: Bipartisan Path Forward for Antitrust Enforcement and Reform. And I 7 00:00:27,200 --> 00:00:30,520 Speaker 1: do wonder, with what will in January become a Republican-controlled 8 00:00:30,600 --> 00:00:33,680 Speaker 1: Congress and White House, what that 9 00:00:33,800 --> 00:00:35,760 Speaker 1: bipartisan path actually looks like. 10 00:00:36,680 --> 00:00:39,040 Speaker 2: Okay, well, first of all, thanks for having me on, Kailey. 11 00:00:39,080 --> 00:00:41,680 Speaker 2: It's great to be on. And I think what you've 12 00:00:41,720 --> 00:00:45,600 Speaker 2: seen in this area of antitrust, you've seen some broad 13 00:00:45,640 --> 00:00:50,520 Speaker 2: bipartisan agreement on a number of these cases. Right. For instance, 14 00:00:50,640 --> 00:00:55,200 Speaker 2: the Google case started under the Trump administration, the first 15 00:00:55,240 --> 00:00:58,440 Speaker 2: Trump administration, as did the Facebook case that was at 16 00:00:58,480 --> 00:01:02,080 Speaker 2: the FTC. Google was at the Justice Department. Then it 17 00:01:02,160 --> 00:01:07,480 Speaker 2: proceeded through the Biden administration.
The Biden administration also brought some 18 00:01:07,600 --> 00:01:12,520 Speaker 2: major cases, for instance against Ticketmaster and other companies, and 19 00:01:12,560 --> 00:01:16,320 Speaker 2: so you've seen more aggressive antitrust enforcement over 20 00:01:16,360 --> 00:01:20,040 Speaker 2: the last few years. Now you have new appointees 21 00:01:20,160 --> 00:01:23,840 Speaker 2: going in place, and at least one of them, 22 00:01:23,959 --> 00:01:28,200 Speaker 2: over at the Justice Department, I have heard the belief 23 00:01:28,319 --> 00:01:32,240 Speaker 2: from some former Democratic antitrust enforcers that she knows 24 00:01:32,280 --> 00:01:34,720 Speaker 2: what she's doing. I'm looking forward to meeting with her, 25 00:01:34,840 --> 00:01:39,600 Speaker 2: Gail Slater, and I'm hoping that we will continue those cases. 26 00:01:40,200 --> 00:01:43,360 Speaker 2: I love competition, okay, I like capitalism. That's why I'm 27 00:01:43,400 --> 00:01:44,839 Speaker 2: for antitrust enforcement. 28 00:01:45,959 --> 00:01:49,160 Speaker 1: Well, you've mentioned Gail Slater. What about Andrew Ferguson? Do 29 00:01:49,240 --> 00:01:52,040 Speaker 1: you think he loves the issue of competition in the 30 00:01:52,080 --> 00:01:54,040 Speaker 1: same way that you do? Do you trust that the 31 00:01:54,120 --> 00:01:56,320 Speaker 1: FTC will be in good hands under his leadership? 32 00:01:56,840 --> 00:01:58,960 Speaker 2: You know, I am looking forward to meeting with him. 33 00:01:58,960 --> 00:02:01,960 Speaker 2: Interestingly enough, we do not confirm that position because he 34 00:02:02,080 --> 00:02:06,279 Speaker 2: was already on the commission. There is a new member 35 00:02:06,320 --> 00:02:09,280 Speaker 2: that is being appointed on the Republican side. I'll note 36 00:02:09,280 --> 00:02:12,320 Speaker 2: that he also wrote a piece on the breakup 37 00:02:12,360 --> 00:02:15,400 Speaker 2: of Ticketmaster in which he favored that.
So I just 38 00:02:15,440 --> 00:02:18,960 Speaker 2: think with antitrust, because at its core it is 39 00:02:18,960 --> 00:02:23,840 Speaker 2: about competition, and it's lagged for many decades, and 40 00:02:23,880 --> 00:02:26,680 Speaker 2: as a result, we're seeing more and more consolidation. It 41 00:02:26,720 --> 00:02:29,120 Speaker 2: isn't that big companies are bad. It's that sometimes when 42 00:02:29,160 --> 00:02:32,480 Speaker 2: you have no competition, then you start getting less innovation, 43 00:02:32,840 --> 00:02:36,600 Speaker 2: higher prices, et cetera, et cetera. So I'm actually 44 00:02:36,600 --> 00:02:38,760 Speaker 2: really excited about this. Senator Lee and I are doing 45 00:02:38,800 --> 00:02:42,000 Speaker 2: this hearing together. Senator Grassley and I have passed 46 00:02:42,000 --> 00:02:44,320 Speaker 2: our bill together, and I think you're going to continue 47 00:02:44,360 --> 00:02:47,240 Speaker 2: to see interest in tech. In fact, Senator Cruz and I 48 00:02:47,360 --> 00:02:51,800 Speaker 2: did a joint interview this morning on a bill that 49 00:02:51,840 --> 00:02:54,680 Speaker 2: we have gotten through the Senate on taking online porn 50 00:02:54,800 --> 00:02:58,200 Speaker 2: off the Internet. Different than antitrust. However, you're just 51 00:02:58,240 --> 00:03:00,960 Speaker 2: going to continue to see bipartisan work on tech. 52 00:03:02,240 --> 00:03:04,200 Speaker 1: Well, I'm glad that you have brought up the Take 53 00:03:04,200 --> 00:03:06,720 Speaker 1: It Down Act, which you co-sponsored with Senator Cruz. 54 00:03:07,200 --> 00:03:10,840 Speaker 1: Is that trying to tackle symptoms of an underlying disease 55 00:03:10,960 --> 00:03:13,680 Speaker 1: rather than the disease itself? And the disease I'm referring 56 00:03:13,720 --> 00:03:16,079 Speaker 1: to here is unregulated artificial intelligence.
57 00:03:17,320 --> 00:03:20,320 Speaker 2: Thanks. I really would like to put in some rules 58 00:03:20,360 --> 00:03:23,799 Speaker 2: of the road on AI. And this bill is actually 59 00:03:23,840 --> 00:03:28,360 Speaker 2: broader than just AI pornographic pictures. It actually also 60 00:03:28,520 --> 00:03:32,600 Speaker 2: covers real pictures as well as AI-created ones. We're now 61 00:03:32,600 --> 00:03:35,640 Speaker 2: seeing one in twelve Americans saying that they have been 62 00:03:35,680 --> 00:03:37,800 Speaker 2: a victim or know someone that's a victim of this. 63 00:03:38,160 --> 00:03:41,200 Speaker 2: We've had twenty suicides in one year of young kids. 64 00:03:41,480 --> 00:03:45,800 Speaker 2: Twenty suicides, because someone, a girlfriend, boyfriend, someone they knew, 65 00:03:46,040 --> 00:03:48,400 Speaker 2: put up their photo. They were embarrassed that their friends 66 00:03:48,440 --> 00:03:51,080 Speaker 2: and their family would know, and they killed themselves. These 67 00:03:51,080 --> 00:03:54,840 Speaker 2: are FBI statistics. So Senator Cruz and I came together 68 00:03:54,880 --> 00:03:57,000 Speaker 2: to do two things. One, make it clearly a 69 00:03:57,080 --> 00:04:01,600 Speaker 2: crime to use pornographic imagery of someone else, whether it's 70 00:04:01,680 --> 00:04:04,920 Speaker 2: AI-created or real. And then number two, that the 71 00:04:04,960 --> 00:04:07,240 Speaker 2: platforms have to take it down. That's why it's called 72 00:04:07,280 --> 00:04:10,440 Speaker 2: the Take It Down Act. They take down other violations 73 00:04:10,440 --> 00:04:13,280 Speaker 2: of intellectual property, and yet people can be 74 00:04:13,400 --> 00:04:16,680 Speaker 2: abused in this way to the point of committing suicide.
75 00:04:16,839 --> 00:04:21,680 Speaker 2: And in one case that I know of, Senator Cruz's case, 76 00:04:21,720 --> 00:04:24,280 Speaker 2: he actually had to call Snapchat to get the image 77 00:04:24,320 --> 00:04:27,760 Speaker 2: down after months of this victim from his state dealing 78 00:04:27,800 --> 00:04:29,080 Speaker 2: with it. That's just wrong. 79 00:04:30,920 --> 00:04:32,800 Speaker 1: Senator, I'd like to ask you about another one of 80 00:04:32,839 --> 00:04:36,280 Speaker 1: your colleagues. Your fellow Democrat, Senator Elizabeth Warren of Massachusetts, 81 00:04:36,320 --> 00:04:39,240 Speaker 1: wrote a letter, we understand, to President-elect Donald Trump 82 00:04:39,279 --> 00:04:42,479 Speaker 1: asking for firm conflict of interest rules to be put 83 00:04:42,520 --> 00:04:45,560 Speaker 1: into place related to Elon Musk, who, of course, has 84 00:04:45,560 --> 00:04:48,360 Speaker 1: been tapped to co-lead this new Department of Government Efficiency. 85 00:04:48,360 --> 00:04:50,800 Speaker 1: Have you had any conversations with Senator Warren about that, 86 00:04:50,880 --> 00:04:53,400 Speaker 1: or do you at the very least share in that sentiment? 87 00:04:54,480 --> 00:04:57,080 Speaker 2: No, I haven't seen this letter, but I will say 88 00:04:57,160 --> 00:05:01,119 Speaker 2: that I believe that we need conflict of interest rules 89 00:05:01,120 --> 00:05:04,800 Speaker 2: in place for people who are making major decisions in 90 00:05:04,839 --> 00:05:09,240 Speaker 2: the government. That is what our people have done voluntarily 91 00:05:10,160 --> 00:05:12,680 Speaker 2: for years now. And you have a number of very 92 00:05:12,720 --> 00:05:16,080 Speaker 2: wealthy people going into the Trump administration.
There have been wealthy 93 00:05:16,800 --> 00:05:19,719 Speaker 2: people as well under Democratic administrations, but you have a 94 00:05:19,839 --> 00:05:23,719 Speaker 2: number of them coming in, and we need the conflict 95 00:05:23,839 --> 00:05:26,280 Speaker 2: rules in force, and we need to know that the 96 00:05:26,320 --> 00:05:28,839 Speaker 2: decisions they are making are not for their own interest 97 00:05:28,920 --> 00:05:31,080 Speaker 2: but for the interests of the American people. And I 98 00:05:31,080 --> 00:05:33,600 Speaker 2: would hope that President-elect Trump agrees. 99 00:05:35,520 --> 00:05:39,400 Speaker 1: Finally, President-elect Trump yesterday met with the CEO of TikTok, 100 00:05:39,480 --> 00:05:41,520 Speaker 1: Shou Chew, at Mar-a-Lago, after saying in a 101 00:05:41,600 --> 00:05:44,359 Speaker 1: news conference, Senator, that he has a warm spot in 102 00:05:44,400 --> 00:05:46,160 Speaker 1: his heart for TikTok when asked if he would like 103 00:05:46,200 --> 00:05:48,320 Speaker 1: to see the ban go through or will try to 104 00:05:48,360 --> 00:05:50,279 Speaker 1: stop it. Given some of the issues we have already 105 00:05:50,320 --> 00:05:53,880 Speaker 1: discussed around technology in particular and what is propagated on 106 00:05:53,920 --> 00:05:56,360 Speaker 1: these platforms, what is your view about whether that ban 107 00:05:56,400 --> 00:05:58,960 Speaker 1: should be enforced come January? 108 00:05:59,200 --> 00:06:03,279 Speaker 2: Of course, it came out of Congress with strong bipartisan support. And 109 00:06:03,279 --> 00:06:06,880 Speaker 2: there are two avenues here. One is that they can 110 00:06:06,920 --> 00:06:10,240 Speaker 2: follow the law and divest and find a buyer for 111 00:06:10,320 --> 00:06:13,039 Speaker 2: the company, and the second is that they still are 112 00:06:13,120 --> 00:06:16,599 Speaker 2: appealing to the Supreme Court.
So my view has been 113 00:06:16,800 --> 00:06:19,520 Speaker 2: that we should have rules of the road in place, 114 00:06:19,600 --> 00:06:22,400 Speaker 2: by the way, for all platforms. I have been way 115 00:06:22,400 --> 00:06:24,840 Speaker 2: out there, as I think you know, in terms of 116 00:06:24,960 --> 00:06:31,240 Speaker 2: getting not just pornography off the Internet, but other very 117 00:06:31,360 --> 00:06:34,040 Speaker 2: very difficult things that are on there right now, and 118 00:06:34,040 --> 00:06:35,919 Speaker 2: that we should have better policing of that, and 119 00:06:35,960 --> 00:06:39,160 Speaker 2: people should have the right to protect their own intellectual property, 120 00:06:39,480 --> 00:06:43,159 Speaker 2: and also that we should have antitrust enforcement. You just 121 00:06:43,240 --> 00:06:46,200 Speaker 2: can't have, say, Google, with a ninety percent market share 122 00:06:46,200 --> 00:06:49,160 Speaker 2: on the search engine, not have any competition, and 123 00:06:49,200 --> 00:06:52,520 Speaker 2: then allow them to self-preference their own products at the top, as we see with 124 00:06:52,560 --> 00:06:55,880 Speaker 2: Amazon and other companies. 125 00:06:56,320 --> 00:06:59,360 Speaker 2: That's why the NFIB, which is not a liberal organization, 126 00:06:59,640 --> 00:07:03,960 Speaker 2: the National Federation of Independent Business, is strongly supporting the 127 00:07:04,000 --> 00:07:06,880 Speaker 2: bill that I have with Senator Grassley, which simply puts 128 00:07:06,960 --> 00:07:09,240 Speaker 2: some rules of the road in place for competition on 129 00:07:09,279 --> 00:07:11,640 Speaker 2: the Internet. All right. 130 00:07:11,720 --> 00:07:14,920 Speaker 1: Democratic Senator Amy Klobuchar of Minnesota joining us live from 131 00:07:14,920 --> 00:07:17,880 Speaker 1: Capitol Hill on Bloomberg Television and Radio. Thank you so much.