Host: The US tech giants are going to be spending a lot more time in court next year, and not just on one front but on several. For years, the country's biggest technology companies, Apple, Google, Facebook, Microsoft, and Amazon, seemed to occupy some kind of privileged position in this country, with no serious regulations touching them. But it's apparent that things are changing. Take the multiple congressional hearings last week over Russian meddling in last year's presidential campaign, where Facebook, Google, and Twitter were unaccustomed to the tough questions and even the sarcastic comments. Senator Richard Burr is chair of the Senate Intelligence Committee.

Senator Richard Burr: This kind of national security vulnerability represents an unacceptable risk, and your companies have a responsibility to reduce that vulnerability.

Host: Our guests are Gerrit De Vynck, Bloomberg News tech policy reporter, and Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law. Gerrit, in your article in this week's Bloomberg Businessweek, you look at ten different areas where tech companies will face challenges, whether in court cases or legislation. They're done alphabetically rather than in order of importance, so what is the top threat to them, in your estimation?

Gerrit De Vynck: In terms of the top threat, it's interesting to see what's been going on in the US, because we've talked about regulation of big tech companies for years in other countries, and a little bit here in the US, but this year it really started to pick up steam. You even had people talking about antitrust: concerns that Google was too big in the advertising business, that Amazon was too big in the e-commerce business. So we'll have to see. Those kinds of discussions at the political level have definitely been increasing over the year, and next year is when we'll see whether or not real regulation comes down and what kind of effect it has on the companies.

Host: Eric, to what degree does the fact that they're facing more legal challenges in the United States reflect that these big internet companies have moved from being seen as innovators and conveniences for consumers to threats to the way we like to live, or to privacy?

Eric Goldman: One way to think about it is that the government has gotten nervous about the consolidated power of the tech giants. The government is used to having the most power in our society, and the threat has emerged that maybe it doesn't anymore. So I see a lot of this as the government reacting to the strength and power consolidated in private hands. They don't like the competition.

Host: Gerrit, the Honest Ads Act, which was introduced in the US Senate in October with bipartisan support, would require internet companies to reveal who's buying political ads and archive them for review. It seems like, in this political environment, a no-brainer. So why is Congress being so slow to pass it?

Gerrit De Vynck: A lot of people in Congress are simply against disclosures of this kind, or they're hesitant. They want to be careful not to impinge on free speech; they want people to be able to buy political advertising and share political speech whenever and however they want. The main impetus behind this bill is what we saw with Russia using social networks, primarily Facebook but also Twitter and YouTube, to try to stir up debate in the US and potentially interfere with the election. So some of the people behind this bill are using it as an opportunity to say, hey, let's bring the online platforms up to the level we require of radio and television when it comes to disclosing who paid for a political ad. But others are saying, let's be careful not to make it too onerous or push it too far. And of course the companies themselves want to do it on their own. They want to self-regulate; they don't want someone telling them what to do.

Host: Eric, to what degree can we expect the companies to navigate this? Because it would seem that if not just the United States but other countries start to regulate the content on these platforms, it would be a real threat to their way of doing business.

Eric Goldman: There's no doubt that regulators could undermine everything we love about the internet, and I'd like to think that they recognize the capacity to make some really serious mistakes. On the other hand, regulators are in the business of regulation; that's what they do. So I don't know that they can control their impulses, and that makes me nervous. We have seen across the globe, including here in the United States, that the forces of censorship, the idea that we can tell people what they can and cannot say, are having a fantastic run. And I think it would be somewhat disingenuous to think, oh, that's just happening overseas. It's happening here in the US as well.

Host: Gerrit, the FCC is on track to undo the net neutrality rules by early next year. How will that affect the big tech companies?

Gerrit De Vynck: The net neutrality rules that came in under Barack Obama were reflective of the original ethos of the internet: the idea that everything should be fair and equal, and that someone who owns a network can't charge company A more for using that network than they charge company B, or individual B, or politician A, or what have you. Now we're seeing the potential of new rules that move past that and are a little more in the spirit of traditional capitalism, which says that if you own the roadways, you get to decide who goes on them and how much you charge them. The largest internet companies are so big and so powerful that they will be fine regardless of tweaks or changes to the rules. The concern from critics of Ajit Pai, the regulator changing these rules, is that smaller companies who want to compete with the Netflixes of the world will have a tougher time doing it.

Host: Eric, privacy has also been a very big deal, both for these companies and for people who are concerned about them, because they have so much information about their users, and a lot of their business models are in fact built around that information. To what degree should we expect restrictions and requirements on the way these companies handle the private information of their customers?

Eric Goldman: That's a pretty complicated question, because there are a thousand different moving fronts on which regulators might move to impose privacy restrictions. At the same time, there might also be movements to reduce privacy, such as the Honest Ads Act mentioned earlier in this conversation, which is designed to increase transparency and so, in theory, actually reduces the privacy of, quote, advertisers and what they're doing. There's no doubt, though, that there's a lot of skepticism about the amount of private information under the control of the tech giants, and I think the combination of fears about their power and the ability of that information to be used in ways that could really undercut our expectations as consumers creates a real recipe for some of those privacy regulations to succeed.
Host: In about thirty seconds, Gerrit, are we likely to see any big announcements like we hear from the EU's competition commissioner, Margrethe Vestager, or is it going to be a little quieter on that side in the US?

Gerrit De Vynck: Just quickly, on antitrust: it's unlikely that you'll see the same kind of aggressive moves here in the US that you've seen in the EU, although people are talking about it more. The thing to look for is legislation like the Honest Ads Act, and there's another one as well. You should read our story; there's a lot going on, but it's more on the legislative side.

Host: It is a great story, with lots of charts and pictures to help you understand it. That's Gerrit De Vynck, Bloomberg News tech policy reporter, and Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law. That's it for this edition of Bloomberg Law. We will see you tomorrow. This is Bloomberg Law on Bloomberg Radio.