June Grosso: Welcome to the Bloomberg Law Podcast. I'm June Grosso. Every day we bring you insight and analysis into the most important legal news of the day. You can find more episodes of the Bloomberg Law Podcast on Apple Podcasts, SoundCloud, and on Bloomberg.com/podcasts. You'd never guess from Facebook's repeated privacy scandals that the company has been under a consent agreement with the Federal Trade Commission over a privacy breach since 2011. But now users, lawmakers, and data security advocates are demanding a forceful government response, and that job mainly falls to the chairman of the FTC. Joining me is an expert in privacy law, Joel Reidenberg, professor at Fordham Law School. Joel, is Facebook the worst privacy offender in the world of social media, or are we just hearing more about its violations?

Joel Reidenberg: That's a great question. It's not really clear. I think Facebook probably has the largest number of users, so any large-scale violation that Facebook commits will affect more people than most other organizations.
Joel Reidenberg: I think some of the big companies like Google and Amazon are also facing issues, but we certainly hear more about Facebook because of their behavior.

June Grosso: Sources are telling Bloomberg that the FTC is expected to hit Facebook with a record fine in the current investigation involving Cambridge Analytica. What can you tell us about that investigation?

Joel Reidenberg: Well, Facebook settled with the FTC a number of years ago, that was 2011, for essentially collecting and sharing information from its users without proper consent, and it certainly seems, from what's been made publicly available, that's exactly what happened with Cambridge Analytica, along with a whole series of other scandals this year. That's not the only one. Cambridge Analytica affected about 87 million users. For the FTC to really show that it matters as a privacy enforcement agency, they're going to have to issue a quite substantial fine. And Facebook has revenue of approximately fifty billion dollars in the last year, so if it's anything like the fines that the FTC has issued thus far for privacy violations, it's chump change for Facebook.
Joel Reidenberg: I mean, this is going to have to be a fine in the nine figures for it to actually be taken seriously.

June Grosso: Do fines even work with a company like Facebook? They haven't worked in the past, so some people are calling for a privacy law. Others are calling for a breakup of Facebook. What do you think is needed?

Joel Reidenberg: Well, there's no question that we need stronger privacy laws in the United States to deal with these kinds of problems. If the fine is sufficient, it might be successful in changing a company like Facebook. But something else: you know, we just don't have the capacity right now, and the FTC doesn't, to deal with all of the different scandals that Facebook was involved in this year. I mean, you have Cambridge Analytica, you have Facebook, where it was exposed that they were tracking non-Facebook users across the web, they were tracking logged-out users across the web. They were capturing the call history of clients. They bought WhatsApp, and the co-founder this past year quit, essentially in disgust, because of their data misuse.
Joel Reidenberg: Facebook tries to capture telephone numbers, claiming they needed them for two-factor security to make your account more secure, and then it turns out they turned around and used those phone numbers for all sorts of other purposes that weren't disclosed. That's a huge problem. They've been involved in manipulating the elections in the United States, and they knew about it and denied it. We saw the press reports that Sheryl Sandberg knew of the Russian election activities, and she publicly denied it, and she privately tried to stop the investigation. You know, all of these sorts of things are occurring, I think in large part, because the tech giants have built an economic model on monetizing individuals' data essentially without their effective consent, and that's a problem. We need to be able to address that effectively in this country.

June Grosso: So what about the privacy experts who are calling for breaking up Facebook? Is that even a possibility, that the FTC may try that?

Joel Reidenberg: I guess it's a possibility, but the legal authority to do that would be antitrust, and I don't think Facebook would necessarily be the first target for an antitrust breakup order.
Joel Reidenberg: It might be part of a consent decree, but I'm not sure what you would break off from Facebook, because the problem with Facebook is that the data misuse comes from its core business, comes from the main social media activity, and that's where we're seeing the essentially non-consensual use of the data. You know, they purport to have customers' consent because somewhere, buried in the terms of service, there is something there. There are some settings that can be adjusted, but the reality is that the privacy policies are so ambiguous, there's absolutely no way a well-informed consumer can really understand how their data is being used and have a tool to say no. That just isn't there in the marketplace right now.

June Grosso: Let's just talk for a moment about FTC Chairman Joe Simons, who talked tough when he took over the agency. He's been criticized for not being tough enough. What's your take on how he's been doing?

Joel Reidenberg: Well, we haven't seen any action coming from the FTC yet. I think the jury is still out as to what he's going to do, what he's going to give Facebook.
94 00:06:14,760 --> 00:06:17,920 Speaker 1: I mean, I look at the fine that the British 95 00:06:18,080 --> 00:06:23,200 Speaker 1: Information Commissioner there their privacy regulator, issued against Google this week. 96 00:06:23,400 --> 00:06:27,200 Speaker 1: It was about fifty seven million dollars. That's chump change 97 00:06:27,240 --> 00:06:30,080 Speaker 1: for Google. That's nothing. What would be a big enough 98 00:06:30,120 --> 00:06:33,120 Speaker 1: fine for any of these companies? I think for a 99 00:06:33,160 --> 00:06:36,880 Speaker 1: company like Facebook or another one where their revenue is 100 00:06:37,360 --> 00:06:40,960 Speaker 1: in the forty fifty billion dollar range, you need to 101 00:06:41,000 --> 00:06:43,880 Speaker 1: find over a hundred million dollars for it to actually 102 00:06:44,360 --> 00:06:47,839 Speaker 1: make a difference. Well, we shall see if that happens, 103 00:06:48,080 --> 00:06:51,040 Speaker 1: because I think twenty two and a half million dollars 104 00:06:51,080 --> 00:06:54,359 Speaker 1: against Google was the previous record for the FTC. Great 105 00:06:54,400 --> 00:06:57,360 Speaker 1: to have you on again, Joel. That's Joel Ridenberg. He's 106 00:06:57,360 --> 00:07:00,320 Speaker 1: an expert in privacy law and a professor at Wardom 107 00:07:00,400 --> 00:07:05,320 Speaker 1: Law School. Thanks for listening to the Bloomberg Law Podcast. 108 00:07:05,640 --> 00:07:09,720 Speaker 1: You can subscribe and listen to the show on Apple Podcasts, SoundCloud, 109 00:07:09,800 --> 00:07:13,680 Speaker 1: and on Bloomberg dot com slash podcast. I'm June Brosso. 110 00:07:14,160 --> 00:07:15,440 Speaker 1: This is Bloomberg