June Grosso: Welcome to the Bloomberg Law Podcast. I'm June Grosso. Every day we bring you insight and analysis into the most important legal news of the day. You can find more episodes of the Bloomberg Law Podcast on Apple Podcasts, SoundCloud, and on bloomberg.com/podcast. The Justice Department is ratcheting up the ongoing federal investigations of Facebook's data sharing practices with a criminal subpoena. A federal grand jury in New York subpoenaed records from at least two other tech companies that had data sharing agreements with Facebook. That's according to The New York Times. Joining us is Gerrit De Vynck, Bloomberg News technology reporter, to talk about all things Facebook. First of all, Gerrit, what do you know about the Justice Department's investigation?

Gerrit De Vynck: It's out of the Eastern District. What they're looking at is, you know, there have been all these stories written in the past year or two about Facebook and how it was using our data, right? And so there has been some reporting that Facebook struck deals with other companies. These would be handset manufacturers in some cases, certain banks.
They had a deal with Spotify as well, so that if you were using one of these companies' products, you could sort of integrate it with Facebook Messenger and chat with your friends. Now, the question that is being investigated is whether those deals violated some promises that Facebook had made to the U.S. government about how it would share user data, and also whether they were telling users that sometimes private messages, private information, would be seen by other companies that they didn't know were going to be looking at that.

June Grosso: Every time this comes up, what's flashing through my mind is, well, there's a 2011 consent decree that Facebook has, and it's based on privacy. Are there other organizations that are investigating Facebook in the U.S.?

Gerrit De Vynck: Yeah, I mean, so that's the FTC consent decree, which is sort of what I was referencing about that promise they made about how they were going to use and share data. I think a lot of people in the U.S., at the regulatory level, at the political level, are interested in going after Facebook. It's a big news story.
It's something that is gaining some traction with consumers, even if, at the end of the day, most people still really like the products, or at least use them. Maybe we're addicted to them, some of us. But I think that, at the end of the day, this is a political story, and for regulators and politicians who want to take a stand on something, it is something that they will keep beating the drum on.

June Grosso: And it seems that every day they have something new to beat the drum on. Facebook is again being criticized, this time for the live streaming of that shooting in New Zealand. Tell us about that.

Gerrit De Vynck: Yeah, so, I mean, Facebook has allowed live streaming for a couple of years now. Obviously they're not the only platform. You can do it on Twitter, you can do it on YouTube, you can do it on a host of different apps. And it's a really tricky problem, right? Because Facebook and the other technology companies want to do live streaming. For them, it gets people more engaged. People are more likely to comment on live videos than they are on static ones.
But at the same time, you will have incidents like this, where someone live streams a crime, where someone live streams content that otherwise would not be allowed on the platform. And because it's live, there's no way to monitor it using the technology that is usually used to take down or block videos that otherwise would break the rules.

June Grosso: Let's talk a little bit about Mark Zuckerberg's recent thoughts, his very, very lengthy thoughts, about pivoting to privacy. Do you see anything substantial coming out of that?

Gerrit De Vynck: Yeah, I think anything that Mark Zuckerberg says you have to take seriously. I mean, it's interesting to watch all his comments, see what they actually mean, and then look back a year later and see whether things have changed or not. I'm pretty skeptical of that word "pivot," which I think has been thrown around a lot when it comes to his recent announcement. Facebook is not going to get rid of its public platforms.
It's not going to get rid of the core apps where, you know, I post a photo of my children or whatnot and can share it publicly or share it with my friends. But they do realize that a huge part of communication and social networking is becoming private and is becoming encrypted. And their WhatsApp application, which is, if not the most popular messaging tool in the world, definitely one of the top couple, is already encrypted, and so they're really leaning into that side of the business as an area of future growth.

June Grosso: They've lost one of their top executives.

Gerrit De Vynck: Yeah, so Chris Cox resigned. And the way that Facebook was talking about it is that, you know, he was someone who, a couple of years ago, was already looking at other alternatives. He'd been at Facebook pretty much since the very beginning, and then, because of all these complications they've had with privacy, with regulation, and having to deal with that, he stayed on. You know, you can probably take that with a grain of salt.
Some people could say that he's leaving because of the last couple of years of complications. And at the end of the day, I think that's probably why the stock is down. You know, cynically, it's probably not because of the live streaming and the New Zealand shooting. It's probably because someone who is very close within Mark Zuckerberg's inner circle (I think, after Zuckerberg and Sheryl Sandberg, he would be one of the top people) has now left the company.

June Grosso: You cover tech. So is there any real fear on the part of Facebook, Google, or Apple about Elizabeth Warren's calls to break them up if she becomes president, and, frankly, the calls of other people who see them as just too big?

Gerrit De Vynck: It's a really interesting story that we've been tracking very closely, and we will continue to track it. I have no idea where this is going to go, how serious these calls are in the United States, or whether these very strict antitrust regulations, these changes in how we think about antitrust in the United States, are actually going to happen.
But this is a movement, I would say, that has been brewing for a couple of years. It started, you know, with some progressive policy think tank people saying, we really need to rethink how this is going, and it has filtered into the political mainstream, most strikingly now with Elizabeth Warren very clearly stating: I will break up the companies, I will do it this way, I will separate Instagram from Facebook. And she's stating this in a way that, you know, although other politicians have talked about the need for stricter antitrust rules, they haven't gone as far as to say explicitly what they want to do. And so, you know, she is a long way off from the presidency. She has a grueling primary to get through, and then, of course, whoever does win that Democratic primary has to defeat Donald Trump. But I'm sure it will be a topic of debate during the Democratic primary.

June Grosso: A pleasure, Gerrit. It's so nice to have you. That's Gerrit De Vynck, Bloomberg News technology reporter. Thanks for listening to the Bloomberg Law Podcast.
You can subscribe and listen to the show on Apple Podcasts, SoundCloud, and on bloomberg.com/podcast. I'm June Grosso. This is Bloomberg.