Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? So, not long ago, David Meyer wrote a piece titled "Even Facebook's critics don't grasp how much trouble Meta is in," and he wrote it for Fast Company. And if you've been keeping up with Meta slash Facebook, you probably have a long list of things that Meyer could have been referring to. Could it be that various governments, such as the United States, are frequently scrutinizing Meta and calling company leaders to appear before legislative bodies to answer tough questions? Could it be the fact that TikTok continues to dominate as the social platform favored by younger people, meaning that Meta slash Facebook's user base is slowly aging out and it's not being replaced with new, younger users? Is it that the company jumped the gun in an effort to be the front runner to define whatever the heck the metaverse is going to be? Well, all of those are factors that should be matters of concern for Facebook executives and for shareholders. But what Meyer was talking about was something else, something involving privacy and the law, and a change that happened a couple of years ago that has affected everything.

So in July twenty twenty, the European Union's Court of Justice made a decision that would have enormous consequences. It concluded that an earlier data transfer process called the EU-US Privacy Shield was not sufficient to protect the private data of EU citizens, and that it would thus be struck down. It would be invalidated. This has massive repercussions for companies like Meta. Not just Meta, in fact; it has repercussions for any company that operates within the EU, or, in fact, has, you know, any kind of data transfers that exit the EU.
Speaker 1: So Mark Zuckerberg said, essentially, that unless the EU changes its stance or makes an exception for the company, platforms like Facebook and Instagram will have to pull out of the European Union. That sounds kind of like they're making a threat, right? Like somehow, you know, Zuckerberg is saying, "Hey, if you don't play by my rules, I'm taking my ball and going home." But really this is more of a plea. It's really more, "Please, please, please don't do this, because I can't do my thing if you do." So today I thought I would talk about what the Privacy Shield was, why it existed, why the EU decided it wasn't sufficient, what they're planning in its place, and what all this means for companies like Meta. To do that, we actually have to look back at the history of the EU and its stance on data privacy and security.

Now, depending on how you look at it, the EU really traces its history back to the conclusion of World War Two, but the single market that we would refer to as the European Union would not formally emerge until nineteen ninety-three. Now, around that same time, there was a growing general awareness about the Internet, in large part helped by the introduction of something new called the World Wide Web, and it would take a few years for the Web and the Internet at large to really gain a foothold in the minds of the mainstream public. But some leaders in the EU were already dealing with concepts like data privacy. Data privacy doesn't just involve, you know, digital transfers, right? Like, you don't have to have that be part of the process for data privacy to be a concern. And in fact, the countries that made up the European Union had already been concerned about protecting EU citizen privacy when dealing with companies that existed outside the European Union. How can you guarantee that their private data remains safe when it's going into the hands of companies that aren't based in the European Union itself?
Speaker 1: That had already been a concern, but the EU member states knew that laws needed to be put in place that could protect citizen data, that there are fundamental rights associated with data that have to be protected. To that end, the EU built upon an earlier, non-binding list of guiding principles relating to protecting citizen information. These principles included pretty common stuff, like alerting someone as to when their data would be collected, or requesting consent before you disclose that information to some other party. So if you were to collect an EU user's information, you would then have to get their consent before you could share it with someone else. And various other concepts that are pretty common to what we see in privacy protection laws. They had been around before the rise of the Internet, but because they were non-binding, they didn't really have any teeth to them. It was like, it would be nice if everyone agreed to obey these things, but there was no requirement to do so.

The EU decided to formally establish data privacy rules, though these would have limitations, which we'll talk about. These rules became known as the Data Protection Directive, or DPD. This directive set out the parameters for when and how entities would be allowed to collect European Union citizen information and how they could use it. Specifically, you know, how they would be allowed to use it if it required a transfer outside the EU, and also how they were to alert citizens of things like collecting their data. Each member state of the EU was responsible for establishing a supervisory department to make sure that all parties were complying with this directive, and the directive stated that the only time data could be shared with countries outside the European Union is when those countries could adequately protect the data's security.
Speaker 1: So if a country or company was unable to do that, then by this directive, it would not be allowed to transfer information outside the EU. Now, right away these rules created challenges both within and without the EU, because when you really break it all down, all traffic on the Internet is information, and a lot of that information ends up including personal identification information, or at least personally identifiable information. So you might argue that personal information should only include stuff like, you know, legal information like a person's name, or their address, or their birth date, or maybe the hospital where they were born, that kind of stuff. You know, information that relates directly to that individual, and when you take this information as a whole, it's more or less unique to that person. I have to say "more or less" simply because, you know, weird stuff. Anyway, that kind of information is absolutely important. It is worthy of being protected, and it's very easy to define, right? You could say this information directly corresponds to this individual, therefore we need to protect it.

But then there's also other information that, while not specifically about a particular individual, could collectively identify that person all the same. So an IP address could be part of that. You might argue that's personal information, or you might argue, well, IP addresses aren't fully reliable, because you could use something like a VPN, which would hide your IP address, so you can't just rely on that to identify a person. However, it falls into this gray area. But then there's stuff like the person's browsing behaviors, you know, what they like, what they don't like, how long they stay on a page. All of these things can actually start to create a digital fingerprint that points to a specific person. And it sounds wild, but it really doesn't take that many points of data to narrow down folks and figure out who created those data points.
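To make that concrete, here's a minimal sketch of the arithmetic behind that kind of fingerprinting. The population fractions below are made up for illustration, and treating the attributes as independent is a simplification; the point is just how quickly the pool of matching people shrinks as data points stack up.

```python
# A minimal sketch of why a few data points can narrow down a person.
# The fractions are invented for illustration; real fingerprinting
# studies measure these distributions empirically.

population = 450_000_000  # roughly the EU's population

# Hypothetical share of users matching each observed attribute value.
attributes = {
    "browser and version": 0.05,
    "operating system": 0.20,
    "screen resolution": 0.10,
    "time zone": 0.15,
    "language setting": 0.25,
}

# Assuming (unrealistically) independent attributes, each observation
# shrinks the set of candidates multiplicatively.
candidates = float(population)
for name, fraction in attributes.items():
    candidates *= fraction
    print(f"after observing {name}: ~{candidates:,.0f} people still match")
```

Five made-up data points take a pool of hundreds of millions down to under twenty thousand, and layering browsing behavior on top narrows it much further, which is why this "gray area" data worries regulators as much as names and addresses do.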
Speaker 1: In the old days, doing that would have been tough, simply because you're talking about a lot of data being generated, and then trying to suss out what is signal amid all the noise, you know, to actually analyze that information to get something useful out of it. It was a time-consuming process, and, you know, when you look at it from a return-on-investment standpoint, in the old days it just didn't make sense, right? Unless you were going after someone specific for nefarious purposes, you wouldn't do that for just anybody, because it was too much effort. However, we have gotten a lot better at analyzing enormous data sets in a short amount of time using things like artificial intelligence and machine learning and various algorithms, so this has become less of an obstacle. It's not like science fiction level yet, but it's pretty darn close. So now some of the technical restrictions that meant we didn't have to worry about this so much in the past aren't really a thing anymore.

Anyway, the EU's directive meant that the United States, the country, you know, where the Internet got its start, would need to figure out a way to comply with this set of rules if it wanted to allow information to pass between the US and the EU. Because for a lot of these companies, their servers all exist within the United States, so by the nature of their business, any information that would be coming from the European Union would have to go across the Atlantic to a server in the US. To that end, some EU officials began to piece together what would become known as the International Safe Harbor Privacy Principles. Now we're going to take a quick break, but when we come back, I'll talk a bit about Safe Harbor, what it was meant to do, and why it no longer is a thing. But first, these messages.

Speaker 1: The International Safe Harbor Privacy Principles. What the heck was this? Well, it was a program that US companies could apply to join.
The companies would apply for certification, and that certification essentially said these companies are taking the necessary steps to protect user data, so they can be considered to be compliant with the Data Protection Directive that the EU had obviously passed. So ultimately, the goal here was to prevent the accidental disclosure of EU citizen private information that happened to be stored on servers within the United States, so it's outside the EU's control. This was the system by which companies would guarantee they would make sure that data would remain safe. The Safe Harbor system became effective in two thousand. It took several years for it to formalize and then to be enacted, and US companies that received certification under Safe Harbor and then registered with the EU would be allowed to operate things that would transfer data between the US and EU without much trouble. Oh, and in order to qualify, those companies would also have to be companies that were regulated by the United States FTC, the Federal Trade Commission, or the Department of Transportation. Those were the only companies that could qualify for Safe Harbor. Anything that didn't fall into those categories was excluded, and that actually cuts out a lot of businesses, believe it or not.

Now, something that I'm sure will not surprise many of you out there is that various reviews that were done on this system showed that a lot of the participating US companies were not complying with the program, at least not to the extent that they should have. Companies were found to be reluctant to actually enforce the principles defined by the Safe Harbor program, and questions arose as to whether or not the industry could really be self-regulating. Like, can we trust these companies to regulate themselves? And of course we can't. All right, so, quick side rant, but this applies directly to the topic. So, the currency of the modern world isn't Bitcoin. It's not any other cryptocurrency, because it goes a level deeper than that.
The currency of the modern world is information. Data is valuable. Your data is valuable. If it weren't, companies like Meta slash Facebook or Google wouldn't even exist. These companies depend upon us generating information, which the companies can then leverage in various ways. Now, an obvious way they do this is through advertising, specifically targeted advertising. You know, by analyzing the information I generate, a platform like Facebook or Google can suss out what matters to me, and then pepper my experience with ads that are more likely to get my attention and my action. That is money right there. That is incredibly valuable to these platforms. It's incredibly valuable to the advertisers and to their clients. So my information does have value. Yours does too.

But even beyond targeted advertising, this information has incredible value. Through real-time analysis of browsing data across millions or hundreds of millions of users, platforms can detect and respond to trends before anyone is even aware that there is a trend there. So I think back to the description of chaos theory that says, imagine the flap of a butterfly's wings in South America setting into motion the variables that are necessary to generate a typhoon that hits Southeast Asia. The idea is that without that one instigating event, the variables are not in the right place to make that happen. Well, think for a moment about how many people use platforms like Google or Amazon or Facebook. Individually, that user's data is valuable, right? But collectively, across all users, it can drive corporate strategy. So there should be absolutely no surprise that companies are eager to exploit information, personal information. It's key to their business model and their success. Which is why it's also not a big surprise that a lot of companies were slacking off when it came to self-regulation and complying with the principles of Safe Harbor.
If the companies could get away with it, if they could operate without having to actually worry about complying with these rules, then they'd do it. And I'm sure there was no shortage of companies that weren't being outright nefarious or flouting the law or anything like that, but were falling short of holding up their end of the bargain, you know, because it's also hard to do. It's hard to pull off and still do business in a way that is cost effective, right? In order to comply with these rules, you do have to spend some money; honestly, that's what it really comes down to. It might not be money money, it might be more assets and resources or time or whatever, but it's ultimately a cost. Whatever the reason, it was clear that this particular approach to protecting information wasn't sufficient if the EU actually wanted to keep EU citizen information secure on servers that weren't even in the European Union.

All right, flash forward to two thousand and twelve. The EU decided it needed to take another stab at creating a unified data protection law to replace the Data Protection Directive. The Directive had ultimately been too loosey-goosey, and that meant that different member nations had different principles and enforcement strategies. It was too piecemeal, and it wasn't unified the way the European Union needed it to be. So this new law would resolve the various differences between the different implementations in the member states of the EU and create a more coherent policy that was EU-wide and would protect citizen data privacy. That took four years to actually formalize, but on April fourteenth, twenty sixteen, the EU approved the new set of rules, called the General Data Protection Regulation, or GDPR, and this became a truly huge deal for any company outside the EU that wanted to do business inside the EU, particularly for Internet-based companies. The rules covered any entity that processed or transmitted data from within the EU to somewhere else.
A whole bunch of other stuff was in those rules too, but I've done episodes about GDPR in the past, so we're just gonna say this was a more broad, sweeping, and yet unified approach to data privacy, and it created big old headaches for companies around the world to ensure that they were compliant with GDPR. In fact, to this day, that's still a big thing. Ultimately, that's at the heart of the Meta problem we were talking about. It was GDPR that would necessitate things like a pop-up message that would alert users to a site's reliance on web cookies, for example, because that's a type of tracking. It would also require foreign services to expressly ask for the consent of users in order to collect their data. And, you know, companies tried to find different creative ways to get around that, to maximize the number of people who had quote unquote agreed to this, by making it a difficult thing to opt out of. That doesn't fly very well under the GDPR. There are a lot of regulatory agencies that pounce on that kind of practice. They're also supposed to explain how information is going to be used, and to give people the opportunity to opt out of any data collection and that kind of thing. So the GDPR replaced the Data Protection Directive and became enforceable in twenty eighteen.

All right. Now, in the meantime, while that was happening, the Safe Harbor Principles, which, remember, was a framework that companies could follow in order to be considered safe under the EU's data protection rules, had already been invalidated by the EU in twenty fifteen. They said, well, you know, the Data Protection Directive is not sufficient, and Safe Harbor, which was designed to work within the Data Protection Directive, that, by extension, is not sufficient, so it doesn't apply anymore. It was not robust enough to satisfy the requirements of the upcoming GDPR.
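Before we get to what replaced it, a quick aside on that consent requirement, since it's the piece of GDPR developers bump into most directly. Here's a minimal sketch of consent-gated data collection; the names and structure are invented for illustration rather than taken from any real consent-management API.

```python
# A minimal sketch of GDPR-style consent gating. Everything here
# (User, ConsentRecord, track_page_view) is hypothetical.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    purpose: str          # e.g. "analytics_cookies"
    granted: bool
    timestamp: datetime   # regulators expect you to be able to prove consent

@dataclass
class User:
    user_id: str
    consents: dict = field(default_factory=dict)

    def record_consent(self, purpose: str, granted: bool) -> None:
        # Consent must be an affirmative act: the default is "no",
        # never a pre-ticked box or a buried opt-out.
        self.consents[purpose] = ConsentRecord(
            purpose, granted, datetime.now(timezone.utc)
        )

    def has_consent(self, purpose: str) -> bool:
        record = self.consents.get(purpose)
        return record is not None and record.granted

def track_page_view(user: User, page: str) -> None:
    # Collection happens only after an explicit opt-in for this purpose.
    if not user.has_consent("analytics_cookies"):
        return  # no consent recorded, so nothing is collected
    print(f"logging visit to {page} for {user.user_id}")

visitor = User("abc123")
track_page_view(visitor, "/home")                  # silently dropped
visitor.record_consent("analytics_cookies", True)  # explicit opt-in
track_page_view(visitor, "/home")                  # now logged
```

The design point is that the "collect nothing" path is the default, which is the opposite of the dark-pattern flows regulators keep pouncing on.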
So the European Commission and the United States government negotiated a new political agreement to codify rules on how commercial transatlantic exchanges of personal information from EU citizens to US servers could actually happen. Those rules would become known as the EU-US Privacy Shield. Like the Safe Harbor Principles, this was really meant to create a framework in which companies could operate legally within the European Union. US companies that gather user data would have to comply with this set of rules in order to make services available to citizens in the EU; otherwise they would be violating privacy law in Europe. Like the previous system, the Privacy Shield included guiding principles that all organizations were expected to follow. While it beefed up some of the protections incorporated into the previous system, critics were worried that there were still some big gaps in the Privacy Shield process, and that ultimately it would get challenged and struck down by the EU's Court of Justice. And those concerns likely went into overdrive in twenty seventeen, when then-President Donald Trump signed an executive order that denied US privacy protections to anyone who was not a US citizen or resident. So, in other words, according to that executive order, US companies would not be held accountable for guaranteeing data privacy and security for any non-US citizens or residents. Considering that GDPR demands that any entity that transfers EU citizen data overseas must protect that information, that was a problem. By the way, Joe Biden would later rescind that executive order in twenty twenty-one, but by then things had already changed in Europe. So we're going to talk about those changes, and how Privacy Shield would follow in the footsteps of Safe Harbor and get invalidated, after we come back from these messages.

Speaker 1: So, as I was alluding to before the break, the critics of the Privacy Shield process who said this was not going to be seen as sufficient were absolutely right.
The EU's Court of Justice reviewed the Privacy Shield policy in twenty twenty and determined that it was not enough to protect EU citizens' private data, and struck it down. Specifically, there were concerns that the US government would be able to conduct surveillance on EU citizen data, and that under EU law that was a violation of the rights and freedoms of EU citizens. So there was a need to formulate yet another data privacy framework that would address this issue, and that's kind of where we are now. See, without a framework, it becomes very difficult to do business in the European Union. The framework, you know, it smooths things out, it speeds things up, because it's one system that companies in, say, well, specifically the United States, can go through in order to qualify to do business in the EU and be considered compliant with the rules of GDPR. So this new framework is still taking shape. It doesn't exist yet; it is in the process of existing. And it will take even longer for the EU to formalize and adopt and enforce that rule once it is finished. In the meantime, we're in an era where things are really unstable.

Now, one way companies have managed to continue to operate in the absence of a formal framework is to file what are called standard contractual clauses, or SCCs, with the EU. You can think of this as essentially being a legal agreement, and this legal agreement provides a guarantee that the non-EU company is taking pains to conform to GDPR requirements. So it's essentially saying, you know, we're obeying the rules. Securing SCCs can be time consuming, and it isn't a smooth process, at least not as smooth as being able to just apply to a framework like Privacy Shield or Safe Harbor, so it can be a bit of a headache. And now let's talk about Ireland and its Data Protection Commission, or DPC, because this relates directly to the Meta story.
The DPC determined back in twenty twenty that two of Meta's platforms, namely Facebook and Instagram, relied on a data controller that could not provide a guarantee that data from Irish citizens would be protected from US government surveillance, and so, by extension, that would violate data privacy laws in the EU. That would also mean that Meta would not qualify for an SCC, at least in terms of Facebook and Instagram. WhatsApp, a totally different platform, uses a completely different data controller and is not part of this; like, WhatsApp can operate in the EU fine, because it is not subject to the same vulnerabilities that Facebook and Instagram are. Then, last month, which for those listening in the future would be July of twenty twenty-two, the DPC, this regulatory agency in Ireland, filed an updated draft order to shut down Instagram and Facebook services in the EU, and filed that with other regulators within the EU. So the other member states that have regulatory agencies, they all received a filing of this updated draft decision. While the contents of that order weren't made entirely public, it did become clear that the DPC was telling other regulators that they should halt Facebook and Instagram's ability to transfer EU citizen data to the US, because it could not guarantee safety against US surveillance. This would effectively shut down Facebook and Instagram within the European Union and for EU citizens.

So let's get into some complicated political stuff. Now, under Article sixty of the GDPR, the rest of the EU's data protection agencies have four weeks from that filing to comment on the DPC's conclusion. Those four weeks are up this week, by the way. So if after four weeks there are no objections to the DPC's decision, which is, again, to essentially shutter Facebook and Instagram within the EU, that decision then becomes binding. Yowza.
Now, if there are objections, which, you know, likely there are some, then the DPC, the Irish regulatory agency, has two weeks to respond and address any objections. Or, alternatively, they can choose not to change anything and just submit their decision to the European Data Protection Board, or EDPB. This is like the overall regulatory agency, the agency of regulatory agencies, and the EDPB would then decide whether or not the decision should apply across the European Union. The EDPB, which by the way is hard to say quickly, would have one month to make that decision, though it could also request a month's extension if the board determined that the matter is complicated enough to warrant more consideration. So, two months maximum to decide on this matter. At that point, after a month, or two months if it's extended, the board would go to a vote. If the vote passes in either direction by a two-thirds majority, then that's the decision. So you have to have a two-thirds majority for there to be a clear decision on the matter. If it doesn't get two thirds, then the whole thing is given another two weeks of debate, and then it goes to another vote, and this one just requires a simple majority. So it does get bureaucratically complicated if, like, all of this plays out.

Now, will that happen? That's hard to say. Let's take a few different scenarios in turn. So, Ireland's DPC filed the decision in early July; it's already been four weeks. So if no other data protection agency in the EU has objected to the DPC's conclusion, boom, the decision becomes binding, and we'll know really soon. If some data protection agency objected, well, then that adds another two weeks for the DPC to respond, at which point, if there are no other objections, boom, decision becomes binding. Or the DPC might submit this decision to the overall agency, the EDPB. And boy, these initialisms are really getting clunky.
And the EDPB would have at least one month, at most two, to come to a vote on the matter. If the vote fails to gain a two-thirds majority in either direction, then again, another two weeks, and then it goes to another vote with majority rules. Meta has indicated that it might have to shut down its Facebook and Instagram services in the EU anyway, at least until the new framework takes effect. The new framework is called the Transatlantic Data Privacy Framework, and even then it's uncertain, because, after all, the EU has already determined that two preceding frameworks, Safe Harbor and Privacy Shield, that were meant to be in compliance with EU law, were lacking, and both of those got struck down. So there's no guarantee that the same thing would not happen yet again. This raises the question of whether it's even possible for a company like Meta to operate these services in the EU, at least the way it has been doing, without a massive overhaul in its data handling services.

Maybe if Meta were to establish EU-centric servers that were separate from everything else, so it was not sending EU data to any place outside the European Union, like an EU-specific version of Facebook and an EU-specific version of Instagram, maybe then it would be fine. But that would be kind of ridiculous. Also, I have a feeling that a lot of users would be upset that they wouldn't be able to access or interact with stuff outside the EU. Or, if Meta were able to guarantee that, you know, the systems that are handling data from the EU to the US were in fact protected against US surveillance, then maybe it would be all right. But it can't, at least not now. Now, it's possible that this new framework, once enacted, would allow Meta to continue operating Facebook and Instagram within the EU through some sort of exception, though again, there's no guarantee that this framework will withstand court scrutiny over time.
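Since those stacked Article sixty deadlines are easy to lose track of, here's a small sketch that just adds them up, taking the longest path at each step. The step lengths come from the process described above; the early-July filing date is approximate, and treating the extended deliberation as sixty days is my own simplification.

```python
# A small sketch adding up the worst-case Article 60 timeline described
# above. Step lengths are from the episode; the filing date is approximate.

from datetime import date, timedelta

steps = [
    ("objection window for the other data protection agencies", timedelta(weeks=4)),
    ("DPC responds to any objections", timedelta(weeks=2)),
    ("EDPB deliberation, one month plus a one-month extension", timedelta(days=60)),
    ("no two-thirds majority: two more weeks, then a simple-majority vote",
     timedelta(weeks=2)),
]

when = date(2022, 7, 7)  # approximate early-July 2022 filing date
print(f"draft decision filed: {when}")
for label, length in steps:
    when += length
    print(f"{label}: until {when}")
```

So even in the slowest scenario, the question of whether Facebook and Instagram can keep moving EU data to the US gets a binding answer within roughly four months of the filing.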
This is the situation that Meyer referred to in that article in Fast Company: that Meta may have no choice but to stop offering Facebook and Instagram services to EU citizens. Meyer also quotes a Facebook investor named Roger McNamee, who snarkily said that this could be a real disaster for Meta, because users would soon figure out that they're much better off without access to those platforms. I happen to agree with McNamee. As someone who got off of Instagram and Facebook, I feel like I'm better off for doing that, better off in the larger sense. I do miss being able to interact with my friends in a concentrated, easy way. It does take a little more effort, and you quickly figure out which friends decide you're worth that effort.

All right. Obviously, the loss of a market the size of the European Union would be a huge blow to Meta, a company that's already dealing with other crises. It's still too early to say if that's definitely gonna happen. But no matter what the outcome, this ongoing struggle to find ways for non-EU companies to comply with EU privacy laws is going to be an enormous challenge. It has been, and it will continue to be. EU regulators and politicians are exceedingly wary about the sincerity of US companies when it comes to their claims of protecting information, and for good reason. There's lots of evidence to suggest that we should be suspicious of those kinds of claims. And while I have focused on Meta in this episode, the truth is those requirements apply to all non-EU companies. I've been really focusing on the US here, but that applies to anything outside the EU. So doing business in the Internet age, and doing business within the EU, is going to require regular investment to assure the EU that companies are playing by the rules. And that's just gonna be difficult.
You have entire companies that exist as consulting firms to help other companies make sure that they are complying with the rules, because the cost of business, if you're found by some regulatory agency in the EU to have fallen short, is enormous. That's what Meta is going through now. I don't know if Facebook and Instagram are or are not long for this world in the EU. We will have to keep our eyes on it. It wouldn't surprise me if we see politicians struggle to make sure that there remains access within the EU for these platforms. They're incredibly popular, and they're important for things like small businesses within the EU. But, you know, you have the regulators and then you have the politicians, and politicians move slowly when it comes to creating these policies, which sometimes get overturned later on; regulators move way faster. So it may be that we'll see interruptions in service, perhaps with a return. I mean, you would have to imagine that Meta would want to return, even if its business is curtailed for, you know, some indeterminate length of time, because you don't want to leave money on the table.

Anyway, I thought that that was an interesting topic. It relates heavily to technology, because ultimately it is very hard to guarantee data security. It's hard to do because often you come up with ways that data is really valuable and you want to use it, and sometimes that breaks the rules. Or sometimes it's just hard, because creating any secure system is incredibly difficult. If someone's really determined to get access to a secure system, often they can find a way. So, yeah, a difficult, difficult challenge. And, you know, the European Union has created laws that in many ways have made it difficult to innovate in certain ways and also comply with those laws. That's not necessarily a bad thing, you know.
It may be that whatever the innovation was isn't worth the trade-off in privacy and security. But it also means that it creates this extra hurdle that innovators and companies and all sorts of people have to get over in order to make their vision a reality. Yeah, it's a balancing act. Well, that's it for this episode. If you have some suggestions for future episodes of TechStuff, please reach out to me and let me know. One way to do that is to download the iHeartRadio app. It is free to download. You can then search for TechStuff and navigate over to the podcast page. There's a little microphone icon there; if you click on that, you can leave a voice message up to thirty seconds in length. Let me know what you would like to hear. Or, if you prefer, you can reach out on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.