Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks and I love all things tech, and today we're going to tackle a fairly topical subject, something that really came into play in the spring of two thousand eighteen. Chances are you have received an email or two, or two dozen or more, from various companies about new policies that relate to GDPR. Often they will ask for your permission to continue to communicate with you. So what's it all about? Well, GDPR stands for General Data Protection Regulation, and it's a data protection law, as the name suggests, and it's from the European Union, or EU. But the Internet, as it turns out, is a mobile entity, so even if you do not live in the EU, you will likely be affected by this new law. In this episode, I'm going to go through the history of the law, what the law is actually all about, and how companies are doing as far as complying with that law. And here's a hint: there are some companies that are not even close to compliance. But we'll get to that. First, let's look back to nineteen ninety-five. That's when the European Union adopted the Data Protection Directive, or DPD. It was a different world back then; the World Wide Web was still a baby. Heck, I was still in college in ninety-five. The heart of the Data Protection Directive was an effort to protect the privacy of citizens in the EU, and the EU as a whole has placed a high value on privacy, something that has been treated with, uh, let's say, a more casual demeanor here in the United States, except in cases where something has gone terribly, terribly wrong. The directive specifically covered how data can be processed and in what context it might be processed within the European Union.
Speaker 1: It didn't matter if the data was collected manually or automatically, and it didn't matter if there was a human in charge of it or if it was an algorithm. The rules were a broad overview, leaving the specifics up to the member countries to actually adopt those rules and incorporate them into their own laws. But some of the general tenets included that personal data could only be quote collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes end quote. Further, only the data needed for those purposes should be collected. There should not be a case of an entity collecting practically everything if that entity's stated purpose is to run a process on just a narrow scope of all that data. This might remind you of the old days of Facebook apps, or, uh, add-ons. You know, those little things that you could attach to your Facebook profile, and they would ask you for permission to view certain parts of your information. Well, in the old Wild West days, that could be anything. It could be absolutely everything, even though the app itself may only use a tiny bit of information at any given time for whatever the app was supposed to do. Well, eventually Facebook cracked down on that and said, you know what, you should only ask permission to get access to the data you need to do whatever it is that you do, and otherwise you should leave everything else alone. That's kind of what was going on back in nineteen ninety-five with this directive. Further, the data was meant to be as accurate as possible, and if there were any indications that the information was inaccurate or out of date, it would be quote erased or rectified end quote. And finally, the data would have to be kept in such a way that the identity of the individuals involved would only be knowable for as long as it was necessary to run the process.
Speaker 1: Once the entity has done whatever it needed to do with all that information, it was supposed to anonymize the data so that there would be no way of knowing who it pertained to. So once you had finished running whatever the process was, you had to make sure that the information could no longer be traced back to the people who gave it to you. In addition, the directive required entities to obtain user consent before collecting their information in the first place, and that consent had to be unambiguous. The data collector was also under the obligation of providing individuals with information about who was ultimately getting the data and to what purpose, as well as providing an opportunity for the individual to review the data for any potential errors. That way, you, as the person involved, could say, well, let me take a look at what you've gathered and make sure that you don't have any information that is inaccurate or out of date. Now, already it might sound to you like this directive might have been a challenge to implement, for a lot of reasons. In two thousand eleven, the European Data Protection Supervisor published an opinion titled quote A Comprehensive Approach on Personal Data Protection in the European Union end quote as sort of an update to this policy. By two thousand eleven, the Internet was much more mature than it had been back in nineteen ninety-five, at least in the sense that there were a lot more people and businesses using it. There's still no shortage of immature content on the Internet. Anyway, by two thousand eleven, e-commerce was a really big deal, and Internet access was increasingly being viewed as a right. But that also brought with it threats to privacy. Many of the Internet-connected services we enjoy are constantly collecting data on us, either gathering information about us in particular or tracking our behaviors over time, and that data is kind of like currency.
Speaker 1: It's got value to it. So you and I might enjoy a service while simultaneously supplying information to the service provider, which in turn could potentially sell that information off to other buyers. This applies to everything from apps to social media networks to smart devices. In January two thousand twelve, the European Commission proposed a reform of the data protection rules that came from nineteen ninety-five in order to better represent the new digital landscape and protect citizens' privacy while simultaneously supporting the digital economy, which is a really tough job. You've gotta balance a lot of stuff that way. It was, in fact, so difficult that it would take four years just to draft the new rules. In twenty fourteen, the European Parliament voted on adopting a new set of rules, though those rules were not yet actually written. This was just the Parliament saying, yeah, I think it's time for us to actually have new rules regarding this. Six hundred twenty-one votes were in favor of developing new rules, only ten votes were against it, and there were twenty-two abstentions. The various governing bodies of the EU reached an agreement on the GDPR rules on December fifteenth, two thousand fifteen. In April two thousand sixteen, the EU officially adopted the GDPR set of rules, but they would not be enacted for another two years. In other words, they said, hey, everybody, you have two years to get your act together. Here are the rules that you need to abide by. Go. You've got two years to get there. This decision also, by the way, repealed that previous nineteen ninety-five directive. It said, this is not in addition to that directive; this replaces it entirely. As of the spring of two thousand eighteen, the GDPR's provisions became directly applicable in all member states of the EU, and late in the spring, the EU published a corrigendum to the regulation. That means the EU published a list of corrections and clarifications relating to the law.
Speaker 1: And in the interest of full disclosure, I have to admit I needed to look up the word corrigendum, because I don't think I've ever seen it before. One of the biggest differences between the GDPR and the earlier Data Protection Directive is in its binding nature. The DPD, as I said, was a set of policies that had to be transposed into the national law of each of the members of the EU, which created a somewhat fragmented and messy set of policies. The GDPR, however, is different. It has direct legal effect on all EU member states, with no need for the policies to be incorporated into those nations' laws. So let's start talking about the specifics in the law. What does it cover, and in what ways might a person's data still be used without their knowledge or consent? Much of the regulation affirmed the earlier Data Protection Directive, but here are some of the key points. First, I'm going to look at the opening statements of the GDPR. The EU has identified the protection of persons in relation to the processing of personal data as a fundamental right, and that right includes the right of protection of personal data. At the same time, there's a need to allow for the free flow of data between member states of the European Union, so any regulations in place must not create obstacles between different member states. Citizens of the EU are allowed to move freely through the EU, taking jobs in different member states, so their data should also be free to move through the EU with their consent. This next bit is pretty important, so I'm going to quote it directly. Quote: The processing of personal data should be designed to serve mankind. The right to the protection of personal data is not an absolute right. It must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality.
Speaker 1: The regulation respects all fundamental rights and observes the freedoms and principles recognized in the Charter as enshrined in the Treaties, in particular the respect for private and family life, home and communications, the protection of personal data, freedom of thought, conscience and religion, freedom of expression and information, freedom to conduct a business, the right to an effective remedy and to a fair trial, and cultural, religious and linguistic diversity. End quote. So, in other words, they were saying, yes, this is a fundamental right, but it does not take precedence over other fundamental rights. There will come times when you have to take various things into consideration, and you can't just say that privacy is the most important element in this particular scenario. You have to consider all the different parts and weigh them against each other. But what do we mean when we say processing data? Well, essentially, it means any sort of operation on information, whether performed automatically or manually. That includes collecting data, recording data, structuring it in different ways, organizing it, altering it or adapting it, consulting the data, using it in some way, transmitting the data, or even erasing the data. All of that is considered processing. So essentially, if you touch that data, you're processing it in some way. The document goes on to acknowledge that it is increasingly difficult and complicated to protect personal data in today's world. Globalization and the rapid exchange of information, coupled with platforms that encourage people to share their personal data, either explicitly or otherwise, have made it pretty hard to regulate. GDPR also identifies two major categories of parties in addition to EU citizens. These are the data controllers and the data processors. The controllers are the entities that determine why and how data will be used, and the processors are the entities that actually carry out those operations on behalf of a controller.
Speaker 1: One single company can be both a controller and a processor, or they could partner with other companies. All right, those are the basics. When we come back, I'll get into more specifics with the GDPR, but first let's take a quick break to thank our sponsor. One tricky thing in the policy is that it covers all entities that process data that belongs to EU citizens, even if those entities themselves are not in the EU. So, for example, let's say I have set up a new social networking platform and I'm calling it StrickBook. So I've got StrickBook, and I've built a data center in my garage here in the United States, but there are people in the European Union that have made accounts on my platform. And let's say that I make money by dealing in data to parties that want that information. So I gather information from my users and I sell it to various entities that want access to it. Maybe they want to market to my users, direct some advertising to them. Well, I would be subject to the regulations of GDPR, because there would be EU citizens using my service even though my service is located in the United States. So as long as those EU citizens were using my services while they were in the EU, I would have to play by this policy's rules. From the EUGDPR.org site, quote: The GDPR will also apply to the processing of personal data of data subjects in the EU by a controller or processor not established in the EU, where the activities relate to offering goods or services to EU citizens, irrespective of whether payment is required, and the monitoring of behavior that takes place within the EU. Non-EU businesses processing the data of EU citizens will also have to appoint a representative in the EU. End quote.
Speaker 1: So essentially, what that's saying is, if you want to use the data that our citizens create, whether it's to market to them or you're tracking their information for some other purposes, you've got to play by our rules. It doesn't matter that you don't have your operations here in the European Union. The introduction also explains that the policy does not protect in all cases. For example, it says this regulation does not apply to issues of protection of fundamental rights and freedoms or the free flow of personal data related to activities which fall outside the scope of Union law, such as activities concerning national security. This regulation does not apply to the processing of personal data by the Member States when carrying out activities in relation to the Common Foreign and Security Policy of the Union. Likewise, in the case of law enforcement conducting an investigation, the policy does not protect personal data. The policy does, however, point towards other regulations in the EU that govern how law enforcement can access personal information, because it's not just willy-nilly; they have to go through the proper procedures. However, there are obviously cases where a person's private data may become an important element in some state-level or law-enforcement-level, uh, process, and in those cases this does not protect them. You can't, as a citizen, say, no, police, you can't look at my personal data as part of this investigation, if they lawfully obtained it by going through all the right processes. That kind of objection would not be allowable under GDPR. Another limitation of the GDPR is that it applies only to information for a person who is identified or is identifiable by that information, which includes pseudonymization. That means the data is quasi-anonymous: if you were presented with the information, you might not be able to immediately identify who that information pertained to, but with additional information you would be able to identify the person.
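To make that distinction concrete, here's a minimal sketch of pseudonymization in Python, using only the standard library; the record fields and the secret key are hypothetical, not anything prescribed by the regulation. The pseudonymized record on its own doesn't name anybody, but whoever holds the key, the "additional information," can recompute the same token for a known name and re-identify the subject.

    import hashlib
    import hmac

    # Hypothetical server-side key: the "additional information" that
    # makes re-identification possible for whoever holds it.
    SECRET_KEY = b"hypothetical-secret-key"

    def pseudonymize(record: dict) -> dict:
        """Swap the direct identifier for a keyed hash. The result is
        pseudonymous, not anonymous: the key holder can recompute the
        same token from a known name and link the record back."""
        token = hmac.new(SECRET_KEY, record["name"].encode(), hashlib.sha256)
        out = dict(record)
        out["name"] = token.hexdigest()[:16]
        return out

    record = {"name": "Alice Example", "zip": "30305", "birth_date": "1975-04-01"}
    print(pseudonymize(record))  # the name is now an opaque token

Notice that the zip code and the birth date come through untouched, which matters for what comes next.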
Speaker 1: So this is pretty important stuff, it turns out, because not that much information is actually needed to identify a person. For example, here in the United States, there was a Harvard professor named Latanya Sweeney who conducted a study a few years ago, and she discovered that all she needed was a zip code, a gender, and a birth date to identify up to eighty-seven percent of all people in the United States. That's it. Three data points. Those three pieces of data were all that would be needed in order for you to say that those pieces of information specifically refer to this person, and it worked eighty-seven percent of the time for US adults. It doesn't take much to single someone out. That is why the pseudonymization term is used. It's pseudo-anonymous. The policy goes a little bit further with this, stating that if it were to take an unreasonable amount of effort or money to ascertain the identity of a person based on this limited information, it's probably okay, because it's unlikely anyone would actually go to those lengths. But the easier it is to identify a person based on the data, the more it falls under the protection of GDPR. But then, what about anonymous data? What about data that really seems to have no connection with any particular individual? The GDPR does not protect truly anonymous data. If there's no way to identify a single person out of a collection of anonymized data for statistical or research purposes, that's fine. So if you were doing an academic study that took demographics into account, and the population size you were looking at was sufficiently large to ensure no respondent could be identified from the information, you'd be all set. But if you're working with a really small population size, anyone who is an outlier would be easily identifiable, and that would therefore fall under GDPR, because it's not truly anonymous data.
Speaker 1: But if you're working with really big data sets, then you will have enough outliers to kind of make sure that anonymity is preserved. So again, it's all a spectrum. The GDPR also does not apply if you happen to be dead, but then at that point you're probably past caring about your personal data. Also, it's hard to have personal data if you, you know, if you're no longer a person. The GDPR also says that any party that intends to collect and process data must be transparent in its policies. So those policies have to be easy to find, and they have to be written in such a way as to be easily understood. You aren't supposed to obfuscate your intentions with unnecessarily complicated jargon or legalese. This includes not just how data is collected, but to what purpose that data will be put. So if a company wants to collect information in order to sell it to other parties, it would have to disclose that in its policy, and do so in a way that is pretty transparent and easy to understand. And the GDPR does not mess around when it comes to the concept of a user consenting to have his or her data collected or processed. I quote: Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement. This could include ticking a box when visiting an internet website, choosing technical settings for information society services, or another statement or conduct which clearly indicates in this context the data subject's acceptance of the proposed processing of his or her personal data. Silence, pre-ticked boxes or inactivity should not therefore constitute consent. Consent should cover all processing activities carried out for the same purpose or purposes.
Speaker 1: When the processing has multiple purposes, consent should be given for all of them. If the data subject's consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided. End quote. So yeah, it's a big deal, and companies are supposed to be real clear about this whenever they present a user with the option to opt into this kind of data collection and processing. Moreover, consent should be just as easy to withdraw as it is to grant. So if a user decides after giving consent to revoke that consent, it has to be possible to do so, and the entity collecting or processing the data has to knock it off.
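Translated into implementation terms, those consent rules suggest something like the following minimal sketch; the class, field, and purpose names are all hypothetical. Consent defaults to not given, so silence and pre-ticked boxes count for nothing, it's recorded per purpose through an explicit call, and withdrawing takes exactly one call, just as granting does.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ConsentRecord:
        """Per-purpose consent for one user. Nothing is granted by
        default, so inactivity never counts as consent."""
        grants: dict = field(default_factory=dict)  # purpose -> timestamp

        def grant(self, purpose: str) -> None:
            # Only an explicit, affirmative act records consent.
            self.grants[purpose] = datetime.now(timezone.utc)

        def withdraw(self, purpose: str) -> None:
            # Withdrawing is one call, exactly as easy as granting.
            self.grants.pop(purpose, None)

        def allows(self, purpose: str) -> bool:
            return purpose in self.grants

    user = ConsentRecord()
    user.grant("newsletter")         # explicit opt-in, one purpose at a time
    print(user.allows("analytics"))  # False: consent never carries over
    user.withdraw("newsletter")      # revoking is just as easy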
Speaker 1: Citizens are also allowed to ask for all their data from a controller, and then those citizens can even send that data to another controller. So, in other words, potentially the GDPR could allow citizens to act as their own data brokers. Granted, personal data on an individual level is not worth all that much. It's mostly valued when it's in bulk, when you have thousands of people's data. Selling one person's data is not that big a deal, unless you're talking about things like, you know, credit card numbers and stuff like that, and even then it's not that expensive. So you might wonder, how much is all your data worth? Well, that depends on the nature of the information, how much of it you're providing, and who you are, really. But there's a great post on Medium titled quote How much is your data worth? end quote that uses some basic industry analysis to conclude that, on average, a person's data is worth about two hundred forty dollars per year. But this calculation was done with a lot of assumptions, and that is something the author of the piece readily admits to. It's a tricky question, but still, it could now be a question answered by individual citizens rather than data brokers. Related to this is the concept of data erasure, better known as the right to be forgotten. A lot was written about this a couple of years ago when the EU first agreed to these rules, and I'll probably chat a little bit more about that later, but it is pretty tricky. Generally speaking, this policy says the data subject has the right to tell a data collection entity to cut it out, to delete all the collected data about that person, and potentially to have third parties that partnered with that data collection entity stop any data processing of the information. However, this has to be done with a consideration toward quote the public interest in the availability of the data end quote. So, in other words, let's say you go and do something really, really dumb, like colossally stupid, and news outlets pick it up and they cover your colossally stupid mistake, and now your name is associated with this terrible mistake you made. And you know you made it honestly; you didn't set out to make it, it just happened. But now it's attached to your name. Well, you wouldn't be able to just sweep that under the rug by asking all search engines to delete information about you, thus reducing the chance anyone would ever see that information about your dumb mistake, because that goes against the public interest in the availability of that data. This is one of the points I was talking about just a second ago that a lot of news outlets were really focusing on. Because imagine you are, let's say, a political hopeful, and you decide you want to wipe out any references to your past that are online as best you can, and so you contact all of these different search engines to have all the information be quote unquote forgotten, because you don't want people to dig up something you did, you know, fifteen years ago that would look really, really bad while you're running for office. That would be considered against this policy at this point.
Speaker 1: But there was a lot of discussion back when these rules were first being proposed that said this might end up causing some huge problems down the road. And it turns out it needs to be handled on a case-by-case basis; it's not something that is clearly spelled out within the charter of GDPR. Well, we're going to take another quick break, but when we come back, I'll tell you a little bit more about this and how companies are doing as they try to measure up to GDPR. But first, a quick word from our sponsor. So, another change from the earlier Data Protection Directive is that the GDPR requires all system designers to incorporate privacy as part of the design from the start of their system. Like, as soon as they start designing any kind of online system, data privacy protection has to be part of the design. Previously, it had been treated more as an addition to previously existing systems, but now system designers have to, by law, incorporate privacy by design into the actual development of their systems if they are to operate within the European Union. If a data collection or processing company detects a data breach, it is obligated under GDPR to notify affected parties within seventy-two hours of detecting the breach, so three days afterwards they have to disclose this. That means that gone are the days when a company would sit on that information for maybe months or longer at a time. A data processor has to alert data collectors quote without undue delay end quote upon detecting a breach.
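That seventy-two-hour window is just arithmetic on the detection timestamp. Here's a quick sketch, with a hypothetical detection time:

    from datetime import datetime, timedelta, timezone

    # Hypothetical moment the breach was detected.
    detected_at = datetime(2018, 7, 2, 9, 30, tzinfo=timezone.utc)

    # GDPR allows seventy-two hours from detection to disclosure.
    notify_by = detected_at + timedelta(hours=72)
    print(f"detected {detected_at:%Y-%m-%d %H:%M} UTC, "
          f"notify by {notify_by:%Y-%m-%d %H:%M} UTC")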
Speaker 1: So let's say that Facebook is the data collector, and let's say there's an app out there, or an add-on, some form of enhancer for Facebook that is operating, and the enhancer detects that their systems have been breached. They would be obligated to alert Facebook to that problem without undue delay in order for Facebook to take any measures it could under the GDPR, and then it would also have to alert all the people affected by this within three days. The GDPR also sets up the basis for a mandatory data protection officer for organizations that have core activities that quote consist of processing operations which require regular and systematic monitoring of data subjects on a large scale, or of special categories of data or data relating to criminal convictions and offenses end quote. Other organizations can have a data protection officer, but it's not mandatory if they are outside of that definition. However, all the really big companies out there kind of fall into this, so I expect we're going to have data protection officer as a new type of, uh, employee at most of those large companies. Either they will be employed directly by the company, or they will be offering their services as a data protection officer on a contract basis through, you know, some sort of provider that specializes in this, because the person who is the data protection officer is supposed to have a specialty in that field. It's not just supposed to be, hey, Bob, do you want to be data protection officer this week? It's not supposed to be like that. In addition, this data protection officer becomes a liaison with data protection authorities, or DPAs. The data protection authorities are kind of like the overseers of this system. They're the ones who are making certain that everyone is complying with the rules, and if anyone's not complying with the rules, they can take action. And they have a lot of power. So, for example, they can impose corrective actions such as a temporary or definitive limitation on data processing activities, including a complete ban on data processing, or order the suspension of data flows to a recipient in a third country.
Speaker 1: So, in other words, they could say, hey, Facebook, you are not allowed to process any data from any citizen in the European Union ever again, because you broke this rule. They technically have the power to do that. In addition, if a controller is found to be in breach of GDPR, it can be hit with a fine of up to four percent of its annual global turnover or twenty million euro, whichever is greater. Well, global turnover is a European phrase. It's a way of saying total revenues; that's what we would call it in the United States. So if you want to look at global total revenues, that can be a truly mind-numbingly huge sum of money, depending on the company. Let's go with a big one. Let's think about Apple. Apple made two hundred twenty-nine point two billion US dollars in revenue in two thousand seventeen. If we convert that to euro, that's one hundred ninety-five billion, eight hundred ninety-three million, eight hundred two thousand euro. So let's take four percent of that. That would be the fine. Let's say Apple has committed this breach of GDPR, this worst-case scenario, and the data protection authorities levy this four percent fine. That four percent fine would amount to seven billion, eight hundred thirty-five million, seven hundred fifty-two thousand, eighty euro. Almost eight billion euro in one fine. That's crazy, and that's why a lot of companies have really been taking a lot of effort to try and at least appear to comply with GDPR, because the consequences are truly scary. And that's obviously if your company is operating at a level where four percent of your total revenue is greater than twenty million euro. If it's less, then you still have to pay twenty million euro. Now, the penalties are tiered, so it's not like it's that fine for any infraction.
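That "whichever is greater" rule is easy to check against the numbers above. A quick sketch, using the euro conversion just quoted:

    def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
        """Worst-case GDPR fine: four percent of annual global turnover
        or a flat twenty million euro, whichever is greater."""
        return max(0.04 * annual_global_turnover_eur, 20_000_000)

    apple_turnover_eur = 195_893_802_000  # the conversion quoted above
    print(f"{max_gdpr_fine(apple_turnover_eur):,.0f} euro")  # 7,835,752,080 euro
    print(f"{max_gdpr_fine(100_000_000):,.0f} euro")         # the 20,000,000 euro floor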
Speaker 1: The example I just cited was for the absolute worst-case scenario. But if it were for a smaller infraction, let's say you didn't conduct a proper impact assessment for something like a potential data breach, and the DPAs found that out. They said, oh, well, you didn't take the necessary steps according to the GDPR. You could be hit by a smaller fine, but by smaller fine, it could still be two percent of your total revenues. I mean, half of seven billion euros is still a huge amount of money, right? Anyway, the full GDPR document is available online and in many different languages. The English-language version is eighty-eight pages long, so there's a lot more in there that I kind of skimmed over for the purposes of this episode. I wanted to take a bit of time, however, to talk about the effect this has had on the world already. I'm recording this at the beginning of July two thousand eighteen, and already we're starting to see the effects of GDPR. First, you probably received some of those emails or messages from different organizations about their efforts to comply with GDPR. A lot of companies have been working toward compliance ever since the policy was approved in twenty sixteen. That's exactly what they were supposed to do; they were supposed to get their acts in gear within those two years. But according to analyst firm Gartner, more than half of all companies affected by GDPR will not be in full compliance with its regulations even by the end of two thousand eighteen. On a related note, there's a consumer advocacy group called the European Consumer Organisation. Its initialism is BEUC, because it comes from the French name for the group, which I am not going to attempt to pronounce, because I love you French speakers and I don't want to hurt you with my terrible pronunciation. But anyway, the BEUC conducted a study of big tech companies and how they hold up to GDPR policies.
Speaker 1: They analyzed a bunch of privacy policies from fourteen major companies, including Facebook, Apple, and Google, and they said that most of them had privacy policies that might not meet the GDPR standard at all. They said a lot of them included vague and insufficient language. One big reason for that comes down to the era of big data. So big data, or big "dah-ta" if you prefer, refers to enormous data sets that can include all sorts of information, including stuff that upon first glance might be useless or completely unrelated to whatever you want to analyze. But data analysts have found that you can discover really interesting patterns and associations and trends if you have really large sets of data. Sometimes you can find new ways to make use of that data that are really transformative, but you might not have that idea before you actually get hold of all the information, and therein lies a huge problem. GDPR requires companies to spell out in clear terms why they want a person's information. They're only supposed to collect the data relevant to whatever process they wish to perform, and they have to get the user's consent to do it. So I have a feeling that if you were to ask a data collection company, why do you need all this information about me and my behavior, you might get a response of, uh, well... and that won't cut it. There are companies right now that have so much information about us that they don't even know what they have. It's kind of like going to an auction and buying a locked storage unit, and you don't get to look inside it. You have to buy it sight unseen. You have no idea what you're going to get once you open that storage unit. It might be a gold mine of antiques, or it could just be a bunch of worthless junk, or it might even be empty. Well, some companies have enormous repositories of information that effectively amount to the same thing. They don't know what they have yet.
There are companies right now 563 00:33:19,280 --> 00:33:23,160 Speaker 1: that have so much information about us that they don't 564 00:33:23,200 --> 00:33:26,560 Speaker 1: even know what they have. It's kind of like going 565 00:33:26,560 --> 00:33:29,760 Speaker 1: to an auction and buying a locked storage unit and 566 00:33:29,800 --> 00:33:31,640 Speaker 1: you don't get to look inside it. You have to 567 00:33:31,640 --> 00:33:34,320 Speaker 1: buy it sight unseen. You have no idea what you're 568 00:33:34,320 --> 00:33:36,640 Speaker 1: going to get once you open that storage unit's doors. 569 00:33:36,960 --> 00:33:38,960 Speaker 1: It might be a gold mine of antiques, or it 570 00:33:38,960 --> 00:33:40,840 Speaker 1: could just be a bunch of worthless junk, or it might 571 00:33:40,840 --> 00:33:44,880 Speaker 1: even be empty. Well, some companies have enormous repositories of 572 00:33:44,960 --> 00:33:48,080 Speaker 1: information that effectively amount to the same thing. They don't 573 00:33:48,160 --> 00:33:52,000 Speaker 1: know what they have yet. Having these companies comply with 574 00:33:52,080 --> 00:33:54,560 Speaker 1: g d p R requires them to sift through all that 575 00:33:54,640 --> 00:33:58,400 Speaker 1: information, determine which bits are identifiable as defined 576 00:33:58,440 --> 00:34:00,680 Speaker 1: by g d p R, and then be able to 577 00:34:00,880 --> 00:34:03,600 Speaker 1: produce it or destroy it upon request, which is a 578 00:34:03,640 --> 00:34:05,880 Speaker 1: pretty tall order. All right, so what does this mean 579 00:34:06,040 --> 00:34:09,319 Speaker 1: for the average person? Well, if you live in the EU, 580 00:34:09,440 --> 00:34:12,919 Speaker 1: you now have some pretty darn powerful legislation looking after 581 00:34:12,960 --> 00:34:15,719 Speaker 1: your data protection, and if you so choose, you can 582 00:34:15,760 --> 00:34:19,000 Speaker 1: exert your rights to request data or even have it deleted, 583 00:34:19,280 --> 00:34:21,839 Speaker 1: assuming doing so does not go against the public interest 584 00:34:21,920 --> 00:34:25,000 Speaker 1: in general, and you should be able to expect those 585 00:34:25,040 --> 00:34:28,960 Speaker 1: requests to be honored. Although, I've listened to some podcasts. 586 00:34:29,160 --> 00:34:32,200 Speaker 1: My buddy Nate Lanxon did one for Bloomberg where he 587 00:34:32,200 --> 00:34:36,040 Speaker 1: talked about how difficult it was to get his information 588 00:34:36,400 --> 00:34:39,960 Speaker 1: from certain organizations, even going through the g d 589 00:34:40,040 --> 00:34:44,520 Speaker 1: p R process. So companies aren't necessarily prepared to 590 00:34:44,560 --> 00:34:46,719 Speaker 1: do this, but they are supposed to comply with it. 591 00:34:47,080 --> 00:34:50,240 Speaker 1: But if you're outside the EU, like me, you're probably 592 00:34:50,239 --> 00:34:52,200 Speaker 1: just getting a ton of emails about this, and for 593 00:34:52,239 --> 00:34:54,520 Speaker 1: the most part you can ignore them. Some of them 594 00:34:54,560 --> 00:34:57,239 Speaker 1: are likely asking you if you consent to being included 595 00:34:57,280 --> 00:35:00,239 Speaker 1: on mailing lists as kind of a protective measure. There 596 00:35:00,680 --> 00:35:03,879 Speaker 1: are people who aren't certain if this is absolutely necessary yet, 597 00:35:04,320 --> 00:35:06,919 Speaker 1: but a lot of companies are thinking, we'd 598 00:35:07,040 --> 00:35:10,840 Speaker 1: rather send an unnecessary email out now and 599 00:35:11,440 --> 00:35:13,920 Speaker 1: cover our bases than find out later on that we 600 00:35:13,960 --> 00:35:16,279 Speaker 1: should have done that. So if you want to keep 601 00:35:16,280 --> 00:35:18,920 Speaker 1: getting email from those companies, you might need to skim 602 00:35:18,920 --> 00:35:21,200 Speaker 1: the message. There might be a link you have to 603 00:35:21,239 --> 00:35:24,719 Speaker 1: click to indicate you've opted in to receive that mail. 604 00:35:25,080 --> 00:35:27,520 Speaker 1: But if you're like me, you're probably just deleting the 605 00:35:27,560 --> 00:35:29,440 Speaker 1: emails and relishing the thought that you're not 606 00:35:29,440 --> 00:35:31,160 Speaker 1: gonna have to deal with as much spam on a 607 00:35:31,200 --> 00:35:34,319 Speaker 1: regular basis now.
I noticed on ZDNet that there 608 00:35:34,440 --> 00:35:37,920 Speaker 1: is a theory going around about Klout. That's the company 609 00:35:37,960 --> 00:35:40,960 Speaker 1: that assigned people a social media score based on the 610 00:35:41,040 --> 00:35:44,160 Speaker 1: reach and impact of their various social media accounts like 611 00:35:44,160 --> 00:35:47,759 Speaker 1: Twitter and Facebook. Klout closed up shop right around the 612 00:35:47,760 --> 00:35:50,480 Speaker 1: time g d p R compliance was to go into effect, 613 00:35:50,480 --> 00:35:54,560 Speaker 1: which was May two thousand eighteen. And the theory is 614 00:35:54,960 --> 00:35:59,760 Speaker 1: that it's possible Klout closed up partly 615 00:35:59,800 --> 00:36:02,640 Speaker 1: because it was so hard to comply with g d 616 00:36:02,760 --> 00:36:05,919 Speaker 1: p R. I mean, they're an organization that's dependent upon 617 00:36:06,040 --> 00:36:10,040 Speaker 1: many other entities that are collecting and processing data. So 618 00:36:10,120 --> 00:36:12,399 Speaker 1: the owners of Klout may have opted just to walk 619 00:36:12,440 --> 00:36:15,120 Speaker 1: away rather than try and work that out. But again, 620 00:36:15,160 --> 00:36:17,279 Speaker 1: that's just a theory. It may have nothing to do 621 00:36:17,320 --> 00:36:19,000 Speaker 1: with the g d p R, but the timing is 622 00:36:19,000 --> 00:36:23,120 Speaker 1: interesting. Also, some sites, news organizations 623 00:36:23,120 --> 00:36:25,319 Speaker 1: such as the L A Times or the Baltimore Sun, 624 00:36:26,000 --> 00:36:28,520 Speaker 1: have restricted access to their sites within the 625 00:36:28,560 --> 00:36:31,080 Speaker 1: European Union. So if you're in the EU and you 626 00:36:31,160 --> 00:36:33,200 Speaker 1: try to visit one of those sites, you might get 627 00:36:33,239 --> 00:36:35,600 Speaker 1: a message stating that due to g d p R, 628 00:36:35,920 --> 00:36:38,399 Speaker 1: you'd be unable to access the site at 629 00:36:38,440 --> 00:36:42,120 Speaker 1: that time. Now, that's not necessarily permanent. These sites are 630 00:36:42,200 --> 00:36:45,200 Speaker 1: more likely just trying to find ways that they 631 00:36:45,200 --> 00:36:48,359 Speaker 1: can comply with g d p R. That might even require them 632 00:36:48,360 --> 00:36:51,640 Speaker 1: to set up a different web portal for their various 633 00:36:51,760 --> 00:36:54,520 Speaker 1: articles and services that operates on a different set of 634 00:36:54,600 --> 00:36:56,400 Speaker 1: rules than the rest of the world uses, 635 00:36:56,920 --> 00:36:59,759 Speaker 1: which creates sort of a fragmented experience. But it might 636 00:36:59,800 --> 00:37:01,880 Speaker 1: be the only way they're able to comply with g d p 637 00:37:02,080 --> 00:37:05,880 Speaker 1: R without overhauling their entire system. The penalties for 638 00:37:06,000 --> 00:37:08,439 Speaker 1: failing to comply are so high that some companies would rather 639 00:37:08,480 --> 00:37:11,520 Speaker 1: step back in the short term and lose all that 640 00:37:11,560 --> 00:37:14,600 Speaker 1: traffic from the EU while they work on 641 00:37:14,640 --> 00:37:19,480 Speaker 1: a more compliant implementation, rather than risk an enormous fine.
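For a sense of what that temporary geo-blocking might look like, here is a minimal, purely hypothetical sketch. A real site would use a GeoIP database and its web server's own tooling; none of this is any publisher's actual code.

    # Abbreviated EU country list and a stubbed-out region lookup, both for
    # illustration only; a real deployment would use a GeoIP database.
    EU_COUNTRY_CODES = {"DE", "FR", "IT", "ES", "NL", "IE", "PL"}

    def country_from_ip(ip_address: str) -> str:
        """Stub lookup: pretend the documentation range 203.0.113.x is in the EU."""
        return "DE" if ip_address.startswith("203.0.113.") else "US"

    def serve(ip_address: str) -> tuple[int, str]:
        """HTTP status 451 ('Unavailable For Legal Reasons') fits this use case."""
        if country_from_ip(ip_address) in EU_COUNTRY_CODES:
            return 451, "Our website is currently unavailable in most European countries."
        return 200, "<html>...article content...</html>"

    print(serve("203.0.113.7"))   # simulated EU visitor: (451, ...)
    print(serve("198.51.100.9"))  # everyone else: (200, ...)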
642 00:37:20,239 --> 00:37:21,960 Speaker 1: And that brings us up to speed on what g 643 00:37:22,080 --> 00:37:24,239 Speaker 1: d p R is and why it's causing so much 644 00:37:24,280 --> 00:37:26,799 Speaker 1: ruckus in the tech sphere right now. I'm sure we'll 645 00:37:26,840 --> 00:37:29,239 Speaker 1: have plenty of stories relating to g d p R 646 00:37:29,400 --> 00:37:32,040 Speaker 1: unfold over the next few years, and I'm certain 647 00:37:32,040 --> 00:37:34,120 Speaker 1: that in the future I'll cover some of them. But 648 00:37:34,280 --> 00:37:36,960 Speaker 1: I wanted to do this episode just in case some 649 00:37:37,000 --> 00:37:39,920 Speaker 1: people out there were like me, wondering what this was 650 00:37:39,960 --> 00:37:42,480 Speaker 1: all about, and don't have time to read all 651 00:37:42,560 --> 00:37:47,680 Speaker 1: eighty eight pages of that legislation. It's a real page turner. 652 00:37:47,880 --> 00:37:50,839 Speaker 1: It's actually not that bad to read, but it's 653 00:37:50,840 --> 00:37:53,600 Speaker 1: a lot, so hopefully this was helpful. If you have 654 00:37:53,640 --> 00:37:56,400 Speaker 1: any suggestions for future episodes of tech Stuff, whether it 655 00:37:56,600 --> 00:38:00,560 Speaker 1: is a technology, a person, or a company, or maybe a 656 00:38:00,600 --> 00:38:04,920 Speaker 1: particular story in tech that you think really deserves deep treatment, 657 00:38:05,440 --> 00:38:07,360 Speaker 1: send me a message and ask me to cover it. I 658 00:38:07,360 --> 00:38:09,560 Speaker 1: would love to hear from you. The email address for 659 00:38:09,560 --> 00:38:13,280 Speaker 1: the show is tech Stuff at how stuff works dot com. 660 00:38:13,400 --> 00:38:15,719 Speaker 1: Or drop me a line on Facebook or Twitter. The 661 00:38:15,800 --> 00:38:18,680 Speaker 1: handle for both of those is tech Stuff H S W. 662 00:38:19,440 --> 00:38:22,399 Speaker 1: Don't forget to follow us on Instagram, and I'll talk 663 00:38:22,400 --> 00:38:30,520 Speaker 1: to you again really soon. For more on this and 664 00:38:30,560 --> 00:38:33,120 Speaker 1: thousands of other topics, visit how stuff works dot 665 00:38:33,120 --> 00:38:43,239 Speaker 1: com