Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a lover of all things tech, and I figured I'd do an episode or two about tech and privacy. I've titled this episode How Tech Spies on Us, but right from the get go, I have to admit that that is a little disingenuous, because in many cases, not all, but in many cases, we're essentially giving tech permission to just harvest information about us, which is a little different than spying. It's kind of like saying, no, I'm cool with there being a live camera on me all the time, just broadcasting to some other location. You can't really call that spying if you were in on it at the top. But the way tech tends to go about this usually follows a pathway that isn't transparent, and it's not always obvious, and sometimes it relies on people not really being aware of what they've agreed to.
So I really wanted to go into that so that we can all have a deeper understanding of how tech gathers information, how companies then leverage and exploit our information, and furthermore to kind of, you know, think about whether or not we're okay with this. Some of you might be, and I'm not casting any shade, I'm not making any judgments, but others might not be. And that's why I wanted to kind of go through this and really get down to why it seems like our tech has almost a preternatural ability to know things about us. So let me paint for you a hypothetical situation. You are hanging out with a friend and you're both chatting about, you know, all sorts of stuff, and the two of you decide to go to a nice restaurant with an outdoor seating area, because you know, we're still kind of in post pandemic type stuff, and you're being super responsible with regard to health concerns and all of that. Anyway, you have a conversation with your friend, who is telling you all about this movie they just watched.
And after your meal, you say goodbye to your friend and you head on home. Later that same day, or evening, or night, whatever, you're on the internet and you see something kind of weird. You actually start seeing ads for that film your friend was talking about popping up on different sites and services, and you don't remember seeing ads for this particular movie before. So maybe it's cherry picking. Maybe it's the whole "the van was always parked by the corner" thing, where you only notice something after it's been brought to your attention. But maybe it's not. So what the heck is going on? Was your phone spying on you? Was it listening in on the conversation that you had with your friend at dinner, and then it sent that information off somewhere to be processed and analyzed, then sold to advertisers, and then when you get back, now you have targeted ads aimed at you because your phone was eavesdropping on you? Now, I mentioned this sort of thing in a recent tech news episode, and I'm not gonna make you wait through this whole episode to answer that hypothetical question.
Your tech is most likely not listening in on your conversations, not because that would be hard to do. It actually wouldn't be hard to do that, but because there's really no need to do it. There's no need to go that far. Also, if tech were actually actively listening in on you all the time, governments around the world would be extremely interested in that, either to use it for themselves as a kind of surveillance of their own citizens and people who are within the country, or they would want to go after tech companies for violating various privacy and security laws. Now, the point is that the way our modern tech works, and the way companies collect, share, and barter our data, means there's no need to listen in to what we're saying. Our tech knows who we are, it knows where we go, it knows what we're doing, and because the people around us also carry similar tech, it knows who we're with, and by cross referencing data, it knows what sort of connections exist between us.
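To make that cross-referencing idea a bit more concrete, here is a toy sketch, not any company's actual method, with entirely hypothetical data, names, and thresholds: given hourly location pings from two phones, flag the hours where the two devices were close enough together to suggest the two people were in the same place.

```python
from math import hypot

def co_located(pings_a, pings_b, max_dist=0.05):
    """Toy co-location check. Each ping is (hour, x, y) in arbitrary
    map units; two devices count as 'together' in a given hour when
    their pings for that hour fall within max_dist of each other."""
    by_hour_b = {hour: (x, y) for hour, x, y in pings_b}
    together = []
    for hour, x, y in pings_a:
        if hour in by_hour_b:
            bx, by = by_hour_b[hour]
            if hypot(x - bx, y - by) <= max_dist:
                together.append(hour)
    return together

# Hypothetical pings: both phones report nearly the same spot at hours 19 and 20.
alice = [(18, 0.00, 0.00), (19, 3.10, 4.20), (20, 3.11, 4.21)]
bob = [(18, 9.00, 9.00), (19, 3.12, 4.21), (20, 3.10, 4.19)]
print(co_located(alice, bob))  # → [19, 20]
```

Real systems work with far messier data at far larger scale, but the underlying idea is the same: once two data streams share a common key, like time and place, inferring a connection is a simple join.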
So today I thought I'd talk a bit more about how our tech collects data and the history of that, from the stuff that we willingly surrender to the stuff that we might not even realize was being shared. We'll talk about how this has evolved and how geolocation has played a really big part in this, and this will lead us to some recent events in which we've seen big companies like Facebook and Google try to skirt around attempts to give users the option to opt out of some forms of data collection. Now, as always on TechStuff, anyone who's listened to a TechStuff episode knows I love going through history, because it's not like there was just a switch that was flipped one day and then suddenly all of our personal information got hoovered up by the Internet. Also, we should keep in mind that there are a lot of different privacy laws around the world that can restrict this kind of, you know, data collection and how companies can use that information. So, for example, the European Union has laws that provide a decent amount of protection for citizens.
But as I am located in the United States, where the Internet first got its start, and thus where the attitudes and practices regarding private information on the Internet got their start, at least with regard to, you know, the web and the Internet in general, I'm mostly going to be focusing on the good old US of A, which isn't really that old, and frequently isn't that good either. So here in the United States, a right to privacy was not one of the guarantees laid out in the Constitution, at least not explicitly. However, the various branches of government, including the Supreme Court, over the course of decades and several court cases, have essentially found that four of the first five amendments to the Constitution extend toward a right to privacy. So those would be the First Amendment. That's the one that guarantees the right to assembly, the freedom of religion, the freedom of speech, and the freedom of expression. Then we skip over the Second Amendment and we head over to Amendment number three, which prevents the government from being able to station soldiers in the private homes of citizens.
Then there's the Fourth Amendment, which protects citizens from unreasonable search and seizure. That's a big part of privacy, obviously. And then we've got Amendment number five, which has a little bit of Monica in my life. Wait, no, I'm sorry, that's Mambo No. 5. Amendment number five says that US citizens enjoy certain legal protections. For example, you can't be tried for the same crime twice, right? That's double jeopardy. That obviously doesn't include stuff like the appeals process, but that's separate. Also, citizens can take the Fifth Amendment to avoid self incrimination during a trial, so they cannot be compelled to confess to a crime. That, again, relates to privacy. Now, none of those four amendments provide explicit rights to privacy, but they all extend toward that direction in different ways. So that was back in seventeen eighty nine, which means it was like two centuries before we had to worry about the Internet. But a century after the Constitution was ratified, that being eighteen ninety, an article written by future Supreme Court Justice Louis Brandeis and Samuel Warren argued for a more explicit right to privacy.
It was the early days of photography in the eighteen nineties, and already the two saw the potential for people to have their privacy violated, due to the fact that now it was easier to capture a moment forever and then potentially distribute it widely through the press. In nineteen fourteen, we got the founding of the Federal Trade Commission, or FTC. Now, the primary responsibility of the FTC is to ensure that companies within the United States are playing fair. The FTC can go after companies that use deceptive commercial practices, but later on, really starting around the nineteen seventies, the FTC would also focus on companies that violated citizen privacy. Going global for a moment: in nineteen forty eight, the United Nations drafted the Universal Declaration of Human Rights, which included within it Article number twelve, which states, quote, "No one shall be subjected to arbitrary interference with his privacy, family, home, or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks." End quote.
Now, apart from the non-inclusive pronoun usage there, this was a really big step toward positioning privacy as a basic human right. In nineteen sixty, a legal scholar named William Prosser published an article titled, fittingly enough, Privacy, and in that article Prosser outlined four cases in which infringing on privacy should allow the victim to pursue a civil lawsuit against the perpetrator. Those four cases were: intrusion upon seclusion or solitude, so in other words, someone's barging in on you when you're just trying to be alone, or, similarly, intrusion into private affairs; public disclosure of embarrassing private facts, which, when you think about that, the Internet and the way it works would just catch on fire if people were really holding to that seriously today; publicity which places a person in a false light in the public eye; and appropriation of one's name or likeness.
So if someone were to go about posing as Jonathan Strickland, and it's not me, it's not someone else who's actually named Jonathan Strickland, like they were trying to pose as me, I should be allowed to sue that person based on that criteria. Now, obviously in these cases there's a lot of leeway, because there also are issues where it kind of starts to track a little bit toward issues with the First Amendment. Right? So in the case of public disclosure of embarrassing private facts, if you're talking about a public figure, there's a lot more leeway there, because it may be that that public figure's private facts have a public bearing, like if it's a politician or something. So it does get a little dicey, but these matters always do. We'll skip ahead to nineteen seventy four, though I should say that there were some court cases and some scholarly articles that furthered the framing of privacy. By nineteen seventy four, the US passed the Privacy Act, which placed limits on how federal agencies can collect and use personally identifiable information. Now granted, these restrictions were all about government use, federal government use, not state government, and not corporate use, so it's not like this applied across the board. In nineteen ninety one, the US government passed the Telephone Consumer Protection Act, and later established the Do Not Call Registry, and these were meant to reduce the number of solicitation calls that citizens would get, you know, the spam calls, we would call them today, and it would give people the opportunity to put themselves on that registry so that, in theory, they wouldn't get those cold calls. Now, this is one of those things that in recent years has become an area of focus, because stuff like robocalls and spoofing have really sidestepped the protections that were put in place all the way back in nineteen ninety one, rendering them almost meaningless. And you've heard a lot of calls, even in, you know, the halls of Congress, to have a new version of this to address things like companies that use spoofing and robocalls to do widespread targeting of spam.
Over in Europe, the EU adopted the Data Protection Directive in nineteen ninety five, and then in twenty eighteen the EU replaced that with the General Data Protection Regulation, or GDPR. This is the set of rules and restrictions that really prevents, largely, like, big tech companies from following the same sorts of strategies in the EU that they follow here in the United States. So you'll hear a lot about companies having to readjust how they work in the EU, because if they were to continue to operate as they do in the United States, they could be held legally liable for lots of violations. There are other rules that are important. There's COPPA. That's the Children's Online Privacy Protection Act here in the United States. That act places much tighter restrictions on companies when it comes to the collection and use of data about people who are younger than thirteen. That law and the enforcement of it have created some fairly tricky situations for companies and for others, notably content creators on platforms like YouTube.
But I'm pretty sure I'm gonna have to do just a full episode on COPPA and its consequences in the future, because those consequences include stuff that I think is valuable, and stuff that, because of the interpretation of the law and the implementation of policies on platforms like YouTube, can also be harmful. It's complicated, so that's why it would require its own episode. In nineteen ninety nine, the US passed the Gramm-Leach-Bliley Act, which requires financial institutions to explain how they use and share private consumer data. So this was for things like credit organizations, banks, that kind of thing. It also requires that those companies provide a means for customers to opt out of having their information shared, which is a pretty good point to focus on for a second.
In the United States, the general approach, the default, is to require companies to provide an opt-out option. But a lot of companies bury these sorts of settings or features so that they're not easy to find or activate, and often any disclosure of there being an opt-out feature can be buried inside a long terms of service or end user agreement page, which most people don't take the time to read. And I'm just as guilty of this as other people. I'm not, you know, I'm not throwing shade here, folks. I've done this. And some people argue that it would be better to have an opt-in approach. So this would be where you would sign up for a service, and you would be told, hey, do you mind if we use your information to do X, Y, and Z, in a very clear and transparent way, not buried in, like, pages and pages of terms of service, and you would then click a box to allow it to happen. But since so much revenue depends intrinsically on the ability of companies to collect and share data, there's an extreme financial incentive to go the opt-out route.
More on that later. On the state level, presently, two states in the US have passed privacy laws meant to protect the private information of citizens of those states. They are California, which passed the California Consumer Privacy Act in twenty eighteen, which went into effect in twenty twenty, and then Virginia, which very recently passed the Consumer Data Protection Act just this past March, in twenty twenty one, but that law won't go into effect until January of twenty twenty three. The other forty eight states of the US pretty much just follow the general federal rules to some extent. So that's the legal background on privacy in the United States, and to a lesser extent the EU, without going into, like, super detailed analysis. The idea being: privacy is clearly a thing, but it's not as heavily protected from a legal standpoint in the United States as it could be. Meanwhile, obviously, technology advances at an incredible pace and frequently leaves the legal system in the dust. But what about the actual collection of personal information in the tech age? Well, again, we need to think back historically.
Now, in the old old days, you know, pre digital age, businesses essentially kept an eye on customers, particularly repeat customers, and kept an eye on what they were buying. This was necessary in order to just keep stores stocked with the stuff people actually needed. And I think this kind of tracking, which is being really generous, since it was a much more general approach, is pretty easy to understand and to forgive. So imagine that you are a store owner and you notice that every two months Farmer Betty comes in and she buys two bags of grain. You would learn pretty quickly that you need to make sure you have grain stocked every two months, because typically that's when she would come in and buy them, so you want to make sure you had that on hand. And you would do this with all your customers. Whatever it was that they were buying, you would make sure that you were stocked up on it. And you might also try to experiment and stock some stuff that was related to the products that your regulars were buying, and assuming that you speculate correctly, everyone benefits from that. You make more sales and your customers end up getting stuff that they need, even if they didn't know they needed it at the time. Fast forward a whole lot to the point where we had electronic means of keeping inventory. The invention of stuff like computer systems and the bar code in the nineteen seventies made it possible for us to keep an up to date electronic record of inventory and sales. Now, these weren't directly tied to customers just yet. That would still depend heavily on observation, but now it was much easier to spot buying trends and respond to them in close to real time. That particular branch would lead to data-driven marketing, in which experts in marketing would look at various goods and services and use data to determine which regions they should focus on and which ones might prove to be less profitable. So let's say you are the owner of a chain of grocery stores in the Southeast.
You might notice that grocery stores in Atlanta are selling a lot of a particular type of product, and meanwhile, a store in, you know, Charlotte isn't selling that so much, but it is selling a different product at very high volume. These are the sort of data trends you would want to know, so that you could make sure you had the right stock on hand, you could do advertising campaigns, and you could maximize your sales and minimize your waste. It was incredibly valuable data, that was undeniable. Information was the key to maximizing profit and reducing costs as much as possible. In the nineteen eighties, we saw the rise of direct marketing, in which marketers would customize and personalize efforts and aim them at specific shoppers. Now we're no longer looking at regions, we're looking at individuals. And typically they did this through direct mail sales. This was a pretty primitive approach, based solely on the customer's past purchases. I can give you an example of this. I grew up in the eighties.
I was one of 324 00:20:14,040 --> 00:20:17,919 Speaker 1: those kids who occasionally would buy comic books, and of 325 00:20:17,960 --> 00:20:20,880 Speaker 1: course stuff on the backs of comic books would sometimes 326 00:20:20,880 --> 00:20:24,159 Speaker 1: be really tempting. Like, man, I really would like that 327 00:20:24,320 --> 00:20:27,400 Speaker 1: hand buzzer. That seems like it will be a real hit. 328 00:20:28,200 --> 00:20:31,400 Speaker 1: And you go and you send your, like, dollar seventy 329 00:20:31,400 --> 00:20:33,679 Speaker 1: five in the mail and you buy one. And then 330 00:20:33,720 --> 00:20:36,000 Speaker 1: next thing you know, you start getting catalogs for all 331 00:20:36,040 --> 00:20:39,040 Speaker 1: these sorts of novelty gifts. And that's how I ended 332 00:20:39,119 --> 00:20:41,960 Speaker 1: up on a billion mailing lists that were all aiming 333 00:20:41,960 --> 00:20:44,600 Speaker 1: at me in different ways for different types of weird 334 00:20:44,760 --> 00:20:48,000 Speaker 1: or novelty type stuff. That was kind of the approach. 335 00:20:48,040 --> 00:20:52,520 Speaker 1: It was pretty primitive. It was not, you know, super sophisticated, 336 00:20:52,960 --> 00:20:58,160 Speaker 1: but it did set a foundation. That particular branch would 337 00:20:58,400 --> 00:21:02,280 Speaker 1: later extend further into stuff like loyalty programs, in which 338 00:21:02,400 --> 00:21:06,320 Speaker 1: stores would issue cards or tokens, often using a bar 339 00:21:06,400 --> 00:21:09,480 Speaker 1: code, right, that they would scan, and that would link 340 00:21:09,560 --> 00:21:13,240 Speaker 1: purchases to specific customers.
Now you actually know who it 341 00:21:13,400 --> 00:21:16,160 Speaker 1: is that's buying stuff and how frequently they're buying it, 342 00:21:16,760 --> 00:21:19,880 Speaker 1: and when linked to other information such as a person's 343 00:21:19,920 --> 00:21:23,680 Speaker 1: email address or their snail mail address, that can allow 344 00:21:23,800 --> 00:21:27,080 Speaker 1: stores to proactively reach out to a customer and alert 345 00:21:27,119 --> 00:21:30,199 Speaker 1: them to sales, maybe offer up coupons, all in an 346 00:21:30,240 --> 00:21:33,040 Speaker 1: effort to sell more stuff. And it gave the stores 347 00:21:33,320 --> 00:21:37,240 Speaker 1: way more information about the preferences of their customers on 348 00:21:37,320 --> 00:21:41,040 Speaker 1: that individual basis. That's sort of a microcosm of what 349 00:21:41,080 --> 00:21:43,880 Speaker 1: we're looking at with personal data on the Internet. I'll 350 00:21:43,920 --> 00:21:56,200 Speaker 1: explain more after this short break. Okay, we have gotten 351 00:21:56,280 --> 00:21:59,359 Speaker 1: up to the nineteen nineties and the birth of the 352 00:21:59,400 --> 00:22:02,440 Speaker 1: World Wide Web. Now, the original purpose of the web 353 00:22:02,560 --> 00:22:05,800 Speaker 1: was to create a collection of documents that could be 354 00:22:05,840 --> 00:22:09,919 Speaker 1: linked to one another using hypertext links. So it was 355 00:22:09,960 --> 00:22:13,640 Speaker 1: just a means of sharing information and linking information together, 356 00:22:14,080 --> 00:22:17,080 Speaker 1: which could allow you to create a type of contextualization. 357 00:22:17,640 --> 00:22:19,720 Speaker 1: So think of the average experience you might have on 358 00:22:19,760 --> 00:22:23,480 Speaker 1: a site like Wikipedia.
Sure, you might start off reading 359 00:22:23,520 --> 00:22:26,919 Speaker 1: about sloths, but then through a series of clicking on 360 00:22:27,040 --> 00:22:30,200 Speaker 1: various links in different articles, you ultimately end up reading 361 00:22:30,240 --> 00:22:34,359 Speaker 1: about the communist revolution in Cuba. It didn't take too 362 00:22:34,440 --> 00:22:37,760 Speaker 1: long after the initial launch of the earliest web pages 363 00:22:38,200 --> 00:22:43,720 Speaker 1: for commerce to follow onto the web. In nineteen ninety four, the digital 364 00:22:43,800 --> 00:22:49,680 Speaker 1: magazine HotWired introduced something new: the banner ad. 365 00:22:49,720 --> 00:22:53,760 Speaker 1: HotWired was an online branch of the print magazine Wired. 366 00:22:54,119 --> 00:22:57,840 Speaker 1: Now HotWired doesn't exist anymore, but Wired now occupies 367 00:22:57,880 --> 00:23:01,639 Speaker 1: both print and digital formats. However, the banner ad became 368 00:23:01,760 --> 00:23:05,600 Speaker 1: a really important stepping stone in our story. The first 369 00:23:05,640 --> 00:23:08,560 Speaker 1: banner ad was for AT&T, which reportedly 370 00:23:08,560 --> 00:23:12,840 Speaker 1: paid HotWired a fee of thirty thousand dollars so that 371 00:23:12,960 --> 00:23:15,159 Speaker 1: the banner ad would be placed at the top of 372 00:23:15,240 --> 00:23:19,639 Speaker 1: HotWired pages for the duration of three months. What follows 373 00:23:19,680 --> 00:23:22,960 Speaker 1: is according to one source, at least; I could 374 00:23:22,960 --> 00:23:26,000 Speaker 1: not verify it. It's widely reported, but I feel like 375 00:23:26,000 --> 00:23:28,320 Speaker 1: they're all pulling it from the same source, so take 376 00:23:28,359 --> 00:23:32,879 Speaker 1: this with a grain of salt.
But apparently, during that run, that banner 377 00:23:32,920 --> 00:23:38,240 Speaker 1: ad enjoyed a click through rate of forty four percent. So, quick aside, 378 00:23:38,359 --> 00:23:41,159 Speaker 1: just in case you're not familiar with online ad terms, 379 00:23:41,680 --> 00:23:44,480 Speaker 1: click through is what it sounds like. It's how many 380 00:23:44,480 --> 00:23:48,960 Speaker 1: people actively clicked on an ad, which would then link 381 00:23:49,080 --> 00:23:51,160 Speaker 1: the person to some other page. It would send them 382 00:23:51,160 --> 00:23:53,919 Speaker 1: to a page that might have a little more information 383 00:23:54,080 --> 00:23:57,720 Speaker 1: about a specific service or product, and frequently it would 384 00:23:57,720 --> 00:24:01,360 Speaker 1: also include some means of signing up or purchasing that 385 00:24:01,440 --> 00:24:04,560 Speaker 1: service or product. So if you're an advertiser, you want 386 00:24:04,640 --> 00:24:08,200 Speaker 1: a pretty decent click through rate because, one, it shows 387 00:24:08,240 --> 00:24:11,200 Speaker 1: your ad was effective, and two, it makes your client 388 00:24:11,240 --> 00:24:14,800 Speaker 1: happy because presumably they're going to get more sales. And 389 00:24:16,000 --> 00:24:22,600 Speaker 1: a click through rate like that is indescribably effective. It is, it's insane when 390 00:24:22,640 --> 00:24:26,240 Speaker 1: nearly half of all the people visiting a web page 391 00:24:26,280 --> 00:24:30,440 Speaker 1: are clicking on an ad that's on that page. That's incredible. 392 00:24:31,400 --> 00:24:35,399 Speaker 1: These days, the average click through on display ads is 393 00:24:35,520 --> 00:24:39,240 Speaker 1: somewhere in the neighborhood of point three five percent, 394 00:24:39,440 --> 00:24:43,840 Speaker 1: so less than half a percentage point.
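Since we're throwing percentages around, here's a minimal sketch of the click through math being described; the function name is my own invention for illustration, not an industry API:

```python
def click_through_rate(clicks, impressions):
    """Click through rate, as a percentage of ad views that became clicks."""
    return 100.0 * clicks / impressions

# Roughly the reported AT&T banner: nearly half of viewers clicked.
print(click_through_rate(44, 100))     # 44.0

# A typical modern display ad, per the figure above: about a third of a percent.
print(click_through_rate(35, 10_000))  # 0.35
```

Same formula in both cases; the gap between those two outputs is the whole story of ad blindness in one number.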
Banner ads, by 395 00:24:43,840 --> 00:24:46,480 Speaker 1: the way, get even less, sometimes as low as point 396 00:24:46,760 --> 00:24:52,679 Speaker 1: zero five percent click through. So yeah, that first banner ad was unthinkably successful. 397 00:24:52,680 --> 00:24:56,080 Speaker 1: It probably set a lot of very unrealistic expectations, if 398 00:24:56,080 --> 00:24:59,760 Speaker 1: I'm being honest. But it was also a brand new thing. 399 00:25:00,280 --> 00:25:03,760 Speaker 1: People had yet to develop ad blindness to banner ads, right? 400 00:25:03,840 --> 00:25:07,040 Speaker 1: They hadn't sort of trained themselves to just ignore 401 00:25:07,119 --> 00:25:09,080 Speaker 1: everything that's at the top or the right of the 402 00:25:09,080 --> 00:25:12,680 Speaker 1: page or sometimes the bottom. They were looking at everything. 403 00:25:13,320 --> 00:25:16,920 Speaker 1: And also ad blockers were not yet a thing, so 404 00:25:17,160 --> 00:25:21,600 Speaker 1: it wasn't like people were using means to prevent ads 405 00:25:21,640 --> 00:25:24,520 Speaker 1: from showing up in the first place. Now, in those 406 00:25:24,560 --> 00:25:28,720 Speaker 1: early days, the general thought was that a conversion, that is, 407 00:25:28,760 --> 00:25:32,840 Speaker 1: converting someone from viewing an ad to acting on it, 408 00:25:32,920 --> 00:25:35,840 Speaker 1: was pretty much limited to sales. So, in other words, 409 00:25:36,040 --> 00:25:39,359 Speaker 1: people thought of ads as only being successful if someone 410 00:25:39,600 --> 00:25:44,199 Speaker 1: was actively buying stuff through interacting with that ad. But 411 00:25:44,280 --> 00:25:47,280 Speaker 1: pretty early on that attitude began to change because there 412 00:25:47,280 --> 00:25:51,479 Speaker 1: were other aspects of conversion to consider. Clicking on the 413 00:25:51,520 --> 00:25:55,480 Speaker 1: ad showed interest, and interest alone could be a conversion.
414 00:25:55,560 --> 00:25:59,080 Speaker 1: It might mean that the company that was in charge 415 00:25:59,160 --> 00:26:02,439 Speaker 1: of, uh, whatever product or service the ad was 416 00:26:02,480 --> 00:26:05,480 Speaker 1: linked to might have to do some extra work, but 417 00:26:06,080 --> 00:26:10,480 Speaker 1: you have at least identified a prospect. Downloading a 418 00:26:10,520 --> 00:26:14,800 Speaker 1: coupon was another type of conversion, and companies began to 419 00:26:15,119 --> 00:26:18,879 Speaker 1: have a way to track data trends through online interactions, 420 00:26:18,920 --> 00:26:21,840 Speaker 1: at least as to how well their ads were kind of doing. 421 00:26:22,200 --> 00:26:26,160 Speaker 1: At the same time, a developer named Lou Montulli created 422 00:26:26,240 --> 00:26:29,439 Speaker 1: a new type of computer file that would transform the 423 00:26:29,480 --> 00:26:33,879 Speaker 1: experience of using the World Wide Web. These files, called cookies, 424 00:26:34,119 --> 00:26:37,800 Speaker 1: are integral to how we experience the web. The cookies 425 00:26:37,920 --> 00:26:42,119 Speaker 1: keep track of which websites we visit, how frequently we 426 00:26:42,280 --> 00:26:45,240 Speaker 1: visit them, and what we do on those sites, including 427 00:26:45,240 --> 00:26:47,919 Speaker 1: stuff like if we make a purchase or if we 428 00:26:48,040 --> 00:26:50,720 Speaker 1: click through an ad. They allow us to do stuff 429 00:26:50,760 --> 00:26:53,640 Speaker 1: like log into a service online, like, you know, put 430 00:26:53,680 --> 00:26:56,160 Speaker 1: in your user name and password, and then you can 431 00:26:56,200 --> 00:26:58,800 Speaker 1: stay logged in even if you navigate away from the 432 00:26:58,840 --> 00:27:00,720 Speaker 1: site and you come back to it later, you're still 433 00:27:00,760 --> 00:27:03,480 Speaker 1: logged in. Well, that's thanks to cookies.
That way, when 434 00:27:03,480 --> 00:27:05,359 Speaker 1: we come back, we're already logged in. We don't have 435 00:27:05,400 --> 00:27:08,399 Speaker 1: to go through that process again, unless you're using something 436 00:27:08,400 --> 00:27:11,800 Speaker 1: that has really high security, in which case you would 437 00:27:11,800 --> 00:27:14,400 Speaker 1: have to do it because the security is the most 438 00:27:14,400 --> 00:27:18,919 Speaker 1: important part. Similarly, if you're using a site that allows 439 00:27:18,920 --> 00:27:22,119 Speaker 1: you to set certain preferences, the cookie file on your 440 00:27:22,160 --> 00:27:25,320 Speaker 1: computer tells that site what those preferences are, so that 441 00:27:25,359 --> 00:27:28,280 Speaker 1: when you return, those preferences are already in place for 442 00:27:28,320 --> 00:27:31,760 Speaker 1: you automatically. It creates a level of convenience that makes 443 00:27:31,760 --> 00:27:34,800 Speaker 1: the web more usable. Now, when you visit a site, 444 00:27:34,880 --> 00:27:38,359 Speaker 1: the site can request the information stored in that cookie file. 445 00:27:38,400 --> 00:27:41,640 Speaker 1: It has to, because if the site needs to adjust 446 00:27:41,720 --> 00:27:44,440 Speaker 1: things for you, it has to know what to do, right? 447 00:27:44,960 --> 00:27:46,399 Speaker 1: And this is where we start to get to a 448 00:27:46,400 --> 00:27:51,600 Speaker 1: point where personal information becomes a hot commodity online. The cookie, 449 00:27:51,800 --> 00:27:54,960 Speaker 1: which had clear benefits for users, would have even more 450 00:27:55,040 --> 00:27:59,760 Speaker 1: significant benefits for businesses online. So let's switch back to 451 00:27:59,800 --> 00:28:02,960 Speaker 1: the ads side of the story.
At first, companies would 452 00:28:02,960 --> 00:28:05,720 Speaker 1: follow AT&T's lead and purchase banner 453 00:28:05,800 --> 00:28:09,399 Speaker 1: space or right rail space on a website for a 454 00:28:09,480 --> 00:28:11,560 Speaker 1: set amount of money for a set amount of time. 455 00:28:12,040 --> 00:28:14,520 Speaker 1: But it didn't take long for that to change. In 456 00:28:16,160 --> 00:28:20,560 Speaker 1: the mid nineteen nineties, Netscape and Infoseek migrated to a new pricing model. 457 00:28:21,200 --> 00:28:25,400 Speaker 1: Instead of creating a one size fits a few approach 458 00:28:25,560 --> 00:28:28,840 Speaker 1: to selling web page landscape, which meant companies would have 459 00:28:28,880 --> 00:28:31,760 Speaker 1: to pay the same amount whether their ads were working 460 00:28:31,760 --> 00:28:34,560 Speaker 1: on that page or not, and for, you know, whatever 461 00:28:34,640 --> 00:28:37,280 Speaker 1: duration they had picked, the new approach was the good 462 00:28:37,320 --> 00:28:43,280 Speaker 1: old CPM. CPM stands for cost per mille, with mille 463 00:28:43,400 --> 00:28:46,719 Speaker 1: being the Latin word for thousand. So it sounds at 464 00:28:46,760 --> 00:28:48,640 Speaker 1: first like you're saying cost per million, but you're really 465 00:28:48,640 --> 00:28:51,760 Speaker 1: saying cost per thousand. So the thousand in this case 466 00:28:52,080 --> 00:28:56,360 Speaker 1: tends to mean impressions. That is the number of times 467 00:28:56,400 --> 00:29:01,320 Speaker 1: the ad is presumably seen by visitors to that particular website.
468 00:29:01,840 --> 00:29:04,360 Speaker 1: So what this really breaks down to is that web 469 00:29:04,400 --> 00:29:07,680 Speaker 1: pages that are really popular and get lots of 470 00:29:07,760 --> 00:29:12,600 Speaker 1: visitors can demand a higher CPM because more folks go there, 471 00:29:12,960 --> 00:29:15,080 Speaker 1: so ads that are placed there are going to have 472 00:29:15,160 --> 00:29:19,160 Speaker 1: higher visibility than an ad placed on, you know, old 473 00:29:19,240 --> 00:29:23,360 Speaker 1: Joe Bob's Duct Tape Museum website. A site that gets 474 00:29:23,400 --> 00:29:26,400 Speaker 1: a lot of traffic can demand more money per thousand impressions, 475 00:29:26,680 --> 00:29:29,200 Speaker 1: kind of like how in the United States, if you 476 00:29:29,280 --> 00:29:31,440 Speaker 1: want to have a commercial played during the Super Bowl, 477 00:29:31,480 --> 00:29:35,080 Speaker 1: it's gonna cost you millions of dollars compared to, you know, 478 00:29:35,200 --> 00:29:38,680 Speaker 1: putting an ad on some niche television channel for late 479 00:29:38,760 --> 00:29:43,080 Speaker 1: night TV. On the flip side, advertisers can 480 00:29:43,080 --> 00:29:47,120 Speaker 1: negotiate a rate with websites for a certain number of impressions, 481 00:29:47,160 --> 00:29:50,240 Speaker 1: which effectively replaced the duration limit for an ad.
So 482 00:29:50,280 --> 00:29:53,240 Speaker 1: instead of saying I want an ad running on this 483 00:29:53,320 --> 00:29:57,200 Speaker 1: page for three months, you would say, all right, the 484 00:29:57,280 --> 00:30:01,040 Speaker 1: web page has a CPM rate of ten dollars, meaning 485 00:30:01,080 --> 00:30:03,200 Speaker 1: that you have to pay ten dollars for every one 486 00:30:03,320 --> 00:30:08,600 Speaker 1: thousand views, and you might negotiate for a million impressions, 487 00:30:09,000 --> 00:30:11,920 Speaker 1: which would mean that the advertiser would have to pay 488 00:30:11,920 --> 00:30:15,680 Speaker 1: the website ten thousand dollars to carry that ad in 489 00:30:15,760 --> 00:30:18,520 Speaker 1: order to generate a million impressions. And if the page 490 00:30:18,560 --> 00:30:20,720 Speaker 1: got a million impressions in a couple of days, boom, 491 00:30:20,800 --> 00:30:24,400 Speaker 1: that ad campaign was done. If it took weeks, well, 492 00:30:24,480 --> 00:30:28,560 Speaker 1: then the campaign would last longer. And obviously that CPM 493 00:30:28,720 --> 00:30:32,120 Speaker 1: was different depending on the popularity of the website. A website 494 00:30:32,120 --> 00:30:34,120 Speaker 1: that's not that popular would have a low CPM, and 495 00:30:34,120 --> 00:30:37,719 Speaker 1: it would also take a longer time to reach whatever 496 00:30:37,800 --> 00:30:41,080 Speaker 1: the agreed upon number of impressions was. By the way, 497 00:30:41,120 --> 00:30:44,400 Speaker 1: this is also why there are a lot of websites 498 00:30:44,440 --> 00:30:47,400 Speaker 1: that have things like quizzes and galleries where you have 499 00:30:47,520 --> 00:30:52,240 Speaker 1: to scroll through each item.
Like any website where it's 500 00:30:52,520 --> 00:30:57,720 Speaker 1: top ten movies that feature ghosts that have hair in 501 00:30:57,760 --> 00:31:00,720 Speaker 1: front of their face, and every single entry is its 502 00:31:00,760 --> 00:31:04,360 Speaker 1: own web page. That's because every one of those web 503 00:31:04,400 --> 00:31:08,360 Speaker 1: pages you go to, that's another impression. So it's a 504 00:31:08,400 --> 00:31:11,520 Speaker 1: way of expanding the number of impressions an ad would 505 00:31:11,600 --> 00:31:15,280 Speaker 1: get by making you have to reload the page over 506 00:31:15,320 --> 00:31:17,920 Speaker 1: and over and over again. If all ten of those 507 00:31:17,920 --> 00:31:21,040 Speaker 1: things were on one page, you would get one impression 508 00:31:21,160 --> 00:31:23,720 Speaker 1: for that article. These things, by the way, also have 509 00:31:23,880 --> 00:31:26,880 Speaker 1: massive drop off rates. Like, if they're not really compelling, 510 00:31:26,920 --> 00:31:30,479 Speaker 1: people will bail on them within like two or three entries, 511 00:31:31,480 --> 00:31:35,320 Speaker 1: so there's a diminishing returns thing going on with them. 512 00:31:35,400 --> 00:31:39,040 Speaker 1: That's just some insight into how web pages generate revenue. 513 00:31:39,320 --> 00:31:42,600 Speaker 1: I've been on that side, and it is not always 514 00:31:42,640 --> 00:31:47,160 Speaker 1: fun. Anyway, a big part of all this is that 515 00:31:47,400 --> 00:31:51,880 Speaker 1: the ad campaign wasn't tied to other performance metrics, right? 516 00:31:51,920 --> 00:31:54,240 Speaker 1: So it wasn't whether or not people clicked through the 517 00:31:54,280 --> 00:31:57,680 Speaker 1: ad or whether they actually made a purchase through the ad. 518 00:31:57,720 --> 00:32:01,000 Speaker 1: This was just about how many people actually were exposed 519 00:32:01,120 --> 00:32:03,800 Speaker 1: to the ad itself.
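To pin down the CPM arithmetic from a moment ago, here's a minimal sketch; the function name is my own, not an industry API:

```python
def cpm_campaign_cost(cpm_dollars, impressions):
    """Total spend under CPM pricing: the rate buys one thousand impressions."""
    return cpm_dollars * impressions / 1000

# The example from above: a ten dollar CPM, bought for one million impressions.
print(cpm_campaign_cost(10, 1_000_000))  # 10000.0, i.e. ten thousand dollars
```

Notice there's no time variable anywhere in that formula; however long it took the page to serve that millionth impression, that's simply how long the campaign ran.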
The ad had to do the 520 00:32:03,840 --> 00:32:07,680 Speaker 1: rest of the work. Then you had the emergence of 521 00:32:08,000 --> 00:32:11,640 Speaker 1: DoubleClick, a company which is now owned by Google. 522 00:32:11,800 --> 00:32:15,440 Speaker 1: It was actually the focus of a big antitrust lawsuit 523 00:32:15,560 --> 00:32:18,040 Speaker 1: that I'm not going to get into, but it was 524 00:32:18,080 --> 00:32:21,080 Speaker 1: a big deal. Anyway, DoubleClick let companies know which of 525 00:32:21,120 --> 00:32:24,360 Speaker 1: their ads were most effective. So DoubleClick would give 526 00:32:24,440 --> 00:32:27,720 Speaker 1: feedback to companies about which ads got the most impressions 527 00:32:27,760 --> 00:32:30,320 Speaker 1: and click throughs, and those companies could then get a 528 00:32:30,360 --> 00:32:33,000 Speaker 1: better idea of where they needed to spend their digital 529 00:32:33,040 --> 00:32:37,720 Speaker 1: marketing dollars. DoubleClick's relationship with advertisers meant that there 530 00:32:37,760 --> 00:32:40,200 Speaker 1: was a lot of rapid innovation in the web marketing 531 00:32:40,200 --> 00:32:44,200 Speaker 1: space as companies began to hone in on what strategies 532 00:32:44,240 --> 00:32:49,000 Speaker 1: worked and which ones didn't work. From there, you add 533 00:32:49,040 --> 00:32:52,520 Speaker 1: all sorts of changes in the space. Some ads moved 534 00:32:52,560 --> 00:32:55,640 Speaker 1: to pay per click models, so that was kind of 535 00:32:55,640 --> 00:32:59,080 Speaker 1: going back to that conversion approach, where in this model 536 00:32:59,160 --> 00:33:02,520 Speaker 1: an advertiser would only pay a website for the number 537 00:33:02,560 --> 00:33:06,280 Speaker 1: of clicks that an ad received.
That created an incentive 538 00:33:06,480 --> 00:33:09,360 Speaker 1: for the website operators to try and make sure they 539 00:33:09,360 --> 00:33:12,880 Speaker 1: were pairing ads with pages that would potentially drive the 540 00:33:12,960 --> 00:33:16,160 Speaker 1: most traffic. So if you had a web page about, 541 00:33:16,560 --> 00:33:20,760 Speaker 1: I don't know, chainsaws, you probably wouldn't try and pair 542 00:33:20,920 --> 00:33:24,560 Speaker 1: that page with an ad for perfume. Now I'm not 543 00:33:24,640 --> 00:33:28,360 Speaker 1: saying there's no crossover between people who are interested in 544 00:33:28,440 --> 00:33:31,880 Speaker 1: chainsaws and those who are interested in perfume, but from 545 00:33:31,880 --> 00:33:35,120 Speaker 1: a marketing standpoint, you're probably not going to get the 546 00:33:35,280 --> 00:33:40,440 Speaker 1: value that is the best use of your resources. And 547 00:33:40,520 --> 00:33:43,680 Speaker 1: another big change around this time was to how people 548 00:33:43,720 --> 00:33:46,640 Speaker 1: were accessing information on the web in general. While it 549 00:33:46,720 --> 00:33:50,400 Speaker 1: might be theoretically possible to navigate the web simply by 550 00:33:50,440 --> 00:33:53,680 Speaker 1: following various hypertext links to get from point A to 551 00:33:53,760 --> 00:33:58,160 Speaker 1: point B, to actually do it would be monumentally inefficient. 552 00:33:58,240 --> 00:34:00,680 Speaker 1: I mean, you might have to pass through hundreds or 553 00:34:00,840 --> 00:34:03,720 Speaker 1: thousands of sites to get from A to B.
That is, if 554 00:34:03,760 --> 00:34:07,400 Speaker 1: you're limiting yourself to just following links that are on 555 00:34:07,520 --> 00:34:09,719 Speaker 1: the page of A and trying to get to B 556 00:34:09,840 --> 00:34:12,120 Speaker 1: that way. This can be a fun game, by the way, 557 00:34:12,120 --> 00:34:14,520 Speaker 1: a kind of six degrees of separation kind of game, 558 00:34:15,040 --> 00:34:17,160 Speaker 1: but it's not an efficient way to navigate the web. 559 00:34:17,480 --> 00:34:20,839 Speaker 1: So Internet search engines were way more useful. You would 560 00:34:20,880 --> 00:34:23,760 Speaker 1: put in your query in the search bar and boom, 561 00:34:24,040 --> 00:34:26,880 Speaker 1: you got results that, in theory at least, best 562 00:34:26,920 --> 00:34:30,520 Speaker 1: matched whatever it was you were looking for. But this meant 563 00:34:31,000 --> 00:34:36,000 Speaker 1: that search engines were incredible aggregators for user behavior. Not 564 00:34:36,200 --> 00:34:39,919 Speaker 1: only could search engines keep a tally on which search terms 565 00:34:39,920 --> 00:34:42,799 Speaker 1: were the most popular at any given time, they could 566 00:34:42,840 --> 00:34:46,839 Speaker 1: also consult those cookie files on users' computers and see 567 00:34:46,880 --> 00:34:50,000 Speaker 1: what other sites users had been visiting, and search engines 568 00:34:50,040 --> 00:34:53,440 Speaker 1: could sell ads too. In fact, search engines made a 569 00:34:53,480 --> 00:34:57,719 Speaker 1: subtle shift from being all about indexing the web and 570 00:34:57,800 --> 00:35:01,320 Speaker 1: became more about monetizing search and getting into the data 571 00:35:01,360 --> 00:35:05,799 Speaker 1: collection and advertising businesses. That's what happened to Google. I 572 00:35:05,800 --> 00:35:08,560 Speaker 1: mean, years ago you might call, you know, Google 573 00:35:08,560 --> 00:35:11,840 Speaker 1: a search company.
Uh, this was before the days when 574 00:35:11,920 --> 00:35:15,279 Speaker 1: tons of other Google products emerged, like Google Cloud and 575 00:35:15,280 --> 00:35:18,560 Speaker 1: Android and all that stuff. But at its heart, what 576 00:35:18,680 --> 00:35:21,640 Speaker 1: Google really was wasn't a search engine. It was an 577 00:35:21,680 --> 00:35:25,400 Speaker 1: advertising company. Companies could pay Google to have their sites 578 00:35:25,440 --> 00:35:28,799 Speaker 1: pop up in searches as, you know, results in a 579 00:35:28,840 --> 00:35:31,840 Speaker 1: search page. They would be ads, they would be clearly 580 00:35:31,880 --> 00:35:35,319 Speaker 1: marked as ads, but they would be advertising spots. And 581 00:35:35,360 --> 00:35:38,680 Speaker 1: it gave those companies way more visibility than they might 582 00:35:38,719 --> 00:35:43,280 Speaker 1: otherwise have. And Google could target specific users with ads 583 00:35:43,320 --> 00:35:47,520 Speaker 1: that might appeal to them through a collection and analysis 584 00:35:47,560 --> 00:35:50,600 Speaker 1: of all that personal data, both from the user's cookies 585 00:35:50,640 --> 00:35:53,880 Speaker 1: and their own search history. And if that person 586 00:35:53,920 --> 00:35:56,440 Speaker 1: happens to be using Chrome or an Android device, well, 587 00:35:56,440 --> 00:36:00,600 Speaker 1: that's another kettle of fish entirely. So things took a 588 00:36:00,840 --> 00:36:06,000 Speaker 1: massive downturn shortly after search engines began to dominate the 589 00:36:06,120 --> 00:36:09,160 Speaker 1: data collection space, and that's because of the dot com 590 00:36:09,200 --> 00:36:12,600 Speaker 1: bubble bursting in two thousand, two thousand one, and the 591 00:36:12,640 --> 00:36:16,520 Speaker 1: world slid into an economic recession in general that particularly 592 00:36:16,600 --> 00:36:20,080 Speaker 1: hit the online world hard.
That was further exacerbated by 593 00:36:20,080 --> 00:36:22,880 Speaker 1: the terrorist attacks on the United States on September eleven, 594 00:36:23,000 --> 00:36:27,359 Speaker 1: two thousand one. But the industry did continue. There were 595 00:36:27,400 --> 00:36:30,160 Speaker 1: just fewer players. Some of the bigger companies were able 596 00:36:30,200 --> 00:36:32,880 Speaker 1: to survive all that turmoil, and the fact that the 597 00:36:33,040 --> 00:36:37,680 Speaker 1: smaller competitors were largely wiped out meant that these companies, 598 00:36:37,840 --> 00:36:42,040 Speaker 1: companies like Google and Amazon, now had an even 599 00:36:42,040 --> 00:36:45,520 Speaker 1: more dominant position in the online world, so it would 600 00:36:45,560 --> 00:36:48,560 Speaker 1: be very hard to overtake those companies, and both Google 601 00:36:48,600 --> 00:36:52,279 Speaker 1: and Amazon are famous for jealously guarding their dominant positions. 602 00:36:53,280 --> 00:36:55,239 Speaker 1: Move forward a few more years and you get the 603 00:36:55,320 --> 00:36:58,520 Speaker 1: rise of social network platforms. MySpace was one of 604 00:36:58,560 --> 00:37:02,000 Speaker 1: the first really big social networks. It launched in two 605 00:37:02,040 --> 00:37:05,359 Speaker 1: thousand three, but Facebook soon followed in two thousand four. 606 00:37:06,280 --> 00:37:08,680 Speaker 1: MySpace would be the most popular social network until 607 00:37:08,680 --> 00:37:12,360 Speaker 1: about two thousand eight. That's when Facebook overtook it. But 608 00:37:12,440 --> 00:37:17,160 Speaker 1: Facebook also introduced Facebook Ads in two thousand seven. Facebook 609 00:37:17,200 --> 00:37:21,280 Speaker 1: offered even more data points about people than search engines could.
610 00:37:21,800 --> 00:37:25,760 Speaker 1: I mean, people use social networks to connect with friends 611 00:37:25,920 --> 00:37:29,120 Speaker 1: and to share stuff about themselves and the people they like. 612 00:37:29,480 --> 00:37:33,239 Speaker 1: So people were willingly giving up tons of information that 613 00:37:33,360 --> 00:37:37,840 Speaker 1: an advertiser might find extremely relevant. And so Facebook's business 614 00:37:37,880 --> 00:37:41,960 Speaker 1: model largely fell into the realm of collecting and analyzing 615 00:37:42,000 --> 00:37:45,760 Speaker 1: information about users, both from their activities on the platform 616 00:37:45,800 --> 00:37:50,879 Speaker 1: itself and through the consultation of cookies, and Facebook could 617 00:37:51,000 --> 00:37:54,799 Speaker 1: use that information to place highly relevant ads on specific 618 00:37:54,920 --> 00:38:02,880 Speaker 1: user pages. This became the absolute foundation of Facebook's strategy 619 00:38:03,000 --> 00:38:06,560 Speaker 1: for the desktop experience and how they generate revenue, and 620 00:38:06,600 --> 00:38:11,239 Speaker 1: the value proposition Facebook has for advertisers is incredible. They 621 00:38:11,239 --> 00:38:15,200 Speaker 1: can say, Hey, practically everyone in the world is on 622 00:38:15,239 --> 00:38:18,239 Speaker 1: our platform, and we know what they all like and 623 00:38:18,280 --> 00:38:23,360 Speaker 1: what they dislike because of cookies and their activities on Facebook. Plus, 624 00:38:23,719 --> 00:38:28,160 Speaker 1: we've developed algorithms that keep people on Facebook longer. So 625 00:38:28,239 --> 00:38:31,279 Speaker 1: if you advertise on us, we can make sure that 626 00:38:31,360 --> 00:38:33,759 Speaker 1: the people who are most likely to respond to your 627 00:38:33,880 --> 00:38:37,480 Speaker 1: ad are the people who see that ad, and we 628 00:38:37,520 --> 00:38:40,520 Speaker 1: can make sure that they see your ad a lot.
629 00:38:41,719 --> 00:38:44,640 Speaker 1: This also began to introduce the era of retargeting. 630 00:38:45,000 --> 00:38:47,480 Speaker 1: So if you've ever had the experience of going to 631 00:38:47,800 --> 00:38:52,480 Speaker 1: different websites but seeing the same ads play on those websites, 632 00:38:53,000 --> 00:38:57,080 Speaker 1: you've been experiencing retargeting. Maybe you were doing some 633 00:38:57,200 --> 00:39:00,479 Speaker 1: comparison shopping for, I don't know, a brand 634 00:39:00,480 --> 00:39:04,280 Speaker 1: new toaster, but for whatever reason, you didn't pull the trigger. 635 00:39:04,520 --> 00:39:06,880 Speaker 1: Maybe you went as far as looking at a specific 636 00:39:06,920 --> 00:39:10,839 Speaker 1: toaster's page on a shopping site like Amazon, but you 637 00:39:10,920 --> 00:39:15,120 Speaker 1: didn't move forward with the purchase. But now suddenly everywhere 638 00:39:15,160 --> 00:39:18,680 Speaker 1: you go you seem to be seeing ads for that toaster, 639 00:39:19,360 --> 00:39:23,880 Speaker 1: or maybe it's a different but similar toaster. This is retargeting. 640 00:39:24,280 --> 00:39:26,600 Speaker 1: The cookies on your computer have a record of you 641 00:39:26,800 --> 00:39:29,799 Speaker 1: visiting that toaster page, and they also include the fact 642 00:39:29,880 --> 00:39:32,440 Speaker 1: that you didn't actually purchase the toaster when you went 643 00:39:32,520 --> 00:39:35,960 Speaker 1: to that page.
Those cookies allow various websites that have 644 00:39:36,120 --> 00:39:39,680 Speaker 1: deals with ad companies that are working with this toaster 645 00:39:39,760 --> 00:39:44,640 Speaker 1: manufacturer to dynamically insert ads for that toaster on all 646 00:39:44,680 --> 00:39:47,279 Speaker 1: the various sites that you go to with the goal 647 00:39:47,360 --> 00:39:49,800 Speaker 1: of convincing you to finally follow through on your purchase 648 00:39:49,840 --> 00:39:53,840 Speaker 1: and buy that dang toaster. All right, we are still 649 00:39:53,880 --> 00:39:58,080 Speaker 1: only just approaching the surface level of water from the 650 00:39:58,120 --> 00:40:02,080 Speaker 1: tip of the iceberg. The really valuable stuff is below 651 00:40:02,160 --> 00:40:05,160 Speaker 1: the surface, and we have Apple to thank for it. 652 00:40:05,760 --> 00:40:18,240 Speaker 1: I'll explain more when we get back. In two thousand seven, 653 00:40:18,320 --> 00:40:22,320 Speaker 1: Apple launched the iPhone. While other smartphones predated the iPhone, 654 00:40:22,360 --> 00:40:24,240 Speaker 1: it's safe to say that the iPhone was the first 655 00:40:24,360 --> 00:40:29,760 Speaker 1: truly successful consumer smartphone that appealed to the mainstream market. Previously, 656 00:40:29,800 --> 00:40:34,680 Speaker 1: smartphones pretty much just targeted geeks and executives. Now everybody 657 00:40:34,719 --> 00:40:38,680 Speaker 1: wanted one. The smartphone would open up brand new opportunities 658 00:40:38,800 --> 00:40:42,200 Speaker 1: for data collection. In two thousand eight, Apple launched the 659 00:40:42,239 --> 00:40:45,880 Speaker 1: first iPhone to include a GPS chip. This allowed for 660 00:40:45,960 --> 00:40:49,160 Speaker 1: really useful features for the users, such as real time 661 00:40:49,200 --> 00:40:51,239 Speaker 1: maps that can give you an accurate view of your 662 00:40:51,280 --> 00:40:55,680 Speaker 1: current location.
That was huge, right? Enormous benefit. But it 663 00:40:55,760 --> 00:40:58,920 Speaker 1: also meant that along with all the data that companies 664 00:40:58,960 --> 00:41:02,080 Speaker 1: could access thanks to cookies and social media posts and 665 00:41:02,120 --> 00:41:06,680 Speaker 1: search engine activity, they can now also add location data 666 00:41:06,880 --> 00:41:09,680 Speaker 1: to the mix as well. Now companies can know what 667 00:41:09,800 --> 00:41:12,319 Speaker 1: you were doing online and where you were in the 668 00:41:12,360 --> 00:41:15,640 Speaker 1: actual world. Now, to be clear, companies could do that 669 00:41:15,680 --> 00:41:18,360 Speaker 1: for folks who were accessing the Internet on desktops or 670 00:41:18,440 --> 00:41:21,239 Speaker 1: laptops as well. It's just that we don't tend to 671 00:41:21,360 --> 00:41:24,279 Speaker 1: carry desktops around with us at all, and for those 672 00:41:24,280 --> 00:41:26,799 Speaker 1: of us who do use laptops, we don't have them 673 00:41:26,880 --> 00:41:31,240 Speaker 1: on and active all the time. But a smartphone, that's different. 674 00:41:31,520 --> 00:41:33,919 Speaker 1: That's a device that can ping back to home base 675 00:41:34,239 --> 00:41:36,719 Speaker 1: several times a day, sometimes more than a hundred times 676 00:41:36,719 --> 00:41:39,520 Speaker 1: a day, and that ping can include stuff like how 677 00:41:39,600 --> 00:41:42,319 Speaker 1: much screen time you've spent on the device that day, 678 00:41:42,640 --> 00:41:46,240 Speaker 1: what apps you've been using, what sites and services you've accessed, 679 00:41:46,600 --> 00:41:50,440 Speaker 1: and where in the world you happen to be. And again, 680 00:41:50,840 --> 00:41:53,919 Speaker 1: this allows companies to target more specific ads your way, 681 00:41:54,239 --> 00:41:56,839 Speaker 1: and now those ads could be location based as well 682 00:41:56,880 --> 00:42:00,200 Speaker 1: as activity based.
So maybe you're wandering around a new 683 00:42:00,239 --> 00:42:03,279 Speaker 1: city and you start seeing ads for specific locations like 684 00:42:03,440 --> 00:42:07,560 Speaker 1: restaurants or shops or amusement parks or whatever. On the 685 00:42:07,600 --> 00:42:09,879 Speaker 1: one hand, that could be really useful as you try 686 00:42:09,960 --> 00:42:12,440 Speaker 1: to find things that you might want to experience in 687 00:42:12,480 --> 00:42:15,239 Speaker 1: a new place. But on the other, it indicated that 688 00:42:15,320 --> 00:42:18,880 Speaker 1: the smartphone was really gathering a ton of information about you. 689 00:42:19,719 --> 00:42:22,120 Speaker 1: And then there's that scenario I mentioned at the top 690 00:42:22,160 --> 00:42:24,640 Speaker 1: of this episode. If you hang out with other folks 691 00:42:24,640 --> 00:42:28,840 Speaker 1: and everyone happens to have a smartphone, everyone is generating data, 692 00:42:28,920 --> 00:42:31,759 Speaker 1: whether they're actively using their phones or not. And part 693 00:42:31,760 --> 00:42:34,919 Speaker 1: of that data isn't just what they're doing or where 694 00:42:34,960 --> 00:42:37,960 Speaker 1: they are, but also who they're with. And now our 695 00:42:38,000 --> 00:42:41,560 Speaker 1: relationships with one another, the time we spend with each other, 696 00:42:41,600 --> 00:42:45,000 Speaker 1: and the places where we spend it, that all becomes 697 00:42:45,120 --> 00:42:48,720 Speaker 1: part of the data grab as well. It represents another 698 00:42:48,760 --> 00:42:52,560 Speaker 1: way that companies can leverage and exploit the information we generate. 699 00:42:53,280 --> 00:42:56,160 Speaker 1: One of the industries that grew out of all of 700 00:42:56,200 --> 00:43:00,680 Speaker 1: this was the data brokerage industry. These are companies that 701 00:43:00,800 --> 00:43:06,319 Speaker 1: collect and maintain massive data repositories about, well, about us.
702 00:43:06,920 --> 00:43:09,960 Speaker 1: These companies buy and sell personal information as if it 703 00:43:10,000 --> 00:43:13,320 Speaker 1: were any other commodity, because for a lot of entities 704 00:43:13,320 --> 00:43:17,080 Speaker 1: out there, that's exactly what our personal information is. So 705 00:43:17,239 --> 00:43:19,520 Speaker 1: even companies that might not have the means to 706 00:43:19,560 --> 00:43:23,960 Speaker 1: collect your personal data directly through their own services can 707 00:43:24,040 --> 00:43:27,760 Speaker 1: pay to get hold of those sweet ones and zeros 708 00:43:27,800 --> 00:43:31,280 Speaker 1: through using data brokers. The U.S. state of Vermont 709 00:43:31,440 --> 00:43:34,040 Speaker 1: passed a law a couple of years ago that mandated 710 00:43:34,080 --> 00:43:37,440 Speaker 1: that any company that bought or sold third party personal 711 00:43:37,520 --> 00:43:41,439 Speaker 1: data had to register with the Secretary of State. As 712 00:43:41,440 --> 00:43:45,520 Speaker 1: a result, one hundred twenty one companies registered with the 713 00:43:45,640 --> 00:43:49,400 Speaker 1: Vermont state government. The companies didn't have to provide information 714 00:43:49,480 --> 00:43:52,560 Speaker 1: about who was in their databases. They didn't have to 715 00:43:52,600 --> 00:43:56,040 Speaker 1: say how many people were in them.
The rules didn't 716 00:43:56,040 --> 00:44:00,680 Speaker 1: require that these companies make available any information about consumers 717 00:44:00,719 --> 00:44:03,960 Speaker 1: to those consumers, which means if you wanted to check 718 00:44:04,000 --> 00:44:08,200 Speaker 1: to see how much dirt any of those companies might 719 00:44:08,280 --> 00:44:11,840 Speaker 1: have on you, and you, you know, lived in Vermont, 720 00:44:12,080 --> 00:44:14,279 Speaker 1: well you would kind of still be out of luck 721 00:44:14,320 --> 00:44:16,839 Speaker 1: because the law didn't go that far. The law did 722 00:44:16,880 --> 00:44:19,720 Speaker 1: require that the companies had to inform the government about 723 00:44:19,760 --> 00:44:23,280 Speaker 1: any kind of opt out features that the companies provided, 724 00:44:23,640 --> 00:44:26,719 Speaker 1: but that assumed that they actually were providing an 725 00:44:26,719 --> 00:44:30,959 Speaker 1: opt out feature for the average consumer. What this means 726 00:44:31,040 --> 00:44:33,920 Speaker 1: is that there are more than one hundred companies trafficking in 727 00:44:34,000 --> 00:44:37,560 Speaker 1: personal data, and yours could be among them, and to 728 00:44:37,680 --> 00:44:41,319 Speaker 1: opt out of that system, you would have to contact 729 00:44:41,440 --> 00:44:44,200 Speaker 1: each and every one of these data brokers and go 730 00:44:44,280 --> 00:44:48,120 Speaker 1: through whatever process they might have in order to opt out, 731 00:44:48,880 --> 00:44:53,040 Speaker 1: because by default we are all opted into that system. 732 00:44:53,080 --> 00:44:55,759 Speaker 1: In some cases, we did have a choice, in the 733 00:44:55,800 --> 00:44:58,360 Speaker 1: sense that the choice was whether or not we wanted 734 00:44:58,360 --> 00:45:02,000 Speaker 1: to use a specific service or platform or visit a 735 00:45:02,040 --> 00:45:05,960 Speaker 1: specific website.
Again, this is frequently where those long user 736 00:45:06,000 --> 00:45:08,720 Speaker 1: agreements come in, the ones most of us skip right over, 737 00:45:08,840 --> 00:45:11,040 Speaker 1: so that we just click on that I agree button 738 00:45:11,080 --> 00:45:13,560 Speaker 1: and get on with it. We might not be aware 739 00:45:13,760 --> 00:45:15,560 Speaker 1: that we just gave consent to have all of our 740 00:45:15,640 --> 00:45:18,640 Speaker 1: data collected, but that's arguably on us if we didn't 741 00:45:18,680 --> 00:45:22,200 Speaker 1: bother to read the terms and conditions. But in other cases, 742 00:45:22,560 --> 00:45:25,040 Speaker 1: we might not ever have really had a chance to 743 00:45:25,120 --> 00:45:28,560 Speaker 1: opt out of a specific data broker's collections at all 744 00:45:29,080 --> 00:45:31,919 Speaker 1: because we never had direct contact with some of those 745 00:45:32,000 --> 00:45:36,280 Speaker 1: data brokers. Because again, information is bought, sold, and traded 746 00:45:36,360 --> 00:45:39,200 Speaker 1: like crazy. So some of these companies are just buying 747 00:45:39,280 --> 00:45:42,359 Speaker 1: up data that was collected by someone else and we 748 00:45:42,440 --> 00:45:46,520 Speaker 1: only ever had contact with that initial point. So as 749 00:45:46,560 --> 00:45:48,480 Speaker 1: an example, let's say that you sign up for a 750 00:45:48,560 --> 00:45:53,360 Speaker 1: social networking platform. We'll call it SpaceLook.
So you 751 00:45:53,440 --> 00:45:57,160 Speaker 1: sign up on SpaceLook, and there's this long, boring 752 00:45:57,280 --> 00:46:00,160 Speaker 1: passage of information that you've got to scroll through so 753 00:46:00,200 --> 00:46:02,400 Speaker 1: that you can click the I Agree button, so you 754 00:46:02,480 --> 00:46:04,880 Speaker 1: zoom past all the dull stuff so that you can 755 00:46:04,920 --> 00:46:08,280 Speaker 1: finally get to uploading photos of your adorable kitty cat online. 756 00:46:08,800 --> 00:46:12,520 Speaker 1: But in that dull passage, there are terms that explain 757 00:46:12,600 --> 00:46:16,120 Speaker 1: that SpaceLook will be collecting information about you and 758 00:46:16,160 --> 00:46:19,520 Speaker 1: then using the information to serve you ads. And in addition, 759 00:46:19,920 --> 00:46:23,640 Speaker 1: SpaceLook might also sell or share your personal information 760 00:46:23,920 --> 00:46:28,399 Speaker 1: with other third party entities. So you've effectively signed over 761 00:46:28,440 --> 00:46:31,440 Speaker 1: your personal data to SpaceLook for it to do 762 00:46:31,520 --> 00:46:34,080 Speaker 1: whatever it wants with it within the confines of the 763 00:46:34,120 --> 00:46:37,560 Speaker 1: agreement that you've clicked on, and your info might get 764 00:46:37,600 --> 00:46:40,920 Speaker 1: sent to various data brokers that otherwise you have never 765 00:46:40,960 --> 00:46:47,080 Speaker 1: heard of and never contacted. But wait, it gets worse. Recently, 766 00:46:47,520 --> 00:46:50,160 Speaker 1: Facebook and Google have been in the news for putting 767 00:46:50,200 --> 00:46:51,920 Speaker 1: up a bit of a fuss when it comes to 768 00:46:52,000 --> 00:46:55,080 Speaker 1: the types of data collection that are available to them.
769 00:46:55,440 --> 00:46:58,960 Speaker 1: Facebook got upset at Apple for a new change in 770 00:46:59,040 --> 00:47:03,040 Speaker 1: privacy settings on iOS devices, and it requires Facebook to 771 00:47:03,080 --> 00:47:06,880 Speaker 1: inform users about how it wants to collect data regarding 772 00:47:07,200 --> 00:47:10,719 Speaker 1: user activity on those iOS devices. So like iPhones and, 773 00:47:11,320 --> 00:47:15,160 Speaker 1: you know, iPads, things like that. That activity includes the 774 00:47:15,280 --> 00:47:18,480 Speaker 1: use of other apps on the device. So, in other words, 775 00:47:19,320 --> 00:47:23,360 Speaker 1: Facebook collects information on users' app activity even if that 776 00:47:23,440 --> 00:47:27,319 Speaker 1: activity isn't directly involving Facebook itself. So if you do 777 00:47:27,360 --> 00:47:30,640 Speaker 1: any banking on your phone, or shopping, or maybe you 778 00:47:30,680 --> 00:47:34,239 Speaker 1: play certain mobile games, well, Facebook wants to know about 779 00:47:34,280 --> 00:47:39,320 Speaker 1: all that because that data is valuable. Apple's change requires 780 00:47:39,320 --> 00:47:43,360 Speaker 1: Facebook to alert users that it wants to collect that data, 781 00:47:43,520 --> 00:47:46,920 Speaker 1: kind of like this: this app wants to know your location. 782 00:47:47,040 --> 00:47:49,800 Speaker 1: Is that okay? It's similar to that. And it gives 783 00:47:49,920 --> 00:47:52,960 Speaker 1: the chance for users to opt out of that process 784 00:47:53,200 --> 00:47:57,040 Speaker 1: right then and there in that alert. And Facebook hates 785 00:47:57,320 --> 00:48:00,319 Speaker 1: that, because if you give people the choice to 786 00:48:00,440 --> 00:48:05,000 Speaker 1: not be tracked, often they take that choice. They just, 787 00:48:05,280 --> 00:48:08,879 Speaker 1: you know, have to know about it first.
So if 788 00:48:08,920 --> 00:48:12,759 Speaker 1: you aren't allowed to bury that notice in page two 789 00:48:12,800 --> 00:48:15,920 Speaker 1: of settings or in a deep user agreement, well, then 790 00:48:15,920 --> 00:48:18,520 Speaker 1: a lot of folks might just take the option to 791 00:48:18,640 --> 00:48:22,640 Speaker 1: opt out, which means that's a hit on Facebook's revenue, 792 00:48:23,000 --> 00:48:26,120 Speaker 1: and Facebook, not a fan of that, has launched 793 00:48:26,160 --> 00:48:29,360 Speaker 1: a little bit of a campaign that essentially argues that 794 00:48:29,440 --> 00:48:32,560 Speaker 1: Apple's new rules are harmful to small businesses and that 795 00:48:32,600 --> 00:48:36,400 Speaker 1: people should feel badly about opting out, which is, um, 796 00:48:36,560 --> 00:48:40,319 Speaker 1: super disingenuous, if you ask me. Facebook, I don't think, 797 00:48:40,440 --> 00:48:44,080 Speaker 1: is at all concerned about small businesses except to the 798 00:48:44,160 --> 00:48:49,560 Speaker 1: extent at which those small businesses spend money on Facebook. Similarly, 799 00:48:50,080 --> 00:48:53,440 Speaker 1: Google was in the news when internal documents leaked showing 800 00:48:53,520 --> 00:48:57,759 Speaker 1: that the company had leaned hard on Android device manufacturers 801 00:48:58,440 --> 00:49:04,480 Speaker 1: to hide geo tracking opt out features deep in settings. 802 00:49:04,560 --> 00:49:07,160 Speaker 1: And so Google was essentially saying, hey, that's cool that 803 00:49:07,200 --> 00:49:09,239 Speaker 1: you want to make this Android smartphone, do us a 804 00:49:09,280 --> 00:49:12,480 Speaker 1: solid for the geo tracking stuff. Hide that like on 805 00:49:12,600 --> 00:49:15,120 Speaker 1: page two or three of your settings, so that nobody 806 00:49:15,160 --> 00:49:18,040 Speaker 1: ever bothers to go that far.
And the goal was 807 00:49:18,080 --> 00:49:20,120 Speaker 1: just to make it harder for people to find where 808 00:49:20,160 --> 00:49:22,800 Speaker 1: they could turn off geo tracking, so that the company 809 00:49:22,840 --> 00:49:26,120 Speaker 1: could continue to collect that sweet data without too many 810 00:49:26,160 --> 00:49:31,719 Speaker 1: people opting out. Dirty pool, Google. Just a few years ago, 811 00:49:31,880 --> 00:49:34,480 Speaker 1: all this data would have been alarming, but it also 812 00:49:34,800 --> 00:49:36,839 Speaker 1: would have been kind of burdened by the fact that 813 00:49:37,200 --> 00:49:41,000 Speaker 1: there's just so much information that it's hard to do anything 814 00:49:41,040 --> 00:49:44,240 Speaker 1: with it. So it's one thing to collect enormous amounts 815 00:49:44,239 --> 00:49:47,440 Speaker 1: of personal information, but it's another to actually find a 816 00:49:47,520 --> 00:49:50,359 Speaker 1: meaningful use for all that data. You get a lot 817 00:49:50,400 --> 00:49:53,360 Speaker 1: of noise along with a little bit of signal. But 818 00:49:53,440 --> 00:49:57,360 Speaker 1: over the past few years, data analysis has advanced dramatically 819 00:49:57,520 --> 00:50:01,400 Speaker 1: and it's become much more sophisticated. Big data, which was 820 00:50:01,400 --> 00:50:04,240 Speaker 1: a buzz term that came largely from the marketing world, 821 00:50:05,040 --> 00:50:07,840 Speaker 1: is a real thing now, and when paired with systems 822 00:50:07,840 --> 00:50:11,719 Speaker 1: that use strategies like machine learning, advertising companies are able 823 00:50:11,760 --> 00:50:15,120 Speaker 1: to get incredibly detailed looks at who each of us 824 00:50:15,239 --> 00:50:19,719 Speaker 1: happens to be.
Our data can be parsed and contextualized 825 00:50:20,080 --> 00:50:24,000 Speaker 1: in millions of ways that are incredibly valuable to countless 826 00:50:24,000 --> 00:50:28,480 Speaker 1: people and organizations. Some of those might be largely benign, 827 00:50:28,880 --> 00:50:32,560 Speaker 1: or at least no more malicious than your typical capitalistic endeavor, 828 00:50:32,960 --> 00:50:36,440 Speaker 1: but some might be way more malevolent. And of course 829 00:50:36,560 --> 00:50:40,160 Speaker 1: I haven't even touched on the burden of good stewardship 830 00:50:40,280 --> 00:50:44,040 Speaker 1: when it comes to protecting data. Many of those one hundred twenty one 831 00:50:44,160 --> 00:50:47,680 Speaker 1: data broker companies have been targets of hackers, and more 832 00:50:47,760 --> 00:50:50,360 Speaker 1: than a few have had some pretty nasty data breaches 833 00:50:50,520 --> 00:50:54,319 Speaker 1: over the years. So I guess the whole point of 834 00:50:54,320 --> 00:50:57,279 Speaker 1: this episode is to really explain how technology in the 835 00:50:57,320 --> 00:51:02,680 Speaker 1: digital age largely centers around the collection and exploitation of information, 836 00:51:03,040 --> 00:51:05,800 Speaker 1: and that a lot of that information comes from people 837 00:51:05,880 --> 00:51:08,919 Speaker 1: like us, and that if we feel strongly about that, 838 00:51:09,280 --> 00:51:13,560 Speaker 1: we have to take steps to address this issue. Unfortunately, 839 00:51:13,680 --> 00:51:18,439 Speaker 1: right now, those steps are often laborious and convoluted. It's 840 00:51:18,600 --> 00:51:24,240 Speaker 1: easy to get discouraged. It's easy to prioritize convenience over privacy. 841 00:51:24,360 --> 00:51:27,399 Speaker 1: It's easy to give in to the statement that Mark Zuckerberg 842 00:51:27,440 --> 00:51:31,560 Speaker 1: famously made in two thousand ten when he said privacy is dead.
843 00:51:32,239 --> 00:51:37,040 Speaker 1: But as we see implementations of systems that exploit our data, 844 00:51:37,160 --> 00:51:42,759 Speaker 1: and as these become undeniably more invasive, it might benefit 845 00:51:42,880 --> 00:51:45,799 Speaker 1: us to look at them more closely and act in 846 00:51:45,840 --> 00:51:49,560 Speaker 1: our own self interest, because I assure you most of 847 00:51:49,560 --> 00:51:52,080 Speaker 1: these companies are not going to do that for us. 848 00:51:52,960 --> 00:51:56,880 Speaker 1: I'm talking to you, Stephanie, yes, you. I've personalized this 849 00:51:56,920 --> 00:52:01,160 Speaker 1: episode for every listener, and you, Stephanie, need to 850 00:52:01,200 --> 00:52:05,400 Speaker 1: take action. I'm kidding. I didn't personalize this podcast, and 851 00:52:05,440 --> 00:52:08,840 Speaker 1: anyone not named Stephanie is probably just confused, and anyone 852 00:52:08,840 --> 00:52:11,759 Speaker 1: who is named Stephanie is probably flipping out. But no, 853 00:52:11,880 --> 00:52:13,799 Speaker 1: I just wrote that as a joke. But it is 854 00:52:13,840 --> 00:52:16,040 Speaker 1: the sort of thing that websites and apps and other 855 00:52:16,080 --> 00:52:21,000 Speaker 1: Internet related services can do for reals. In fact, technically, 856 00:52:21,360 --> 00:52:24,040 Speaker 1: with enough work, a podcast like this could probably do 857 00:52:24,120 --> 00:52:25,959 Speaker 1: it too. It would just require me to go through 858 00:52:26,760 --> 00:52:29,279 Speaker 1: a very long list of names and record them and 859 00:52:29,320 --> 00:52:33,480 Speaker 1: then have dynamic insertion of that statement in the podcast 860 00:52:33,840 --> 00:52:36,600 Speaker 1: targeted to specific listeners.
It would be a lot 861 00:52:36,600 --> 00:52:38,440 Speaker 1: of work, is what I'm saying, something that I am 862 00:52:38,480 --> 00:52:42,239 Speaker 1: not willing to do, but it could be done, and 863 00:52:42,320 --> 00:52:46,719 Speaker 1: maybe that's not always a good thing. All right. That 864 00:52:46,800 --> 00:52:50,400 Speaker 1: wraps up this very long soapbox edition of tech Stuff. 865 00:52:50,440 --> 00:52:53,920 Speaker 1: Hope you guys learned something in that and found some 866 00:52:54,040 --> 00:52:57,080 Speaker 1: value in that discussion. I will be doing some more 867 00:52:57,120 --> 00:53:01,560 Speaker 1: episodes about privacy related materials like COPPA. I do want 868 00:53:01,600 --> 00:53:05,040 Speaker 1: to talk about that and the intent of that legislation 869 00:53:05,080 --> 00:53:07,560 Speaker 1: as well as the actual impact of it, so look 870 00:53:07,600 --> 00:53:09,400 Speaker 1: forward to that in the future. But if you have 871 00:53:09,480 --> 00:53:11,799 Speaker 1: suggestions for things I should cover in future episodes of 872 00:53:11,800 --> 00:53:14,120 Speaker 1: tech Stuff, reach out to me. The best place to 873 00:53:14,160 --> 00:53:16,680 Speaker 1: do that is on Twitter. The handle for the show 874 00:53:16,840 --> 00:53:19,880 Speaker 1: is TechStuffHSW, and I'll talk to 875 00:53:19,880 --> 00:53:28,919 Speaker 1: you again really soon. Tech Stuff is an I Heart 876 00:53:29,000 --> 00:53:32,759 Speaker 1: Radio production. For more podcasts from I Heart Radio, visit 877 00:53:32,800 --> 00:53:35,840 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 878 00:53:35,960 --> 00:53:37,280 Speaker 1: listen to your favorite shows.