Speaker 1: Welcome to Tech Stuff, a production from I Heart Radio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with I Heart Radio and I love all things tech. And on Monday's episode, I looked at the topic of privacy, or privacy and the Internet, and how companies collect and use private information. Essentially, the point of that episode was to show how, in the era of big data, practically everything we do generates at least a data point that goes into a constantly more detailed depiction of who we are, digitally speaking, and that numerous companies are profiting off that information. And sometimes it means we as individuals actually see some sort of benefit from that, right? Like targeted, personalized experiences can be a benefit. But sometimes it really just means we're being exploited in some way.
Speaker 1: Now, you could say that for adults, this is just the price you pay to be a modern citizen of our digital world, particularly when it involves joining various online platforms and services that include statements explaining that the service is going to collect, share, and leverage data from users. And you could argue, hey, you agreed to do this when you signed up for an account, even if you didn't bother reading the fine print. However, it is a different matter entirely when it comes to kids. Kids don't have the awareness of what it means to share information about themselves. They don't understand the potential consequences. So it wouldn't be ethical to convince a kid to sign any sort of binding agreement without that kid's parent or guardian present and consenting to it. And so when it comes to the privacy of kids, things take a pretty serious turn. In the United States, that includes the passing of a law in nineteen ninety-eight called the Children's Online Privacy Protection Act, or COPPA, C-O-P-P-A.
Speaker 1: Today, I want to explore what prompted the drafting and passage of COPPA, some cases in which companies have been found guilty of failing to comply with the rules of COPPA, and how that has affected content on the Internet in general. Generally speaking, when it comes to technology and the law, there is a lot of opportunity for messiness. Technology advances at a far faster pace than what we see in the legal world. Heck, the folks who draft legislation tend to be, let's say, a bit on the older side, which can also mean that they can sometimes be a little bit out of touch when it comes to modern technology. And we see this all the time with politicians just struggling to understand the scope and the effect of technology while simultaneously trying to draft legislation that in some way intersects with technology. It doesn't always go well, and often this means there's a figurative disconnect between the law and the tech, and that can lead to unintended consequences. At the same time, we can see the clear need for legislation, at least in some cases.
Speaker 1: The fact that there aren't hard and fast legal protections for online privacy in general has led us to this condition in the United States in which our data becomes digital currency. Only it's currency that we, the generators, the people who are making that data, don't get to profit off of, at least not, you know, directly. Other people and companies are making billions of dollars off the data that we create, which doesn't seem terribly fair, does it, that we're not compensated for generating all this information? Our compensation tends to be the use of a service or platform. Anyway, my point is we can have situations in which we need legislation, and yet we can also have those situations where the laws we get may not measure up to what we actually need. So let's take a look at COPPA and figure out if it does what it's supposed to do.
Speaker 1: Now, of course, this means we also have to look at some history, and a predecessor to COPPA is an industry organization called the Children's Advertising Review Unit, or CARU, C-A-R-U, that formed in nineteen seventy-four out of the Better Business Bureau here in the United States. Now, traditionally, marketing targeted adults, right? It was adults who heard all the different ads. But by the nineteen seventies, television programs that were aimed specifically at kids were really becoming a regular thing. And if the audience for those shows was kids, well, that meant the ads being displayed during those shows were also being directed toward kids, and that meant marketers had to craft ads that would appeal to kids, potentially so that those kids would go and convince their parents to buy whatever stuff they were seeing on television, whether it was a toy or a cereal or whatever. But again, this brings up some pretty tough ethical questions. How do you market to kids who aren't old enough to make decisions for themselves? Rather than risk having the government step in and get involved, the television and advertising industries saw the wisdom in a self-regulating body, and CARU created some basic core principles which would also become important for COPPA. The guidelines that CARU set say stuff like: kids may not be aware that they're even being advertised to, and they might have limited experience with the persuasive nature of advertising. Therefore, advertisers need to show a special responsibility when they are marketing toward kids. And as someone who grew up in the seventies and eighties, I can tell you that it could get pretty tricky to separate the ads from the content back in those days, partly because a lot of the shows geared toward kids were really nothing more than, you know, grandiose commercials for lines of toys and stuff. I'm looking at you, Transformers and G.I. Joe and He-Man. The guidelines stress that advertisers need to be substantive in their claims when it comes to ads directed at kids.
Speaker 1: So, in other words, ads should not give kids the idea that they'd be able to do stuff with a product that just isn't possible, you know, like buying a He-Man plastic sword and thinking that if you just say "By the power of Grayskull, I have the power," you will somehow magically get a pageboy haircut and giant muscles. I can tell you from personal experience that's just not the case. Other guidelines say that advertisers should not advertise products or services that are not appropriate for kids to kids, so you shouldn't be getting, like, ads for car dealerships in the middle of your Saturday morning cartoon block or something like that; that ads should include a diverse representation within them, they shouldn't all be the same ethnicity; and that ads should reinforce positive social interactions, like being honest and such. In other words, the idea was, yeah, we're going to allow for advertising to kids, but let's do it in a way where at least it seems like it's wholesome.
Speaker 1: Now, I think it's fair to say that these guidelines, which would again go on to inform the rules for COPPA, weren't the product of a necessarily sincere concern for children so much as an effort to head the government off at the pass. And I say this as someone who has studied advertising and marketing a little bit, and I walked away with the distinct impression that ethics are, you know, mostly something that happens to other industries compared to marketing and advertising. Sometimes ethics were seen as something of a drawback in that field, particularly if you study the advertising of the fifties and sixties. The Stuff You Should Know guys have done episodes on advertising from that era, and those episodes are phenomenal and they really detail how sleazy that world could be. And while that kind of foolishness might be aimed at adults without much of a blink of an eye, the story is different when there are kids involved.
Speaker 1: That being said, CARU takes the issue of advertising to kids in a responsible way as a serious thing, because, again, if the organization fails in this regard, if the industry starts to fall short, the government is going to step in. There will be a strong push from the citizens for the government to step in and do something about it. And like many industries, the advertising world is not super keen on the idea of regulation. But let's move forward. So while you might look back on the children's programming and advertising from the eighties and into the nineties and say, huh, it seems like those guidelines were pretty loosey-goosey, and a lot of programming may not have followed them all that closely, the fact was that CARU was in charge of making sure things didn't go too far astray in the United States. And then, in the early nineties, the mainstream public became aware of this thing called the Internet.
Speaker 1: Now, those of you who have listened to this show for a long time, or those of you who have studied the Internet, know that the Internet and its predecessors like ARPANET have actually been around for a really long time. But outside of a relatively small population of researchers and students and government officials, hardly anyone knew anything about it. And that changed with the launch and evolution of the World Wide Web. The Web was a much more user-friendly and intuitive way to access the Internet. The adoption of the Web by the mainstream didn't happen overnight, but without the Web, I think it's pretty safe to say that the general awareness about the Internet and what you could do with it would have lagged behind by several years at the least. Anyway, once websites started to really be a thing, and once companies began to kind of cautiously dip their corporate toe in the Web and see how they might conduct commerce and, further, how they might advertise to people online, CARU really began to get a bit proactive in the online space.
Speaker 1: There was a legitimate concern that one of the things that makes the web great, that being that it's easy to access stuff if you just have a browser, an Internet connection, and some basic web skills, also makes the web a potential pitfall when it comes to how children access and process information. Moreover, as I mentioned in Monday's episode, it didn't take very long at all for companies to use the Internet to start gathering information about users in order to advertise to them more effectively, and it wasn't long at all before companies began to build out databases of user information. And while you might make the argument that a mature adult, or, you know, at least someone who's of an adult age, has the wherewithal to sign over their right to privacy with an understanding of what that actually means, the same can't be said of kids. Thus CARU focused on how various sites and services would gather and use information online when it came to the information of children. In the mid-nineteen-nineties, CARU published a new set of guidelines called Interactive Electronic Media.
Speaker 1: This was in an effort to get ahead of issues that were starting to pop up thanks to the web. The goal was to create new guidelines for the online space that would protect the privacy of those under the age of thirteen, which was kind of an arbitrary age. It reflected the ages that CARU was focused on as part of advertising in general. And it was easy to see the need for this approach, because even back then there were companies that were advertising to kids online, and they weren't necessarily following the CARU guidelines that had been in place for, like, television advertising. Plus, there was the issue of companies collecting information about these young consumers without parental consent. The canary in the coal mine for this issue turned out to be a website called KidsCom, which, as the name implies, was a child-focused site.
Speaker 1: KidsCom used tools like registration forms, pen pal programs, and contest entries to gather information about their users, that is, the kids who were visiting the site. And hey, this is a good time to remind you that if you happen to be entering a sweepstakes or contest, online or even offline, what you're really doing is handing over your information to some third party, and you can bet that information is going to be used in some way. It may be used directly by the entity you hand it to. So it may be that, for example, it's a magazine publisher and you filled out this information; well, now the magazine publisher is going to market other magazines directly to you. Or it might mean that your information gets put into a database that other companies can pay to access, or it could be a combination of the two. Anyway, KidsCom was doing this, but the big problem was that a lot of KidsCom users were, you know, kids. The FTC investigated KidsCom after receiving a complaint letter about the site from the Center for Media Education.
Speaker 1: The FTC found KidsCom in violation of several FTC rules with regard to the collection and sharing of data, and it turned out that KidsCom was in fact sharing a database of user information with third parties. On the bright side, the data was in aggregate, so it was not formatted to reveal personal information unique to individuals. It was all collected together so that there were no personally identifiable people in there. It could have been worse, is what I'm saying. KidsCom agreed to change its ways and to conform to the rules, and that site stuck around till two thousand nineteen. But the case of KidsCom laid bare the potential dangers of data collection when it comes to young kids. This matter was seen as a serious one, and it led the Federal Trade Commission, or FTC, to draft the Children's Online Privacy Protection Act, COPPA, and again, that was passed into law in nineteen ninety-eight. That law used CARU's Interactive Electronic Media guidelines as a foundation.
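To make that "in aggregate" distinction concrete, here's a minimal sketch; it's not from the actual KidsCom case, and all the field names are invented for illustration. The point is that an aggregate summary contains only counts, never the identifying fields:

```python
from collections import Counter

def aggregate(records):
    """Collapse individual records into counts per (age, activity) bucket,
    dropping names, emails, and anything else that identifies a child."""
    return Counter((r["age"], r["activity"]) for r in records)

# Hypothetical raw records, as a child-focused site might have collected them.
records = [
    {"name": "Alice", "email": "a@example.com", "age": 10, "activity": "games"},
    {"name": "Bob",   "email": "b@example.com", "age": 10, "activity": "games"},
    {"name": "Cara",  "email": "c@example.com", "age": 12, "activity": "pen pals"},
]

summary = aggregate(records)
# summary holds only counts like {(10, "games"): 2, (12, "pen pals"): 1};
# the identifying fields never leave this function.
```

Sharing `summary` reveals trends in the user base without exposing any one child, which is why the aggregate format made the KidsCom situation less bad than it could have been.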
Speaker 1: The law applies to website and online service operators that either directly target children for the purpose of collecting, using, or disclosing the personal information of those children, or that have actual knowledge that those sites and services are, in the process of doing whatever it is they are doing, also collecting, using, or disclosing children's personal information. So, in other words, whether a company is setting out explicitly to collect information about specific kids, or that just happens to be a byproduct of whatever the company is doing, the law applies to those types of entities. But as we'll see, this approach is not quite as black and white as it sounds. We're going to take a quick break and we'll be right back.

Speaker 1: Okay, so I want to clarify something I said before the break, which is how the law applies to sites and services online.
Speaker 1: Now, essentially, an Internet-based entity would need to follow COPPA if that entity's service targets those under thirteen and also collects personal information about those users, or allows a third party, like an advertiser, to collect information about those users, or if the entity runs some sort of ad network or uses plug-ins that also collect information. Take, for example, the Honey extension. That one collects personal information, and if Honey knows for a fact that among the users of its service there are children under the age of thirteen, it has to comply with COPPA. So, in other words, if you know for a fact that the information you are collecting includes information from kids, COPPA applies. Third, if your site or service aims for a general audience, but you happen to know that within that general audience there are people under the age of thirteen, and you are gathering information on your audience, COPPA applies. And that third case is kind of at the heart of what would cause issues for content creators on YouTube.
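The conditions just described can be sketched as a simple decision function. This is only an illustration of the logic as laid out in the episode, not legal guidance, and every parameter name here is my own invention:

```python
def coppa_applies(directed_at_children: bool,
                  collects_personal_info: bool,
                  third_party_collects: bool,
                  knows_users_include_under_13: bool) -> bool:
    """Rough sketch of whether COPPA obligations attach, following the
    three cases described above."""
    collecting = collects_personal_info or third_party_collects
    if not collecting:
        return False  # no personal information gathered, nothing to regulate
    if directed_at_children:
        return True   # case one: the service targets those under thirteen
    # cases two and three: any service with actual knowledge that its
    # users include children under thirteen
    return knows_users_include_under_13

# A general-audience site that gathers data and knows some users are kids:
coppa_applies(False, True, False, True)   # -> True
```

That last call is the gray-area YouTube situation: not child-directed on its face, but collecting data with knowledge that kids are in the audience.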
Speaker 1: There are channels that clearly target younger audiences, and there are some that, based on the nature of their content, clearly aren't meant for younger audiences. But there are tons of channels that hit this kind of gray area, in which a channel might seem, at a casual glance, to be aimed at a younger viewer group, but in fact contains content that is not appropriate for kids. And this gets into a bunch of tangential matters that open up so many cans of worms that I'm just going to touch on it here. We're not going to dive into it; that would be an entirely separate episode. So what I mean by all this is that there are some forms of media that people traditionally associate with children's media. I'm thinking about stuff like puppets or animation or video games. And yeah, I'm sure most of y'all listening to this can think of plenty of examples of those forms of entertainment that are definitely and definitively not for kids, like the Broadway show Avenue Q, which features puppets, but that show ain't for kids.
Speaker 1: There are lots of different animated series, like anime, that are far too sophisticated and contain content that would not be appropriate for children, might disturb them and upset them, and just aren't meant for younger viewers. And of course, we also know that the idea that video games are for kids is really an outdated concept. I mean, nearly forty percent of the video-game-playing population is between the ages of eighteen and thirty-four. Another significant percentage is over the age of sixty-five. So when you look at it, kids make up a minority of the people who are playing video games, and yet we still have this association that video games are for kids. But the fact that these types of entertainment have this traditional association with children's media is an ongoing issue that I'm sure we're going to loop back to before the end of this podcast. Anyway, COPPA passes in nineteen ninety-eight, and while I've talked about who is subject to COPPA, I haven't really covered the actual rules yet, so we're going to go over those. And these rules come straight from the FTC's website on COPPA.
Speaker 1: Those entities that are subject to COPPA must, and I quote, post a clear and comprehensive online privacy policy describing their information practices for personal information collected online from children. They must provide direct notice to parents and obtain verifiable parental consent, with limited exceptions, before collecting personal information online from children. They must give parents the choice of consenting to the operator's collection and internal use of a child's information, but prohibiting the operator from disclosing that information to third parties, unless disclosure is integral to the site or service, in which case this must be made clear to parents. They must provide parents access to their child's personal information to review and/or have the information deleted. They must give parents the opportunity to prevent further use or online collection of a child's personal information. They must maintain the confidentiality, security, and integrity of information they collect from children, including by taking reasonable steps to release such information only to parties capable of maintaining its confidentiality and security.
They must retain 329 00:20:44,080 --> 00:20:47,320 Speaker 1: personal information collected online from a child for only as 330 00:20:47,359 --> 00:20:51,000 Speaker 1: long as is necessary to fulfill the purpose for which 331 00:20:51,080 --> 00:20:55,040 Speaker 1: it was collected, and delete the information using reasonable measures 332 00:20:55,040 --> 00:20:59,600 Speaker 1: to protect against its unauthorized access or use. And they 333 00:20:59,680 --> 00:21:03,800 Speaker 1: must not condition a child's participation in an online 334 00:21:03,800 --> 00:21:07,120 Speaker 1: activity on the child providing more information than is reasonably 335 00:21:07,160 --> 00:21:10,840 Speaker 1: necessary to participate in that activity. So what that last 336 00:21:10,840 --> 00:21:12,720 Speaker 1: one means, by the way, end quote, is that 337 00:21:12,720 --> 00:21:15,200 Speaker 1: you can't require kids to fill 338 00:21:15,240 --> 00:21:19,400 Speaker 1: out, like, a lengthy online form with stuff like their 339 00:21:19,440 --> 00:21:22,120 Speaker 1: full name and home address and phone number and email 340 00:21:22,560 --> 00:21:25,200 Speaker 1: just so that they can play a game, because it's 341 00:21:25,240 --> 00:21:27,560 Speaker 1: not necessary to have that information in order for the 342 00:21:27,600 --> 00:21:30,120 Speaker 1: kid to play a game. That's what that last one means.
343 00:21:31,000 --> 00:21:34,639 Speaker 1: The FTC also goes on to define what constitutes personal information, 344 00:21:35,080 --> 00:21:39,600 Speaker 1: so specifically, when the FTC says personal information, they 345 00:21:39,680 --> 00:21:42,639 Speaker 1: mean the first and last name of a person, their 346 00:21:42,680 --> 00:21:46,080 Speaker 1: home address, including street name and town or city, 347 00:21:46,560 --> 00:21:50,960 Speaker 1: their online contact information, their screen name or handle that 348 00:21:51,040 --> 00:21:55,480 Speaker 1: can serve as online contact information on certain services and platforms. 349 00:21:56,119 --> 00:22:00,000 Speaker 1: A telephone number counts as personal information, a social security number, 350 00:22:00,160 --> 00:22:04,520 Speaker 1: or a persistent identifier that indicates a specific user across 351 00:22:04,600 --> 00:22:08,240 Speaker 1: different user sessions and different sites and services. So this 352 00:22:08,320 --> 00:22:11,439 Speaker 1: is something that would be unique to each person, but 353 00:22:11,600 --> 00:22:15,480 Speaker 1: not something that they would have necessarily provided to the service. 354 00:22:15,560 --> 00:22:18,719 Speaker 1: The service provides this to the person. So let's say 355 00:22:18,880 --> 00:22:21,480 Speaker 1: I'm the third person to ever sign up for Facebook, 356 00:22:21,480 --> 00:22:26,880 Speaker 1: and Facebook assigns me user number zero zero zero, etcetera, etcetera, three. 357 00:22:27,080 --> 00:22:30,359 Speaker 1: That's my personal identifier. It's unique to me. It means 358 00:22:30,359 --> 00:22:33,879 Speaker 1: Facebook can track me and my activities. That's the 359 00:22:33,920 --> 00:22:35,840 Speaker 1: kind of thing they're talking about there.
That counts as 360 00:22:35,880 --> 00:22:39,600 Speaker 1: personal information, even though it's something that the site assigns 361 00:22:39,600 --> 00:22:42,320 Speaker 1: to the person, not something that the person gives to 362 00:22:42,400 --> 00:22:45,960 Speaker 1: the site. It also includes any file that has the 363 00:22:46,040 --> 00:22:48,720 Speaker 1: child's image or voice in it, so any sort of 364 00:22:48,800 --> 00:22:52,680 Speaker 1: video file or image file or audio file. It also 365 00:22:52,720 --> 00:22:56,320 Speaker 1: includes geolocation data that can target where the child is, 366 00:22:56,400 --> 00:23:00,560 Speaker 1: down to, like, a specific street. It includes information collected by 367 00:23:00,600 --> 00:23:04,360 Speaker 1: the service from the child or child's parents that complements 368 00:23:04,400 --> 00:23:07,439 Speaker 1: the other types of information mentioned. So that's like the 369 00:23:07,560 --> 00:23:11,359 Speaker 1: registration forms and stuff I was talking about. And some 370 00:23:11,440 --> 00:23:15,399 Speaker 1: of these types of personal identification subsets, such as that 371 00:23:15,480 --> 00:23:19,120 Speaker 1: geolocation data, were not part of COPPA originally back 372 00:23:19,160 --> 00:23:23,240 Speaker 1: in nineteen ninety eight. Geolocation wasn't much of a 373 00:23:23,320 --> 00:23:26,879 Speaker 1: thing then because we were still waiting for GPS to be 374 00:23:26,960 --> 00:23:29,440 Speaker 1: kind of opened up to the general public. It was 375 00:23:29,480 --> 00:23:33,200 Speaker 1: still very much a military centered thing, and we were 376 00:23:33,240 --> 00:23:35,359 Speaker 1: just in the very early days of having that change.
377 00:23:35,840 --> 00:23:40,840 Speaker 1: But in twenty thirteen, the FTC expanded COPPA a bit, given that, 378 00:23:41,040 --> 00:23:44,639 Speaker 1: you know, tech like smartphones had really created a new 379 00:23:44,840 --> 00:23:47,720 Speaker 1: way to collect data and new types of data that 380 00:23:47,800 --> 00:23:51,560 Speaker 1: could be useful to collect. So that's when we started 381 00:23:51,560 --> 00:23:53,199 Speaker 1: to see some of those other things added in, 382 00:23:53,280 --> 00:23:57,840 Speaker 1: like geolocation and the persistent identifier number. In twenty seventeen, 383 00:23:57,920 --> 00:24:00,879 Speaker 1: the FTC would expand COPPA again to apply it 384 00:24:01,040 --> 00:24:04,920 Speaker 1: not only to Web browsers and services on computers and smartphones, 385 00:24:05,320 --> 00:24:08,320 Speaker 1: but also to other types of Internet connected devices like 386 00:24:08,359 --> 00:24:12,520 Speaker 1: IoT type stuff. And that was because with the proliferation 387 00:24:12,560 --> 00:24:16,080 Speaker 1: of IoT sensors and toys and games, the FTC wanted 388 00:24:16,119 --> 00:24:18,720 Speaker 1: to make certain that the rules would apply to these 389 00:24:18,800 --> 00:24:22,760 Speaker 1: new technologies and protect children. I did an episode I 390 00:24:22,760 --> 00:24:26,040 Speaker 1: think it was on Forward Thinking actually the Forward Thinking 391 00:24:26,080 --> 00:24:30,199 Speaker 1: podcast from several years ago, where we talked about the 392 00:24:30,240 --> 00:24:34,640 Speaker 1: Internet connected Barbie doll and how that ended up posing 393 00:24:34,720 --> 00:24:39,000 Speaker 1: a potential security and privacy threat for children, and 394 00:24:39,080 --> 00:24:41,240 Speaker 1: that's the sort of thing that the FTC really wanted 395 00:24:41,280 --> 00:24:44,720 Speaker 1: to get, you know, a handle on.
Now, assuming that 396 00:24:44,840 --> 00:24:49,160 Speaker 1: an entity that must comply with COPPA is following the rules, 397 00:24:49,480 --> 00:24:52,400 Speaker 1: then that entity should be legally in the clear, right? 398 00:24:52,440 --> 00:24:54,360 Speaker 1: As long as they're following the rules 399 00:24:55,280 --> 00:24:58,159 Speaker 1: and they're making sure they're getting parental consent first, they 400 00:24:58,160 --> 00:25:01,480 Speaker 1: can still collect data, they can still use that data. 401 00:25:01,560 --> 00:25:03,800 Speaker 1: They have to delete it after they use it, but 402 00:25:03,920 --> 00:25:07,000 Speaker 1: they're allowed to do it. But if there's an entity 403 00:25:07,080 --> 00:25:10,639 Speaker 1: that is supposed to comply with COPPA and fails to 404 00:25:10,720 --> 00:25:14,520 Speaker 1: follow the rules with regard to personal information, what happens then? 405 00:25:15,480 --> 00:25:17,280 Speaker 1: Well, the first thing that someone has to do is 406 00:25:17,320 --> 00:25:21,480 Speaker 1: file a complaint with the FTC or a state authority 407 00:25:21,840 --> 00:25:25,160 Speaker 1: and explain the nature of this complaint with regard to 408 00:25:25,280 --> 00:25:29,720 Speaker 1: the handling of children's information. The FTC would then investigate 409 00:25:29,800 --> 00:25:33,040 Speaker 1: this claim, or a state authority would, and they would 410 00:25:33,040 --> 00:25:36,359 Speaker 1: see if the accused online site or service was in 411 00:25:36,400 --> 00:25:40,639 Speaker 1: fact in violation of COPPA. And if the authority determines 412 00:25:40,720 --> 00:25:44,000 Speaker 1: that a violation occurred, they can then 413 00:25:44,040 --> 00:25:49,240 Speaker 1: bring a civil lawsuit against that site or service.
The 414 00:25:49,320 --> 00:25:51,960 Speaker 1: rules of COPPA state that the civil case can seek 415 00:25:51,960 --> 00:25:57,400 Speaker 1: a penalty of forty three thousand two hundred eighty dollars per violation. 416 00:25:58,280 --> 00:26:02,000 Speaker 1: That's the maximum amount per violation, but the FTC might 417 00:26:02,040 --> 00:26:06,320 Speaker 1: not seek the maximum amount depending upon circumstances. So determining 418 00:26:06,320 --> 00:26:09,000 Speaker 1: factors include the number of children affected. If it was 419 00:26:09,040 --> 00:26:11,560 Speaker 1: a few children as opposed to a lot, then that 420 00:26:11,640 --> 00:26:16,399 Speaker 1: changes things, as does how egregious the violation was. If it's something 421 00:26:16,440 --> 00:26:21,160 Speaker 1: where it's found that a company was collecting user names, 422 00:26:21,200 --> 00:26:25,920 Speaker 1: for example, but no other personally identifiable information, they might 423 00:26:25,960 --> 00:26:29,919 Speaker 1: not get hit as hard. The types of personal 424 00:26:29,920 --> 00:26:33,359 Speaker 1: information involved and how the information was used will also determine it. 425 00:26:33,920 --> 00:26:36,879 Speaker 1: And like I said, COPPA allows for state agencies to 426 00:26:37,000 --> 00:26:39,959 Speaker 1: enforce compliance with COPPA with respect to entities that are 427 00:26:40,000 --> 00:26:43,320 Speaker 1: within those states' jurisdiction. So there can be state level 428 00:26:43,359 --> 00:26:47,040 Speaker 1: COPPA cases as well as the federal level ones. And 429 00:26:47,040 --> 00:26:50,640 Speaker 1: then we get to the concept of safe harbor within COPPA. 430 00:26:50,920 --> 00:26:54,119 Speaker 1: So safe harbor is a general concept in which a 431 00:26:54,200 --> 00:26:58,760 Speaker 1: person or company faces limited legal liability so long as 432 00:26:58,800 --> 00:27:02,000 Speaker 1: that person or company is following certain conditions.
And I 433 00:27:02,080 --> 00:27:04,919 Speaker 1: know that's super general, but the concept applies to a 434 00:27:05,000 --> 00:27:07,479 Speaker 1: lot of different situations, so it has to be general. 435 00:27:07,800 --> 00:27:13,520 Speaker 1: So for example, a user generated content platform typically enjoys 436 00:27:13,560 --> 00:27:17,560 Speaker 1: a certain amount of safe Harbor protection because those platforms 437 00:27:17,600 --> 00:27:21,840 Speaker 1: are not responsible for the content that's published by their users. 438 00:27:21,920 --> 00:27:25,800 Speaker 1: Right If I join a content platform as a user 439 00:27:26,320 --> 00:27:30,080 Speaker 1: and I upload stuff that is against their policies, that's 440 00:27:30,119 --> 00:27:33,199 Speaker 1: on me, not on the platform. However, as I mentioned, 441 00:27:33,200 --> 00:27:37,040 Speaker 1: safe harbor typically only applies as long as certain conditions 442 00:27:37,080 --> 00:27:41,000 Speaker 1: continue to be met. So for a user generated content platform, 443 00:27:41,200 --> 00:27:44,320 Speaker 1: one of those conditions could be that the platform has 444 00:27:44,320 --> 00:27:47,920 Speaker 1: to take down an instance of user generated content if 445 00:27:47,920 --> 00:27:52,560 Speaker 1: it's proven that that instance includes copyrighted material that doesn't 446 00:27:52,600 --> 00:27:56,439 Speaker 1: belong to the user. 
So if I start uploading, you know, 447 00:27:56,720 --> 00:28:01,640 Speaker 1: They Might Be Giants tracks, and the platform I'm uploading 448 00:28:01,680 --> 00:28:03,960 Speaker 1: them to gets a notice, Hey, this guy is doing 449 00:28:03,960 --> 00:28:08,040 Speaker 1: this without our permission, then that platform would be expected 450 00:28:08,080 --> 00:28:11,679 Speaker 1: to, you know, ban me or delete my material or whatever, 451 00:28:12,600 --> 00:28:14,840 Speaker 1: and then it would continue to enjoy the protections of 452 00:28:14,840 --> 00:28:18,320 Speaker 1: safe harbor because it actually took steps to address the issue. 453 00:28:18,600 --> 00:28:21,880 Speaker 1: If platforms don't follow whatever those rules are, and it's 454 00:28:21,880 --> 00:28:24,359 Speaker 1: a case by case kind of thing, then they no 455 00:28:24,440 --> 00:28:28,720 Speaker 1: longer enjoy the safe harbor protection. Well, with COPPA, it's 456 00:28:28,760 --> 00:28:32,480 Speaker 1: possible for industry groups to file for safe harbor status 457 00:28:32,560 --> 00:28:36,280 Speaker 1: under the FTC, and these groups have to establish a 458 00:28:36,359 --> 00:28:39,040 Speaker 1: set of rules and policies that are at least as 459 00:28:39,120 --> 00:28:43,160 Speaker 1: strict as those defined by COPPA. And if 460 00:28:43,200 --> 00:28:45,960 Speaker 1: they do that, then they can apply for safe harbor 461 00:28:46,040 --> 00:28:49,920 Speaker 1: status under COPPA. And then regulation and enforcement kind of 462 00:28:49,920 --> 00:28:53,520 Speaker 1: falls to that industry group and the organizations that belong 463 00:28:53,600 --> 00:28:56,800 Speaker 1: to the industry group.
So if a company then becomes a 464 00:28:56,800 --> 00:29:00,880 Speaker 1: member of a particular industry group, and if that industry group 465 00:29:00,960 --> 00:29:05,240 Speaker 1: has safe harbor status, the burden of responsibility really falls 466 00:29:05,280 --> 00:29:07,200 Speaker 1: to that industry group to make sure that all the 467 00:29:07,240 --> 00:29:11,160 Speaker 1: different member organizations are in compliance with that group's policies. 468 00:29:11,800 --> 00:29:13,600 Speaker 1: And I actually have a real world example I can 469 00:29:13,680 --> 00:29:16,200 Speaker 1: cite of a company that at one point belonged to 470 00:29:16,200 --> 00:29:19,040 Speaker 1: a safe harbor group and it later got in trouble 471 00:29:19,080 --> 00:29:23,840 Speaker 1: with the FTC. That company was the game developer Miniclip, 472 00:29:24,200 --> 00:29:27,880 Speaker 1: which in two thousand nine joined the Children's Advertising Review 473 00:29:28,000 --> 00:29:32,280 Speaker 1: Unit's safe harbor. So, yeah, CARU does have a 474 00:29:32,320 --> 00:29:37,200 Speaker 1: safe harbor industry group under COPPA. But in two thousand fifteen, 475 00:29:37,280 --> 00:29:40,960 Speaker 1: CARU terminated Miniclip's status as a member of that group. 476 00:29:41,680 --> 00:29:44,520 Speaker 1: I'm not actually sure why that happened.
I don't know 477 00:29:44,680 --> 00:29:49,400 Speaker 1: what led to Miniclip's status terminating under that safe 478 00:29:49,440 --> 00:29:53,880 Speaker 1: harbor group, but the FTC pursued a complaint against Miniclip 479 00:29:53,920 --> 00:29:57,720 Speaker 1: because, according to the FTC, Miniclip continued to 480 00:29:57,760 --> 00:30:00,959 Speaker 1: display a message on its websites saying that the company 481 00:30:01,120 --> 00:30:04,760 Speaker 1: was part of CARU's safe harbor group well into two 482 00:30:04,800 --> 00:30:07,680 Speaker 1: thousand nineteen, which was years after the company had its 483 00:30:07,720 --> 00:30:12,000 Speaker 1: membership terminated, and the FTC's main complaint was that Miniclip 484 00:30:12,040 --> 00:30:15,840 Speaker 1: was misrepresenting itself. It was presenting a falsehood that 485 00:30:15,920 --> 00:30:18,480 Speaker 1: it was still a member of this COPPA compliant group, 486 00:30:18,960 --> 00:30:21,320 Speaker 1: when in fact that was no longer the case. So 487 00:30:21,360 --> 00:30:24,040 Speaker 1: this showed that COPPA applies not just to the direct 488 00:30:24,040 --> 00:30:27,600 Speaker 1: activities that companies engage in that involve the collection and 489 00:30:27,760 --> 00:30:30,920 Speaker 1: use of personal data that belongs to kids, but also 490 00:30:31,000 --> 00:30:36,320 Speaker 1: how those companies represent themselves or, in this case, misrepresent themselves. However, 491 00:30:36,400 --> 00:30:39,800 Speaker 1: plenty of companies have been found guilty of violating COPPA 492 00:30:39,920 --> 00:30:43,400 Speaker 1: rules with regard to children's data.
For example, back in 493 00:30:43,480 --> 00:30:47,479 Speaker 1: two thousand eight, Sony BMG, the music label, was 494 00:30:47,640 --> 00:30:50,960 Speaker 1: sued by the FTC for collecting information on an estimated 495 00:30:51,160 --> 00:30:55,720 Speaker 1: thirty thousand users below the age of thirteen through its websites. 496 00:30:56,200 --> 00:30:59,479 Speaker 1: See, Sony BMG had sites that included some social 497 00:30:59,520 --> 00:31:03,320 Speaker 1: networking aspects to them, and that required users to include 498 00:31:03,320 --> 00:31:06,520 Speaker 1: stuff like their names and addresses and email addresses and 499 00:31:06,520 --> 00:31:08,440 Speaker 1: that kind of thing, you know, the standard stuff that 500 00:31:08,480 --> 00:31:10,760 Speaker 1: you have to fill in when you create an account 501 00:31:11,200 --> 00:31:14,560 Speaker 1: on a social networking site. Now, Sony claimed on these 502 00:31:14,560 --> 00:31:18,120 Speaker 1: sites that they were not meant for people under the 503 00:31:18,160 --> 00:31:21,480 Speaker 1: age of thirteen. It was for thirteen or older. But 504 00:31:21,680 --> 00:31:24,640 Speaker 1: there were no actual measures in place to actually prevent 505 00:31:24,880 --> 00:31:27,760 Speaker 1: kids from signing up. There was no sort of age 506 00:31:27,840 --> 00:31:31,959 Speaker 1: gate process there, and the FTC alleged that not only 507 00:31:32,200 --> 00:31:36,160 Speaker 1: was Sony not preventing it, the company was aware that 508 00:31:36,240 --> 00:31:39,360 Speaker 1: thousands of users were under the age of thirteen. In 509 00:31:39,400 --> 00:31:42,240 Speaker 1: the end, Sony BMG agreed to pay a one million 510 00:31:42,280 --> 00:31:45,880 Speaker 1: dollar settlement to the FTC, and by in the end, 511 00:31:45,920 --> 00:31:49,040 Speaker 1: I mean the day after the FTC filed the lawsuit, 512 00:31:49,440 --> 00:31:53,120 Speaker 1: Sony agreed to settle.
Sony also agreed to delete all 513 00:31:53,160 --> 00:31:56,160 Speaker 1: personal information related to users under the age of thirteen, 514 00:31:56,480 --> 00:31:59,040 Speaker 1: along with some other measures that were mandated by the 515 00:31:59,160 --> 00:32:03,400 Speaker 1: terms of the settlement. Sony's big issue was that, while 516 00:32:03,400 --> 00:32:06,960 Speaker 1: it proclaimed that the sites were intended for those thirteen 517 00:32:06,960 --> 00:32:09,640 Speaker 1: and older, it really had no measures in place to 518 00:32:09,680 --> 00:32:12,920 Speaker 1: actually enforce that, and kids under that age could register 519 00:32:13,120 --> 00:32:16,000 Speaker 1: and even include their actual age in the process, and 520 00:32:16,040 --> 00:32:20,240 Speaker 1: that meant Sony was knowingly collecting data belonging to people 521 00:32:20,320 --> 00:32:23,320 Speaker 1: under the age of thirteen. That's a big deal. Like, 522 00:32:23,600 --> 00:32:26,800 Speaker 1: if you unknowingly collect the data of people under the 523 00:32:26,800 --> 00:32:29,240 Speaker 1: age of thirteen, you actually have a bit of a 524 00:32:29,280 --> 00:32:32,960 Speaker 1: defense if you can prove that you did so unknowingly. 525 00:32:33,600 --> 00:32:36,840 Speaker 1: But when you knowingly collect it, it's a different kettle 526 00:32:36,880 --> 00:32:39,360 Speaker 1: of fish. When we come back, we'll talk about a 527 00:32:39,360 --> 00:32:41,840 Speaker 1: few other cases in which companies have tried to sidestep 528 00:32:41,840 --> 00:32:46,480 Speaker 1: the issues of COPPA entirely, and also how COPPA really 529 00:32:46,560 --> 00:32:51,920 Speaker 1: freaked out a huge population of YouTube creators back in twenty nineteen. 530 00:32:53,360 --> 00:33:03,480 Speaker 1: But first let's take another quick break.
So when it 531 00:33:03,560 --> 00:33:06,360 Speaker 1: comes to companies trying to avoid dealing with COPPA, we 532 00:33:06,440 --> 00:33:09,640 Speaker 1: got to talk about Facebook. Pretty much since the time 533 00:33:09,680 --> 00:33:13,640 Speaker 1: Facebook opened up beyond college students, it has maintained that 534 00:33:13,720 --> 00:33:17,240 Speaker 1: the service is for people thirteen or older. And to make 535 00:33:17,280 --> 00:33:19,880 Speaker 1: a Facebook profile, you have to include your name and 536 00:33:19,920 --> 00:33:22,760 Speaker 1: an email address and a birth date. Even if you 537 00:33:22,800 --> 00:33:27,440 Speaker 1: don't intend on showing anyone else your birthdate, Facebook uses 538 00:33:27,440 --> 00:33:29,760 Speaker 1: that information in part to check your age. So 539 00:33:29,800 --> 00:33:32,760 Speaker 1: if you're younger than thirteen, Facebook won't let you make 540 00:33:32,760 --> 00:33:36,959 Speaker 1: a profile. Why? Because then Facebook would be subjected to 541 00:33:37,000 --> 00:33:40,440 Speaker 1: the rules of COPPA, and as a platform that hosts 542 00:33:40,560 --> 00:33:44,120 Speaker 1: user generated content, that would be really hard for Facebook 543 00:33:44,160 --> 00:33:46,640 Speaker 1: to comply with. And moreover, it would mean that Facebook 544 00:33:46,680 --> 00:33:50,440 Speaker 1: would have to implement some serious restrictions on how it 545 00:33:50,520 --> 00:33:54,280 Speaker 1: gathers information. As I've pointed out many times, the 546 00:33:54,400 --> 00:33:57,920 Speaker 1: vast majority of Facebook's massive revenue comes from how the 547 00:33:57,920 --> 00:34:03,080 Speaker 1: platform harvests and uses our personal information. So Facebook has 548 00:34:03,120 --> 00:34:06,000 Speaker 1: the age gate approach. If a user fakes their birthday 549 00:34:06,040 --> 00:34:09,000 Speaker 1: to get in, well, that's not really Facebook's fault, is it?
550 00:34:09,440 --> 00:34:13,360 Speaker 1: That's just someone being dishonest, and Facebook isn't knowingly collecting 551 00:34:13,400 --> 00:34:16,160 Speaker 1: the data of a child. The company would still be 552 00:34:16,200 --> 00:34:19,040 Speaker 1: collecting that data, mind you, but as long as there 553 00:34:19,080 --> 00:34:22,120 Speaker 1: was no indication that the profile actually belonged to a child, 554 00:34:22,560 --> 00:34:24,880 Speaker 1: the company would have the defense that, according to the 555 00:34:24,920 --> 00:34:28,960 Speaker 1: information submitted to Facebook, that user was over the age 556 00:34:29,000 --> 00:34:32,280 Speaker 1: of thirteen. And so by positioning itself as a company 557 00:34:32,320 --> 00:34:36,120 Speaker 1: that has a social networking site intended for thirteen and older, 558 00:34:36,560 --> 00:34:41,640 Speaker 1: Facebook isn't subject to COPPA. The system isn't perfect by 559 00:34:41,680 --> 00:34:44,560 Speaker 1: any stretch of the imagination, but it's hard to argue 560 00:34:44,680 --> 00:34:47,360 Speaker 1: that Facebook is just using smoke and mirrors to seem 561 00:34:47,480 --> 00:34:50,680 Speaker 1: as if it's complying. It might be easy to circumvent 562 00:34:50,719 --> 00:34:55,160 Speaker 1: the rules, but there are rules. At various points since 563 00:34:55,239 --> 00:34:59,120 Speaker 1: COPPA became law, lawmakers have considered expanding the rules and 564 00:34:59,239 --> 00:35:03,480 Speaker 1: upping the age to eighteen years old.
Some lawmakers have 565 00:35:03,520 --> 00:35:07,120 Speaker 1: asked why thirteen has been this arbitrary age, which kind 566 00:35:07,160 --> 00:35:10,000 Speaker 1: of dates back to the old CARU guidelines, and so 567 00:35:10,040 --> 00:35:13,560 Speaker 1: far the government has not increased the age limit on COPPA, 568 00:35:13,680 --> 00:35:16,440 Speaker 1: which is a good thing for platforms like Facebook, because 569 00:35:16,440 --> 00:35:19,200 Speaker 1: a change like that would have a massive effect on 570 00:35:19,239 --> 00:35:22,680 Speaker 1: that social network, as well as on countless other sites. 571 00:35:23,400 --> 00:35:28,080 Speaker 1: And then we come to YouTube. Yikes. Alright, so YouTube 572 00:35:28,120 --> 00:35:31,680 Speaker 1: has had more than a rough recent history when it 573 00:35:31,760 --> 00:35:35,719 Speaker 1: comes to content and kids. First of all, YouTube has 574 00:35:35,719 --> 00:35:38,520 Speaker 1: a policy that's pretty much the same as Facebook's. You 575 00:35:38,600 --> 00:35:41,879 Speaker 1: are not supposed to create a YouTube profile unless you're 576 00:35:41,960 --> 00:35:46,080 Speaker 1: at least thirteen years old, and like Facebook, Google age 577 00:35:46,120 --> 00:35:49,799 Speaker 1: gates this. Of course, someone might lie about when their 578 00:35:49,800 --> 00:35:52,400 Speaker 1: birthday was, or a parent might set up an account 579 00:35:52,440 --> 00:35:54,600 Speaker 1: and fudge a birthday so that their kid can watch 580 00:35:54,600 --> 00:35:58,279 Speaker 1: stuff on YouTube. It's also entirely possible to just go 581 00:35:58,480 --> 00:36:01,839 Speaker 1: to YouTube without being on a profile at all and 582 00:36:01,880 --> 00:36:06,000 Speaker 1: just watch content on YouTube as an anonymous user.
So 583 00:36:06,120 --> 00:36:10,400 Speaker 1: while YouTube age gates profiles, the platform doesn't actually age 584 00:36:10,440 --> 00:36:15,120 Speaker 1: gate the general content on YouTube itself, at least not 585 00:36:15,400 --> 00:36:19,399 Speaker 1: to that extent. And what's more, YouTube slash Google. When 586 00:36:19,440 --> 00:36:22,240 Speaker 1: I use YouTube as a company name, you can probably 587 00:36:22,280 --> 00:36:25,640 Speaker 1: substitute Google or even Alphabet in there, because it's all 588 00:36:26,360 --> 00:36:31,359 Speaker 1: one big, dysfunctional family. Anyway, YouTube knows all about this, 589 00:36:31,800 --> 00:36:34,080 Speaker 1: which, I mean, no surprise, that company is in the 590 00:36:34,080 --> 00:36:37,239 Speaker 1: business of collecting and understanding our data better than we do. 591 00:36:37,800 --> 00:36:42,280 Speaker 1: So in meetings with big companies like Hasbro and Mattel, 592 00:36:42,440 --> 00:36:45,600 Speaker 1: you know, big toy companies, YouTube has bragged about how 593 00:36:45,680 --> 00:36:48,960 Speaker 1: it is a platform that is incredibly popular with children, 594 00:36:49,320 --> 00:36:51,520 Speaker 1: like kids between the ages of two and twelve and 595 00:36:51,560 --> 00:36:55,400 Speaker 1: stuff like that, so they're well aware that young kids 596 00:36:55,400 --> 00:36:58,520 Speaker 1: are watching YouTube. And then there's YouTube Kids, which is 597 00:36:58,560 --> 00:37:01,720 Speaker 1: the actual app that's supposed to filter content by age 598 00:37:01,719 --> 00:37:05,560 Speaker 1: group so that you can select, using the app, which 599 00:37:06,120 --> 00:37:09,759 Speaker 1: videos should be shown to your kid, as in, like, 600 00:37:10,200 --> 00:37:13,640 Speaker 1: which videos appropriate to certain ages are allowed 601 00:37:13,800 --> 00:37:15,920 Speaker 1: to be shown to your kid.
The ages I 602 00:37:15,920 --> 00:37:19,640 Speaker 1: think are five, eight, and thirteen, and it just depends 603 00:37:19,640 --> 00:37:22,480 Speaker 1: on which one you set when you set up the account. 604 00:37:23,400 --> 00:37:27,120 Speaker 1: Like the standard version of YouTube, YouTube Kids generates revenue 605 00:37:27,120 --> 00:37:30,319 Speaker 1: through ads, and Google got into trouble with this with 606 00:37:30,480 --> 00:37:34,720 Speaker 1: the Campaign for a Commercial Free Childhood and the Center 607 00:37:34,800 --> 00:37:38,360 Speaker 1: for Digital Democracy. They both argued that the way that 608 00:37:38,480 --> 00:37:42,239 Speaker 1: ads were being presented to children on YouTube Kids made 609 00:37:42,239 --> 00:37:45,000 Speaker 1: it seem like the ads were part of the content itself, 610 00:37:45,080 --> 00:37:48,720 Speaker 1: which was considered to be misleading and in violation of COPPA. 611 00:37:48,800 --> 00:37:52,800 Speaker 1: So Google subsequently made those ads stand out a little 612 00:37:52,800 --> 00:37:55,160 Speaker 1: bit more from the content itself in order to make 613 00:37:55,160 --> 00:37:58,680 Speaker 1: a clearer divider between what was an ad and 614 00:37:58,800 --> 00:38:02,040 Speaker 1: what was content. And my guess is you've heard the 615 00:38:02,040 --> 00:38:06,040 Speaker 1: stories about weird and disturbing videos popping up both on 616 00:38:06,080 --> 00:38:12,560 Speaker 1: YouTube and ultimately on YouTube Kids that were specifically targeting children. 617 00:38:13,040 --> 00:38:17,560 Speaker 1: These videos usually involve recognizable characters, ones that clearly have 618 00:38:17,719 --> 00:38:22,080 Speaker 1: not been licensed, but characters like Elsa from Frozen or 619 00:38:22,239 --> 00:38:25,720 Speaker 1: Spider Man from Marvel, and they're engaging in all sorts 620 00:38:25,719 --> 00:38:30,120 Speaker 1: of weird and sometimes upsetting activities.
Sometimes it's live action 621 00:38:30,360 --> 00:38:34,279 Speaker 1: people dressed in costumes. Sometimes it's crude animation. Often the 622 00:38:34,400 --> 00:38:37,359 Speaker 1: videos are completely wordless and just set to music, which 623 00:38:37,360 --> 00:38:40,239 Speaker 1: means there's no language barrier there, so they can have 624 00:38:40,280 --> 00:38:44,520 Speaker 1: a pretty wide appeal globally, and they frequently involve activities 625 00:38:44,560 --> 00:38:48,560 Speaker 1: that kids find fascinating. You know, stuff that happens in 626 00:38:48,600 --> 00:38:51,880 Speaker 1: the world that kids think is really unusual or strange, 627 00:38:51,960 --> 00:38:57,160 Speaker 1: like pregnancy or toilets or, you know, all sorts of 628 00:38:57,200 --> 00:39:00,160 Speaker 1: things of that nature. Getting a shot from the doctor. 629 00:39:00,280 --> 00:39:03,480 Speaker 1: That's a big one too. It's definitely not high cinema, 630 00:39:03,520 --> 00:39:06,440 Speaker 1: but it's the kind of stuff that kids really fixate 631 00:39:06,520 --> 00:39:09,680 Speaker 1: on to different degrees and for different reasons. So through 632 00:39:09,719 --> 00:39:14,359 Speaker 1: a combination of using meta keywords and other means, these 633 00:39:14,440 --> 00:39:17,920 Speaker 1: videos would perform really well in YouTube's algorithms and 634 00:39:18,000 --> 00:39:21,799 Speaker 1: often make the transition to YouTube Kids, even if arguably 635 00:39:21,880 --> 00:39:25,400 Speaker 1: the content was not appropriate, and then kids would stumble 636 00:39:25,400 --> 00:39:28,319 Speaker 1: across them, and that in turn would upset parents who 637 00:39:28,400 --> 00:39:30,920 Speaker 1: found out that their kids were watching these things, and 638 00:39:31,000 --> 00:39:34,160 Speaker 1: that ultimately made the news.
Now, this issue had been 639 00:39:34,160 --> 00:39:36,840 Speaker 1: going on for years, really, but it really came to 640 00:39:36,920 --> 00:39:39,759 Speaker 1: light in twenty seventeen. That's when it made headlines 641 00:39:39,880 --> 00:39:42,280 Speaker 1: in the United States and it brought a pretty harsh 642 00:39:42,360 --> 00:39:46,080 Speaker 1: spotlight on YouTube as a result. So the company began 643 00:39:46,160 --> 00:39:50,120 Speaker 1: banning thousands of these garbage content channels, but now found 644 00:39:50,160 --> 00:39:54,080 Speaker 1: itself under some serious scrutiny. And that scrutiny included people 645 00:39:54,080 --> 00:39:57,239 Speaker 1: who were concerned that videos that seemed like they might 646 00:39:57,320 --> 00:40:01,280 Speaker 1: be aimed at kids were in fact not kid friendly, 647 00:40:01,600 --> 00:40:05,160 Speaker 1: and that because YouTube's revenue model depends heavily on advertising, 648 00:40:05,400 --> 00:40:07,320 Speaker 1: it also meant that kids might be seeing ads that 649 00:40:07,360 --> 00:40:11,640 Speaker 1: weren't really appropriate or didn't comply with CARU guidelines, and 650 00:40:11,680 --> 00:40:16,560 Speaker 1: also that YouTube might be collecting information about kids without 651 00:40:16,600 --> 00:40:21,600 Speaker 1: parental consent.
In September twenty nineteen, the FTC and the 652 00:40:21,680 --> 00:40:25,440 Speaker 1: New York Attorney General reached a settlement with YouTube regarding 653 00:40:25,440 --> 00:40:28,680 Speaker 1: an allegation that the company had been collecting personal information 654 00:40:28,680 --> 00:40:32,720 Speaker 1: in the form of persistent identifiers used to track specific 655 00:40:32,800 --> 00:40:35,680 Speaker 1: users as they navigate through the site and beyond, and 656 00:40:35,719 --> 00:40:39,440 Speaker 1: that that included children under the age of thirteen, and moreover, 657 00:40:39,840 --> 00:40:43,560 Speaker 1: that YouTube did not first notify parents about this and 658 00:40:43,600 --> 00:40:47,800 Speaker 1: get the parents' consent, which was thus a violation of COPPA. 659 00:40:48,120 --> 00:40:51,080 Speaker 1: That settlement meant that YouTube would have to pay one 660 00:40:51,239 --> 00:40:55,160 Speaker 1: hundred seventy million dollars in fines, which was the largest 661 00:40:55,160 --> 00:40:58,880 Speaker 1: amount for a COPPA case at that point. The FTC 662 00:40:59,040 --> 00:41:03,360 Speaker 1: chairman said, quote, YouTube touted its popularity with children to 663 00:41:03,400 --> 00:41:07,080 Speaker 1: prospective corporate clients, yet when it came to complying with COPPA, 664 00:41:07,239 --> 00:41:10,400 Speaker 1: the company refused to acknowledge that portions of its platform 665 00:41:10,440 --> 00:41:14,200 Speaker 1: were clearly directed to kids. There's no excuse for YouTube's 666 00:41:14,280 --> 00:41:18,560 Speaker 1: violations of the law. End quote. The complaint stated that 667 00:41:18,640 --> 00:41:22,279 Speaker 1: YouTube positions itself as a general audience site, but in 668 00:41:22,320 --> 00:41:27,400 Speaker 1: fact has many channels clearly geared toward younger audiences.
For example, 669 00:41:27,840 --> 00:41:30,960 Speaker 1: videos of people unboxing toys that are meant for little 670 00:41:31,040 --> 00:41:34,600 Speaker 1: kids seem to fall pretty neatly into that category. And 671 00:41:34,640 --> 00:41:38,200 Speaker 1: so YouTube, as a platform that knowingly played host to 672 00:41:38,360 --> 00:41:42,120 Speaker 1: child directed channels, had a responsibility to make certain those 673 00:41:42,200 --> 00:41:46,360 Speaker 1: channels and, more importantly, the advertising on those channels 674 00:41:46,400 --> 00:41:50,040 Speaker 1: complied with COPPA rules. Well, there's nothing like a big 675 00:41:50,080 --> 00:41:53,640 Speaker 1: old fine to incentivize a company to change its policies, 676 00:41:53,920 --> 00:41:57,480 Speaker 1: and that's what YouTube did. Creators now have to indicate 677 00:41:57,480 --> 00:42:01,160 Speaker 1: whether their channels are child directed or not. They can 678 00:42:01,239 --> 00:42:04,880 Speaker 1: also label specific videos on a case by case basis 679 00:42:05,120 --> 00:42:08,280 Speaker 1: as to whether or not that video is directed toward children. 680 00:42:08,760 --> 00:42:12,840 Speaker 1: Any video determined to be child directed, either because the 681 00:42:12,960 --> 00:42:17,160 Speaker 1: creator made that indication or because YouTube 682 00:42:17,239 --> 00:42:20,799 Speaker 1: decided that was the case, would 683 00:42:20,840 --> 00:42:23,520 Speaker 1: not be allowed to include certain ways of collecting personal 684 00:42:23,560 --> 00:42:29,600 Speaker 1: information from viewers. Now, most creators don't actually collect information 685 00:42:29,640 --> 00:42:32,200 Speaker 1: from their viewers at all, or at least not at 686 00:42:32,320 --> 00:42:35,200 Speaker 1: this level.
Most of the creators who are YouTube partners 687 00:42:35,239 --> 00:42:38,000 Speaker 1: and who are running ads on their videos are relying 688 00:42:38,040 --> 00:42:41,480 Speaker 1: on YouTube's algorithms to serve up advertising. They don't usually 689 00:42:41,520 --> 00:42:44,080 Speaker 1: actually have a say in what kind of ads are 690 00:42:44,080 --> 00:42:47,080 Speaker 1: going to play against their videos. And because of YouTube's 691 00:42:47,160 --> 00:42:50,600 Speaker 1: dynamic ad program, two different people in two different parts 692 00:42:50,640 --> 00:42:53,319 Speaker 1: of the world who watched the exact same video with 693 00:42:53,400 --> 00:42:57,040 Speaker 1: the same number of ad breaks could see very different ads. 694 00:42:57,040 --> 00:43:00,719 Speaker 1: But for videos that can't collect personal information, that 695 00:43:00,800 --> 00:43:04,239 Speaker 1: selection of ads gets whittled down quite a lot. So 696 00:43:04,440 --> 00:43:07,919 Speaker 1: that meant that creators who had child directed channels would 697 00:43:07,960 --> 00:43:10,920 Speaker 1: find it more difficult to monetize their work and they 698 00:43:10,920 --> 00:43:13,279 Speaker 1: would see a lower return on investment, and for some 699 00:43:13,360 --> 00:43:16,560 Speaker 1: creators that could be severe enough to make it unprofitable 700 00:43:16,600 --> 00:43:21,000 Speaker 1: and thus unsupportable to continue making videos on YouTube. Creators 701 00:43:21,000 --> 00:43:23,800 Speaker 1: complained to YouTube, saying that it was difficult to determine 702 00:43:23,840 --> 00:43:27,440 Speaker 1: whether content was really child directed or not. 703 00:43:27,800 --> 00:43:32,799 Speaker 1: Maybe it was just family friendly but not specifically child directed.
So, 704 00:43:32,920 --> 00:43:36,680 Speaker 1: for example, videos of Let's Plays, in which content creators 705 00:43:36,719 --> 00:43:40,399 Speaker 1: play video games and often provide commentary and so on, 706 00:43:40,920 --> 00:43:43,680 Speaker 1: they kind of fall into that gray area because there 707 00:43:43,800 --> 00:43:46,400 Speaker 1: is still that perception that video games are for kids, 708 00:43:46,760 --> 00:43:50,600 Speaker 1: even though most of the population playing games these days 709 00:43:51,480 --> 00:43:56,120 Speaker 1: is older than thirteen, and many games contain content that's 710 00:43:56,160 --> 00:43:59,080 Speaker 1: not really kid friendly. I'm pretty sure no one would 711 00:43:59,120 --> 00:44:02,520 Speaker 1: claim that Grand Theft Auto is appropriate content for small kids, 712 00:44:02,840 --> 00:44:05,360 Speaker 1: or that the Resident Evil franchise is great for a 713 00:44:05,400 --> 00:44:09,040 Speaker 1: six year old. But because there's this social association of 714 00:44:09,160 --> 00:44:11,960 Speaker 1: video games with children, it could be difficult to argue 715 00:44:12,000 --> 00:44:16,920 Speaker 1: that videos featuring gameplay are not actually child directed. And 716 00:44:17,000 --> 00:44:20,359 Speaker 1: since the FTC's decision meant that creators could be held 717 00:44:20,440 --> 00:44:24,320 Speaker 1: liable for future violations of COPPA, and because these creators 718 00:44:24,320 --> 00:44:27,080 Speaker 1: are dependent upon ad revenue as part of their income, 719 00:44:27,520 --> 00:44:30,319 Speaker 1: that gets to be a big problem.
If kids are 720 00:44:30,320 --> 00:44:33,360 Speaker 1: watching the videos, then that means the kids are also 721 00:44:33,400 --> 00:44:36,920 Speaker 1: being served advertisements, and if that's happening, it means that 722 00:44:36,960 --> 00:44:39,319 Speaker 1: there's a chance those kids are having some form of 723 00:44:39,320 --> 00:44:44,480 Speaker 1: personal identification shared with those advertising parties without parental consent, 724 00:44:44,880 --> 00:44:47,319 Speaker 1: and thus we get back to the violation of COPPA. 725 00:44:47,640 --> 00:44:51,280 Speaker 1: It doesn't matter that the creators themselves wouldn't have access 726 00:44:51,320 --> 00:44:55,120 Speaker 1: to that personal information. Just by serving as the conduit 727 00:44:55,360 --> 00:44:58,520 Speaker 1: through which kids' data could be hoovered up by an advertiser, 728 00:44:58,920 --> 00:45:02,200 Speaker 1: the creator would be on the hook. That means that 729 00:45:02,200 --> 00:45:05,719 Speaker 1: a content creator on YouTube could potentially have to pay 730 00:45:05,800 --> 00:45:09,960 Speaker 1: up to forty two thousand dollars per video that violates 731 00:45:10,000 --> 00:45:13,760 Speaker 1: the rules, which is a big ouch. In January twenty 732 00:45:14,320 --> 00:45:17,279 Speaker 1: twenty, YouTube put the new rules in place, and videos that 733 00:45:17,280 --> 00:45:19,719 Speaker 1: were either selected as being child directed or that were 734 00:45:19,760 --> 00:45:24,160 Speaker 1: subsequently determined to be child directed would have various 735 00:45:24,200 --> 00:45:28,600 Speaker 1: features turned off. For example, comments would get turned off 736 00:45:28,640 --> 00:45:31,480 Speaker 1: for those videos because the comment section could serve as 737 00:45:31,480 --> 00:45:33,920 Speaker 1: a means for someone to collect data about members of 738 00:45:33,920 --> 00:45:37,359 Speaker 1: the audience.
These videos would also no longer have personalized 739 00:45:37,360 --> 00:45:40,880 Speaker 1: ads, because those ads depend upon tracking personal data, 740 00:45:40,960 --> 00:45:44,840 Speaker 1: which child directed videos aren't allowed to do. 741 00:45:44,880 --> 00:45:48,920 Speaker 1: So instead the 742 00:45:48,920 --> 00:45:52,000 Speaker 1: ads would be contextual, based upon the content of 743 00:45:52,040 --> 00:45:55,759 Speaker 1: the video. Other monetization features, which include stuff like a 744 00:45:55,800 --> 00:45:59,000 Speaker 1: merchandise option, would also be turned off, because those 745 00:45:59,040 --> 00:46:02,880 Speaker 1: require users to hand over information in order to interact with 746 00:46:02,920 --> 00:46:07,040 Speaker 1: them, so those were not allowed. Playlists, the mini 747 00:46:07,120 --> 00:46:10,880 Speaker 1: player, notifications, all of these features would be turned off 748 00:46:10,920 --> 00:46:13,719 Speaker 1: as well. That also means that these videos would be 749 00:46:13,760 --> 00:46:18,320 Speaker 1: adversely affected when it comes to YouTube's recommendation engine. Videos 750 00:46:18,320 --> 00:46:20,839 Speaker 1: that get a lot of engagement tend to go up 751 00:46:20,880 --> 00:46:24,720 Speaker 1: on those lists, but the child directed stuff, by default, 752 00:46:24,840 --> 00:46:28,160 Speaker 1: has a lot of those features that affect metrics 753 00:46:28,520 --> 00:46:32,320 Speaker 1: turned off, so that leaves the creators at a disadvantage. 754 00:46:32,480 --> 00:46:35,719 Speaker 1: Their content is less likely to be seen and discovered, 755 00:46:35,920 --> 00:46:39,520 Speaker 1: which in turn means lower revenues for those content creators. 756 00:46:40,239 --> 00:46:42,839 Speaker 1: And that's kind of where we are now.
It's 757 00:46:42,840 --> 00:46:45,920 Speaker 1: a rough place, because on the one hand, you certainly 758 00:46:45,960 --> 00:46:49,720 Speaker 1: see the FTC's point in that you don't want children 759 00:46:49,840 --> 00:46:53,400 Speaker 1: to be exploited. You don't want companies to be collecting 760 00:46:53,520 --> 00:46:59,120 Speaker 1: data about kids when there's no real accountability there. You 761 00:46:59,200 --> 00:47:01,799 Speaker 1: definitely want the parents to be involved in all of this, 762 00:47:01,920 --> 00:47:04,600 Speaker 1: and it's very easy for parents to be left out 763 00:47:04,600 --> 00:47:08,719 Speaker 1: of the loop; even well intentioned and, you know, attentive 764 00:47:08,800 --> 00:47:12,040 Speaker 1: parents can be left out of the loop. So there 765 00:47:12,120 --> 00:47:14,560 Speaker 1: do need to be some sort of measures in place, 766 00:47:15,200 --> 00:47:18,480 Speaker 1: and honestly, I can't really blame YouTube for 767 00:47:18,520 --> 00:47:20,880 Speaker 1: passing these rules. I know a lot of YouTube creators 768 00:47:20,880 --> 00:47:24,319 Speaker 1: were really upset when YouTube made these announcements, but the 769 00:47:24,400 --> 00:47:28,520 Speaker 1: company has an obligation as well, and ultimately it is trying 770 00:47:28,560 --> 00:47:31,600 Speaker 1: to protect content creators, because those are the people who 771 00:47:31,600 --> 00:47:33,799 Speaker 1: are going to be held responsible if they're found in 772 00:47:33,920 --> 00:47:38,080 Speaker 1: violation of COPPA. I also feel for the content creators, though, 773 00:47:38,160 --> 00:47:40,480 Speaker 1: because a lot of them are creating content that they 774 00:47:40,560 --> 00:47:44,680 Speaker 1: never intended to be directed toward children, and yet sometimes 775 00:47:44,840 --> 00:47:48,279 Speaker 1: children end up kind of, you know, latching on to 776 00:47:48,440 --> 00:47:52,480 Speaker 1: that content.
Well, what is a creator supposed to do 777 00:47:52,520 --> 00:47:56,320 Speaker 1: about that? They're in the business of making stuff 778 00:47:56,840 --> 00:48:00,000 Speaker 1: that they're trying to entertain people with. They're not necessarily 779 00:48:00,000 --> 00:48:02,279 Speaker 1: intending it for kids, but kids are coming to 780 00:48:02,320 --> 00:48:05,279 Speaker 1: watch it. That really puts them in a 781 00:48:05,360 --> 00:48:10,520 Speaker 1: tight spot too. So it's tough. There's no easy 782 00:48:10,560 --> 00:48:14,160 Speaker 1: answer to this. There are a lot of competing 783 00:48:15,480 --> 00:48:19,279 Speaker 1: motivations and obligations going on here, and there's not 784 00:48:19,320 --> 00:48:22,640 Speaker 1: really a simple way forward. So while we are in 785 00:48:22,719 --> 00:48:25,120 Speaker 1: a bit of a mess, I'm not certain 786 00:48:25,160 --> 00:48:28,759 Speaker 1: that there's really a neat way out of it. If 787 00:48:28,800 --> 00:48:31,400 Speaker 1: you think otherwise, I'm curious to hear your thoughts. You 788 00:48:31,400 --> 00:48:33,840 Speaker 1: can reach out to me on Twitter. The handle for 789 00:48:33,920 --> 00:48:37,839 Speaker 1: the show is TechStuff HSW. I'm sure I'll 790 00:48:37,880 --> 00:48:43,040 Speaker 1: do more episodes that relate to privacy and technology and 791 00:48:43,719 --> 00:48:46,680 Speaker 1: the ways that those are in conflict with one another, 792 00:48:46,880 --> 00:48:50,400 Speaker 1: and what we can expect from that and what maybe 793 00:48:50,440 --> 00:48:52,840 Speaker 1: we should do about it.
I'm sure I'll do a 794 00:48:52,840 --> 00:48:54,960 Speaker 1: lot more of those in the future, but I suspect 795 00:48:54,960 --> 00:48:56,960 Speaker 1: that next week we're going to be covering some totally 796 00:48:57,000 --> 00:48:59,960 Speaker 1: different topics, so stay tuned for that and I'll talk 797 00:49:00,080 --> 00:49:08,920 Speaker 1: to you again really soon. TechStuff is an I 798 00:49:09,040 --> 00:49:12,520 Speaker 1: Heart Radio production. For more podcasts from I Heart Radio, 799 00:49:12,880 --> 00:49:16,040 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 800 00:49:16,120 --> 00:49:17,640 Speaker 1: you listen to your favorite shows.